Data Engineer (Remote)


Job Description

A financial services firm in North Carolina is currently seeking an experienced Data Engineer for a great opportunity with their team in Charlotte. In this role, the Data Engineer will be responsible for developing a framework for scalable data infrastructure solutions that integrate with heterogeneous data sources.

This is a hybrid role requiring the qualified professional to work onsite two days per week (Tuesdays and Thursdays).



Responsibilities:


The Data Engineer will:


  • Architect, build, and enhance an ever-expanding data platform supporting business process needs for internal and external integration via APIs, data models, self-serve reporting solutions, and interactive querying

  • Define, build, test, document and audit data platform artifacts including data models, data flow processes, integrations, etc.

  • Develop standard methodologies and frameworks for unit, functional and integration tests around data pipelines, and drive the team towards increased overall test coverage

  • Design continuous integration and deployment (CI/CD) processes and best practices for the production data pipelines

  • Work with the latest and greatest technologies in the Microsoft cloud stack, including Azure SQL, Synapse, Data Lake, Data Factory, Databricks, Azure Functions, Service Bus, etc.

  • Collaborate with and influence users, engineers, and product partners to ensure the company's data infrastructure meets constantly evolving requirements

  • Perform other duties, as needed



Qualifications:




  • 2+ years of related work experience

  • Bachelor's Degree

  • Experience working with cloud platforms such as Azure or AWS

  • Solid understanding of real-time data processing, data pipelines, transformation, and modeling using traditional and distributed systems

  • Experience with development of Test Automation solutions

  • Detailed knowledge of relational and multi-dimensional databases and NoSQL solutions

  • Experience with an object-oriented programming language such as C#, Java, or Python

  • Experience developing and architecting data ingestion models, ETL jobs, and alerting to maintain high availability and data integrity

  • Strong programming skills, especially in SQL, data modeling, and related data processing concepts

  • Great interpersonal skills

  • Excellent communication skills (written and verbal)

  • Strong attention to detail

  • Highly organized