Data Engineer - Azure Databricks/Remote 12 mth+ contract

Posted 3 days ago | United States | Salary undisclosed

Job Description

Need a SQL whiz with Databricks and coding experience (Python, .NET, etc.). Snowflake is NOT required. 12+ month remote contract.

The Data Engineer reports to the BI/DW Supervisor and partners with BI and Software Engineers, Analysts, business stakeholders, and Enterprise Analytics leadership. This individual will build the data pipelines and data structures that create and support our cloud data warehouse. These datasets are heavily used by our business analysts, managers, and data scientists, and serve as the foundation for self-service BI and Advanced Analytics.

Responsibilities:

- Work closely with business and technical teams to deliver enterprise-grade datasets that are reliable, flexible, scalable, and provide a low cost of ownership.
- Understand common analytical data models such as Kimball; ensure physical data models align with best practices and requirements.
- Build and maintain raw data pipelines from varied sources.
- Build and maintain the data warehouse pipelines.
- Update and create Azure pipelines to support our continuous deployment model.
- Recommend ways to improve data reliability, efficiency, and quality.
- Analyze and estimate feasibility, costs, time, and resources needed to develop and implement enterprise datasets as needed.
- Research opportunities for data acquisition and new uses for existing data.
- Collaborate with Enterprise Architecture to publish and contribute to architecture standards and roadmaps.
- Achieve and maintain relevant technical competencies and help foster an environment of continued growth and learning among colleagues on existing and emerging technologies.

Qualifications:

- A Bachelor's Degree in Computer Science or a related field is required; a high school diploma and/or an equivalent combination of education and work experience may be substituted.
- A minimum of 5 years of relevant development experience using integration platforms.
- Recent experience with cloud data engineering toolsets preferred; recent experience in Azure using Azure Data Factory, Azure Databricks, and Snowflake highly preferred.
- A minimum of 2 years of experience building database tables and models.
- Must be able to write T-SQL fluently for DDL and DML operations.
- Strong understanding of enterprise integration patterns (EIP) and data warehouse modeling.
- Experience with development and data warehouse requirements gathering, analysis, and design.
- Strong business acumen with consistently demonstrated forward thinking.