Lead Data Engineer

Published 2 months ago
United States
$90 - $125 per hour

The Role:

We are looking for a Lead Data Engineer to spearhead the design, development, and optimization of scalable data pipelines and enterprise data solutions. This hands-on leadership role will focus on building modern cloud-based data infrastructure and driving best practices in data integration, transformation, and performance across the organization.

Key Responsibilities:

  • Design, build, and optimize scalable ETL/ELT pipelines using modern data engineering tools and frameworks, covering full loads, incremental loads, and change data capture (CDC).
  • Develop and maintain robust enterprise data models by integrating data from internal and third-party sources.
  • Collaborate with architects, analysts, and business stakeholders to translate requirements into technical solutions.
  • Ensure the performance, scalability, and reliability of data platforms and systems.
  • Implement and enforce data governance, quality, and security standards.
  • Drive automation and operational efficiency across data workflows and platforms.
  • Participate in code reviews, document solutions, and contribute to continuous integration and deployment (CI/CD) practices.
  • Mentor junior engineers and contribute to a high-performing data engineering culture.

Essential Knowledge, Skills, and Abilities:

  • 10+ years of experience in data engineering and integration.
  • 5+ years in data warehousing (Kimball methodology preferred).
  • Expertise in Snowflake, SQL, and Informatica (required).
  • Strong programming skills in Python and familiarity with REST/SOAP APIs.
  • Solid understanding of ETL/ELT patterns and data pipeline orchestration.
  • Experience working with enterprise platforms such as Salesforce, SAP Commerce Cloud, or NetSuite.
  • Hands-on experience with GitHub, JIRA, and Agile methodologies.
  • Cloud experience with AWS and/or Azure ecosystems.
  • Exposure to Master Data Management (MDM) implementations.
  • Strong problem-solving skills and attention to detail.

Nice-to-Have Skills:

  • Experience with infrastructure as code (Terraform), containerization, or DevOps practices.
  • Snowflake SnowPro Core and/or SnowPro Data Engineer certifications.
  • Experience with data cataloging, data lineage, and metadata management tools.

What We Offer:

  • Remote Work Opportunities
  • Flexible Work Hours
  • Professional Development Opportunities

Expected Compensation:

  • $90 - $125 per hour

 

*Applicants must be authorized to work for ANY employer in the U.S. We are unable to sponsor or take over sponsorship of an employment visa at this time.
