Sr. Data Engineer/100% Remote/$150-170K

Posted: 7 days ago | Location: United States | Salary: $150k - 170k (US Dollars)

Job Description


Position: Sr. Data Engineer

Location: 100% Remote

Salary: $150-170K (Negotiable)

Product: Google Cloud Platform


The client is a Google Cloud Premier Partner and a privately held global leader in business and technology consulting services that transform organizations through cloud-based solutions. Their expert services range from enterprise consulting and cloud platform migration to custom application development, managed services, user adoption, and change management.


As a Sr. Data Engineer, you will work collaboratively with architects and other engineers to recommend, prototype, build and debug data infrastructures on Google Cloud Platform (GCP). Engagements vary from being purely consultative to requiring heavy hands-on work and cover a diverse array of domain areas, such as data migrations, data archival and disaster recovery, and big data analytics solutions requiring batch or streaming data pipelines, data lakes and data warehouses. You will be expected to run point on whole projects, end-to-end, and to mentor junior Data Engineers. You will be recognized as an expert within the team and will build a reputation with Google and our customers. You will also participate in early-stage opportunity qualification calls, as well as lead client-facing technical discussions for established projects.


  • Google Professional Data Engineer certification, or the ability to obtain it within the first 45 days of employment


Mastery in at least one of the following domain areas:

  • Data warehouse modernization: building complete data warehouse solutions, including technical architectures, star/snowflake schema designs, infrastructure components, ETL/ELT pipelines, and reporting/analytic tools. Must have hands-on experience working with batch or streaming data processing software (such as Beam, Airflow, Hadoop, Spark, Hive).
  • Data migration: migrating data stores to reliable and scalable cloud-based stores, including strategies for near zero-downtime.
  • Backup, restore & disaster recovery: building production-grade data backup and restore, and disaster recovery solutions. Up to petabytes in scale.

Additional qualifications:

  • Experience writing software in one or more languages such as Python, Java, Scala, or Go
  • Experience building production-grade data solutions (relational and NoSQL)
  • Experience with systems monitoring/alerting, capacity planning and performance tuning
  • Experience in technical consulting or customer-facing role

*W2 ONLY, no sponsorship or third parties available at this time*