Senior Data Engineer, Platform - divvyDOSE - Chicago, IL or Telecommute

Job Description

divvyDOSE is a rapidly growing healthcare startup headquartered in Chicago. Our vision is a life where medicine does what it's supposed to, and people get the attention and care they deserve. We strive to improve the quality of life through innovative design and compassionate customer service that allows medicine to get out of the way of our customers' lives.

divvyDOSE is seeking an innovative, passionate, and positive-minded Senior Data Engineer to join our rapidly growing Data Platform Engineering Team. We are the Center of Excellence for data and analytics engineering, turning millions of data points into insights and data sets that power key business and product decisions to help build a best-in-class digital pharmacy.

As a senior member of the Data Platform Engineering team, you will build and deploy platform-level tools and applications that power high-quality data products and the infrastructure to support them. You will have the opportunity to collaborate with our Data Architect, Data Scientists, Data Analysts, and Software Engineers focused on developing multiple areas of our product as well as the platform itself. You will implement data pipelines with requirements for high scalability, availability, security, and quality.

We believe that data is a first-class concern and not just a byproduct of our day-to-day processes. Data Platform Engineering at divvyDOSE focuses on self-service data strategies that encourage everyone to be data-driven. Above all, we believe in empowering all of our engineers through good DevOps and DataOps practices; on the Platform Engineering team, your focus will be on crafting an environment that is a joy for technical and non-technical minds to build upon.

You'll enjoy the flexibility to telecommute* from anywhere within the U.S. as you take on some tough challenges.

Primary Responsibilities:

  • Help build a scalable data platform to accelerate data ingestion, processing, orchestration, discoverability, and usage for Engineering, Product, and Business teams
  • Help define, build, and own key datasets, maintaining their quality and evolution in a Data Catalog as use cases grow
  • Implement data ingestion and processing frameworks, both real-time and batch, applying best practices in data modeling and ETL/ELT processes and leveraging AWS technologies and big data tools
  • Collaborate with product engineers to uphold a Data Mesh architecture
  • Collaborate with data scientists to create rich data sets for optimization, statistical analysis, prediction, clustering and machine learning
  • Help continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for data consumers
  • Help junior data engineers use and adopt new tools and best practices
  • Develop and maintain automated solutions, tools, libraries, and/or infrastructure related to the following areas:

    ○ Data Ingestion & Processing
    ○ Data Quality
    ○ Data Modeling
    ○ Data Versioning & Management
    ○ Data Security & Compliance

You'll be rewarded and recognized for your performance in an environment that will challenge you and give you clear direction on what it takes to succeed in your role, as well as provide development for other roles you may be interested in.

Required Qualifications:

  • 4+ years of experience in software engineering with a deep understanding of SDLC and agile practices
  • 2+ years of experience in a data engineering or data science role with a deep understanding of DevOps, DataOps, and/or MLOps processes
  • Proficient in one or more of the following: Python, Scala, Java
  • Proficient in SQL
  • Experience with SQL/NoSQL databases (PostgreSQL, MySQL, DynamoDB, MongoDB, Cassandra, Bigtable)
  • Experience building scalable data and analytics pipelines using cloud technologies (AWS, Azure, GCP)
  • Experience with a cloud data warehouse (e.g., Snowflake, BigQuery, Redshift)
  • Experience with streaming technologies (e.g., Kinesis, Flink, Dataflow, Pub/Sub)
  • Experience with CI/CD technologies (e.g., Jenkins, CircleCI, Bamboo, Bitbucket)
  • Experience with dataflow orchestration tools (e.g., Airflow, Luigi, Prefect, Dagster)
  • If you need to enter a work site for any reason, you will be required to screen for symptoms using the ProtectWell mobile app, Interactive Voice Response (i.e., entering your symptoms via phone system) or a similar UnitedHealth Group-approved symptom screener. Employees must comply with any state and local masking orders. In addition, when in a UnitedHealth Group building, employees are expected to wear a mask in areas where physical distancing cannot be attained.

Preferred Qualifications:

  • Experience with IoT
  • Experience with HIPAA, PII, and/or PHI data
  • Experience with dbt
  • Experience with Looker/LookML
  • Experience with ML/AI infrastructure and lifecycles
  • Experience with data migration processes
  • Experience with SageMaker or Google AI Platform
  • Experience with containerized services
  • Experience with graph databases

Technologies we use:

  • Python and SQL
  • Snowflake
  • Looker
  • Kinesis
  • Git
  • Terraform
  • RDS (PostgreSQL), DynamoDB, and Elasticsearch
  • CircleCI and Jenkins
  • AWS Lambda (Serverless framework)
  • Datadog

Careers with Optum. Here's the idea. We built an entire organization around one giant objective: make health care work better for everyone. So when it comes to how we use the world's large accumulation of health-related information, or guide health and lifestyle choices or manage pharmacy benefits for millions, our first goal is to leap beyond the status quo and uncover new ways to serve. Optum, part of the UnitedHealth Group family of businesses, brings together some of the greatest minds and most advanced ideas on where health care has to go in order to reach its fullest potential. For you, that means working on high-performance teams against sophisticated challenges that matter. Optum, incredible ideas in one incredible company and a singular opportunity to do your life's best work.(sm)

*All Telecommuters will be required to adhere to UnitedHealth Group's Telecommuter Policy.

Colorado Residents Only: The salary range for Colorado residents is $79,700 to $142,600. Pay is based on several factors including but not limited to education, work experience, certifications, etc. As of the date of this posting, in addition to your salary, UHG offers the following benefits for this position, subject to applicable eligibility requirements: Health, dental, and vision plans; wellness program; flexible spending accounts; paid parking or public transportation costs; 401(k) retirement plan; employee stock purchase plan; life insurance, short-term disability insurance, and long-term disability insurance; business travel accident insurance; Employee Assistance Program; PTO; and employee-paid critical illness and accident insurance.

Diversity creates a healthier atmosphere: UnitedHealth Group is an Equal Employment Opportunity / Affirmative Action employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, age, national origin, protected veteran status, disability status, sexual orientation, gender identity or expression, marital status, genetic information, or any other characteristic protected by law.

UnitedHealth Group is a drug-free workplace. Candidates are required to pass a drug test before beginning employment.

Job Keywords: Senior Data Engineer, Platform, divvyDose, Chicago, IL, Illinois, Telecommute, Telecommuter, Telecommuting, Work from home, work at home, remote