Help at Home is the leading national provider of in-home personal care services, where our mission is to enable individuals to live with independence and dignity at home. Our team supports 66,000 clients monthly with the help of 49,000 compassionate caregivers across 12 states. We're looking for people who care about others, who are willing to listen, lean in and make impactful change. Each role at Help at Home can have a positive impact in supporting our caregivers and clients. If you are someone who leads with passion and integrity and are looking to join a rapidly growing, industry-leading team, Help at Home may be a good fit for you.

Job Summary:
The Data Engineer is responsible for delivering data warehouse solutions by building enterprise data models and writing ETL/ELT processes that map, cleanse, and standardize data from multiple source systems to populate those models for business consumption. They will work throughout a multi-layered data warehouse environment to support a wide variety of business needs. This role is responsible for delivering solutions that meet our growing business needs as they relate to our enterprise data and analytics strategy for Care Coordination and Help at Home. The ideal candidate is comfortable driving the creation of a platform with a focus on Data Mesh architecture.
As a key member of the team:
- You are flexible and can embrace change
- You value progress over perfection
- You care about your work, the team you're on, and the people we are helping
- You make it a priority to get to know the people around you - build relationships with your colleagues and business partners
- You say what needs to be said, while considering how it'll affect culture and output
- You hold others to a high standard
Responsibilities:
- Leverages change data capture (CDC) to optimize the ETL/ELT processes they develop, including developing routines to identify changed records when they are not provided by the source system
- Creates GitHub Actions workflows and build pipelines for dev, stage, and prod environments
- Communicates effectively with stakeholders to understand business requirements, translates those requirements into technical designs and solutions, and conveys the requirements and designs to team members
- Profiles source system data and assesses its data quality in order to design and develop solutions that improve the data quality so it maps properly into the data warehouse structures and meets data warehouse standards
- Works with the business to define survivorship rules, builds the golden record based on those rules, and then builds and maintains structures for the integrated dimensions and facts; understands master data guiding principles and best practices for the technical de-duplication process, including enhancing data quality to support matching and grouping
- In alignment with Data Mesh principles, builds ingestion, integration, and sharing patterns and frameworks for better data access
- Maps source system data structures into the data warehouse data model (source-to-target mapping) and enhances the data warehouse data model as needed to meet business needs
- Improves our overall data security posture and strengthens our SDLC and DevOps strategy in support of sustained business growth
- Maintains knowledge of current trends and developments in the field and actively explores emerging technologies
Education and Experience:
- Cloud-first mindset
- Ability to work in a fast-paced, dynamic environment delivering solutions that significantly impact the business
- Knowledge of testing frameworks and TDD or BDD
- Self-starter, self-managed, quick learner, problem-solver with a positive, collaborative, and team-based attitude who is willing to support and teach fellow team members
- Strong data analysis skills
- Strong relational database skills, including advanced SQL knowledge and the ability to create complex queries and stored procedures
- Strong understanding of data warehouse and business intelligence design principles and industry best practices, including relational and dimensional modeling and ETL/ELT methods
- Understanding of trunk-based development
- Working knowledge of Snowflake and JSON
- Experience with AWS architecture, development, security, and networking
- 3+ years of experience in data engineering required
- Bachelor's degree in Computer Science, Data & Analytics, Information Management, Healthcare Informatics, Business Administration, Statistics, or a related field required
- Demonstrated experience with automation
- Demonstrated experience with one or more of the following languages: Go, Python, TypeScript
- Strong experience with AWS services such as S3, Lambda, Glue, EMR, CloudFormation, MWAA, Kinesis, and MSK
- Cloud (AWS), data warehousing, and Snowflake experience preferred