Help at Home is the leading national provider of in-home personal care services, where our mission is to enable individuals to live with independence and dignity at home. Our team supports 66,000 clients monthly with the help of 49,000 compassionate caregivers across 12 states. We're looking for people who care about others and who are willing to listen, lean in, and make impactful change. Each role at Help at Home can have a positive impact in supporting our caregivers and clients. If you are someone who leads with passion and integrity and is looking to join a rapidly growing, industry-leading team, Help at Home may be a good fit for you.
Job Summary: Under the guidance of the VP of Data Management, the Sr. Data Engineer is responsible for delivering data warehouse solutions by building enterprise data models and writing ETL/ELT processes that map, cleanse, and optimize data from multiple source systems to populate those models for business consumption. This role delivers solutions that meet our growing business needs as they relate to our enterprise data and analytics strategy for Care Coordination and Help at Home. The ideal candidate is comfortable leading teams and driving the creation of a platform with a focus on Data Mesh architecture.
As a key member of the team:
- You are flexible and can embrace change
- You value progress over perfection
- You care about your work, the team you're on, and the people we are helping
- You make it a priority to get to know the people around you, building relationships with your colleagues and business partners
- You say what needs to be said, while considering how it'll affect culture and output
- You hold others to a high standard
Duties/Responsibilities:
- Leads platform and cloud engineering
- Drives the SDLC toward full automation while upholding code and release quality
- Understands and implements data warehouse best practices such as change data capture (CDC) and slowly changing dimensions (SCD); a brief sketch follows this list
- Works closely with stakeholders to understand business requirements and collaborates with data team members to design and develop flexible, reusable solutions consistent with data warehouse architecture standards
- Translates business requirements into technical designs and solutions and clearly conveys those requirements and designs to team members
- Identifies releases that do not meet defined standards for code, data quality, etc., and delays or blocks their implementation
- Improves our overall data security posture; strengthens our SDLC and DevOps strategy in support of sustained business growth
- In alignment with Data Mesh, builds ingestion, integration, and sharing patterns and frameworks for better data access
- Maintains knowledge of current trends and developments in the field and actively explores emerging technologies
- Builds frameworks and promotes common patterns
- Promotes and helps with the adoption of infrastructure as code (IaC)
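To ground the CDC/SCD responsibility above, here is a minimal, illustrative Python sketch of a Type 2 slowly changing dimension update. The record layout (client_id, address) and the apply_scd2 helper are hypothetical, not part of Help at Home's actual stack; in practice this logic would typically run as a MERGE inside the warehouse.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class DimClientRow:
    client_id: str            # natural (business) key
    address: str              # tracked attribute (hypothetical)
    valid_from: date
    valid_to: Optional[date]  # None means this row is the current version
    is_current: bool

def apply_scd2(dimension: list[DimClientRow], change: dict, as_of: date) -> None:
    """Apply one CDC change record Type 2 style: expire the current
    version and append a new one when a tracked attribute changed."""
    current = next(
        (r for r in dimension if r.client_id == change["client_id"] and r.is_current),
        None,
    )
    if current and current.address == change["address"]:
        return  # no attribute change; keep history as-is
    if current:
        current.valid_to = as_of  # close out the prior version
        current.is_current = False
    dimension.append(
        DimClientRow(change["client_id"], change["address"], as_of, None, True)
    )

# Example: an address change yields two rows, preserving history.
dim: list[DimClientRow] = []
apply_scd2(dim, {"client_id": "C-1001", "address": "12 Oak St"}, date(2024, 1, 1))
apply_scd2(dim, {"client_id": "C-1001", "address": "98 Elm Ave"}, date(2024, 6, 1))
assert len(dim) == 2 and dim[1].is_current
```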
Required Skills/Abilities:
- Cloud-first mindset
- Ability to work in a fast-paced, dynamic environment delivering solutions that significantly impact the business
- Familiarity with Salesforce integrations using AWS services
- Knowledge of Apache Airflow or Amazon MWAA (Managed Workflows for Apache Airflow)
- Knowledge of Spark, Kafka, and other batch and stream processing platforms, including Amazon Kinesis
- Knowledge of testing frameworks and TDD or BDD
- Self-starter, self-managed, quick learner, problem-solver with a positive, collaborative, and team-based attitude who is willing to support and teach fellow team members
- Strong data analysis skills
- Strong relational database skills, including advanced SQL knowledge and the ability to create complex queries and stored procedures
- Strong understanding of data warehouse and business intelligence design principles and industry best practices, including relational and dimensional modeling and ETL/ELT methods
- Understanding of trunk-based development
- Working knowledge of Snowflake, including handling semi-structured (JSON) data (see the sketch below)
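As a small illustration of the JSON handling mentioned above, this Python sketch flattens a nested payload of the kind often landed in a Snowflake VARIANT column into relational rows, analogous to Snowflake's LATERAL FLATTEN. All field names here are hypothetical.

```python
import json

# Hypothetical payload of the kind often landed in a Snowflake VARIANT column.
raw = json.loads("""
{
  "client_id": "C-1001",
  "visits": [
    {"visit_date": "2024-01-02", "caregiver_id": "CG-77", "hours": 4.0},
    {"visit_date": "2024-01-09", "caregiver_id": "CG-82", "hours": 3.5}
  ]
}
""")

# Flatten the nested array into one relational row per visit,
# analogous to LATERAL FLATTEN over a VARIANT column in Snowflake SQL.
rows = [{"client_id": raw["client_id"], **visit} for visit in raw["visits"]]

for row in rows:
    print(row)
```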
Education and Experience:
- AWS architecture, developer, security, and networking experience.
- 7+ years of experience in data engineering required.
- Bachelor's Degree in Computer Science, Data & Analytics, Information Management, Healthcare Informatics, Business Administration, Statistics, or related field required.
- Cloud (AWS), Warehousing, and Snowflake experience required.
- Demonstrated experience with automation.
- Demonstrated experience with one or more of the following languages: Go, Python, TypeScript.
- Strong experience with various AWS services such as S3, Lambda, Glue, EMR, CloudFormation, MWAA, Kinesis, and MSK.
- Strong experience working with a variety of data warehousing models and design fundamentals (e.g., Kimball, Inmon).
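To illustrate the Kimball-style dimensional modeling referenced above, here is a minimal Python sketch that conforms a dimension with surrogate keys and loads additive measures into a fact table. The source records and field names are hypothetical, used only to show the pattern.

```python
from itertools import count

# Hypothetical source records; all field names are illustrative only.
source = [
    {"client": "C-1001", "state": "IL", "visit_hours": 4.0},
    {"client": "C-1002", "state": "IN", "visit_hours": 3.5},
    {"client": "C-1001", "state": "IL", "visit_hours": 2.0},
]

surrogate_keys = count(1)
dim_client: dict[str, dict] = {}  # natural key -> dimension row
fact_visit: list[dict] = []       # grain: one row per visit

for rec in source:
    # Conform the dimension: one row per client, keyed by a surrogate key.
    if rec["client"] not in dim_client:
        dim_client[rec["client"]] = {
            "client_key": next(surrogate_keys),  # surrogate key
            "client_id": rec["client"],          # natural key
            "state": rec["state"],               # descriptive attribute
        }
    # The fact row carries the surrogate key plus additive measures only.
    fact_visit.append({
        "client_key": dim_client[rec["client"]]["client_key"],
        "visit_hours": rec["visit_hours"],
    })

print(list(dim_client.values()))  # two dimension rows
print(fact_visit)                 # three fact rows keyed by client_key
```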