We are hiring a Principal Data Architect who will play a pivotal role in designing, implementing, and optimizing our data architecture to support the ingestion, processing, and analysis of healthcare data. This role requires a deep understanding of healthcare data standards, data integration, and advanced analytics to drive value-based care initiatives.
The ideal candidate has specific experience working with data in the healthcare, health-tech, or life sciences commercial space.
What you'll do:
- Create scalable and efficient data models and architectures to support the ingestion and processing of healthcare data from payers and providers.
- Architect and implement robust data integration solutions to ensure seamless data flow between various healthcare systems and platforms, both internally and externally.
- Work closely with cross-functional teams, including product managers, engineers, and business stakeholders, to understand data requirements and deliver solutions that meet business needs.
- Implement data modeling, warehousing, and optimization best practices to enhance performance.
- Continuously monitor and optimize data architecture for performance, scalability, and reliability.
- Establish best practices for Git-based data version control and DevOps integration.
- Develop and maintain comprehensive documentation of data architecture and provide training to team members on best practices and standards.
- Ensure compliance with HIPAA, HITRUST, and other healthcare data regulations.
What you'll need:
- Demonstrated experience in data architecture, data engineering, or related roles in the healthcare or life sciences industry.
- Demonstrated experience with data modeling and engineering of claims and/or EHR data.
- Proficiency in Python, SQL and NoSQL databases, ETL/ELT processes, data warehousing, and cloud platforms (e.g., Azure, AWS, GCP).
- Strong understanding of healthcare data standards (e.g., HL7, FHIR, OMOP) and regulations (e.g., HIPAA, HITRUST).
- Understanding of healthcare insurance claims and risk adjustment processes.
- Strong top-down approach to problem-solving, with the ability to think critically and strategically.
- Experience with Azure Data Services, Microsoft Fabric, Databricks, and SQL-based architectures.
- Experience deploying code through CI/CD pipelines and working with DevOps practices and Git-based workflows.
- Excellent communication and collaboration skills, with the ability to work effectively with diverse stakeholders.