Data Architect (Remote)

Posted 2 days ago United States Salary undisclosed

Job Description

Overview: Help us architect and innovate the future together! Quadax, an award-winning leader in healthcare revenue cycle technology, is seeking a permanent, full-time Data Architect to join a greenfield Enterprise Architecture and Application Development program built on proven, industry-leading design techniques, frameworks, and leadership, helping create the next generation of Quadax products. Are you an innovative thought leader who understands technology trends and how to leverage them to help Quadax deliver award-winning revenue cycle optimization to providers and partners?
The Data Architect, an emerging role on the Quadax enterprise architecture team, will play a pivotal part in operationalizing the most urgent data initiatives for Quadax digital business initiatives. The bulk of the Data Architect's work will be building greenfield architectures, managing and optimizing data pipelines, and then moving these pipelines effectively into production for key data and analytics consumers (business/data analysts, DevOps engineers, data scientists, or any persona that needs curated data for data and analytics use cases).
Data Architects must also guarantee compliance with data governance and data security requirements while creating, improving and operationalizing these integrated and reusable data pipelines. This enables faster data access, integrated data reuse and vastly improved time-to-solution for Quadax data initiatives. The Data Architect will be measured on their ability to integrate analytics and/or data science results with new Quadax business models and processes. We are able to hire remote employees in OH, PA, MI, IN, KY, WV, TN, GA, FL, TX, MO, SD, VA, or NC only.
Responsibilities:
  • Work with large, heterogeneous datasets, building and optimizing data pipelines, pipeline architectures and integrated datasets using modern data integration technologies
  • Conduct a detailed assessment of the data landscape (data platforms, technology architecture, data flows, data consumption, integrations) and document the current state
  • Consult with application project teams and act as a project advisor to other domain functions
  • Develop future state architecture and process/data flows to realize the modern data strategy
  • Develop proof-of-concepts and present ideas
  • Develop and design architectural blueprints and an evolutionary roadmap that define and communicate the strategic direction for IT architectures in support of business and technical strategies
  • Develop and document detailed source-to-target data mapping and data transformation rules
  • Develop and enhance IT standards, principles, policies and guidelines and secure their endorsement by IT Management
  • Develop, support and continually improve the Enterprise Architecture (EA) practice
  • Drive digital innovation by leveraging innovative new technologies and approaches to renovate, extend, and transform the existing core technology base and IT estate
  • Elicit requirements using interviews, document analysis, requirements workshops, surveys, site visits, business process descriptions, use cases, scenarios, business analysis, and task and workflow analysis
  • Ensure compliance with approved IT standards, principles, policies and guidelines in the design of projects, applications and solutions
  • Define architecture scope, and lead and develop conceptual, logical and physical architectures for assigned projects and applications
  • Introduce and accelerate new digital transformation technologies and methodologies to support technology transformation
  • Participate in the definition, documentation and presentation of Data Design Standards, Patterns and Best Practices
  • Research, evaluate, and prototype new methodologies, technologies, and products
  • Support technology direction, build consensus, and drive architectural decisions to deliver superior revenue cycle optimization (RCO) solutions
  • Support the migration of data from legacy systems to new solutions
  • Translate business requirements, using patterns and standards, into application level design artifacts
  • Work with IT Domains to determine the right design and architecture for the project and applications
Qualifications:
Position Requirements:
  • 5+ years of design, implementation and maintenance of complex business applications
  • A Bachelor's degree in Computer Science, Computer Engineering, Information Systems or a related field, or equivalent relevant experience
  • Collaborate and communicate effectively with business and IT groups such as project managers, architects, and developers, both internal and external
  • Define, document and research requirements on departmental or multi-team projects of low to medium complexity
  • Design aspects of data architecture applications, including components such as user interface, middleware and infrastructure, with low to medium complexity
  • Experience in data governance and operating models of low to medium complexity
  • Experience working with and optimizing existing ETL processes, data integration and data preparation
  • Identify linkages between data architecture, business requirements and application architecture components with low to medium complexity
  • Strong presentation, verbal, and written communication skills
  • Understand master data strategies and information hubs with low to medium complexity
Preferred:
  • Basic experience with popular data discovery, analytics and BI tools such as Tableau, Qlik, Power BI and others for semantic-layer-based data discovery
  • Experience working with open-source and commercial message queuing technologies such as Kafka, JMS, RabbitMQ, Azure Service Bus, Amazon Simple Queue Service and others; stream data integration technologies such as Apache NiFi, Apache Beam, Kafka Streams, Amazon Kinesis and others; and stream analytics technologies such as Kafka KSQL, Apache Spark Streaming, Apache Samza and others
  • Experience with IT compliance and risk management requirements (e.g., security, privacy, SOX, HIPAA)
  • Knowledge of Big Data and Cloud technologies such as Hortonworks, Cloudera, AWS, MS Azure, Google Cloud, Visualization Tools, NoSQL, Graph databases
  • Knowledge of ANSI X12 270/271 and 278 transaction sets and HL7 ADT transaction sets
  • Prior experience in Healthcare Revenue Cycle Management
  • Strong ability to design, build and manage data pipelines for data structures, encompassing data transformation, data models, schemas, metadata and workload management, and the ability to work with both IT and the business to integrate analytics and data science output into business processes and workflows
  • Strong experience working with large, heterogeneous datasets, building and optimizing data pipelines, pipeline architectures and integrated datasets using traditional data integration technologies, including ETL/ELT, data replication/CDC, message-oriented data movement, API design and access, and upcoming data
  • Strong experience with popular database programming languages, including SQL, PL/SQL and others, for relational databases, and certifications on emerging NoSQL/Hadoop-oriented databases such as MongoDB, Cassandra and others for non-relational databases