REMOTE: Lead DevOps Data Engineer

Posted 11 days ago · United States · Salary undisclosed

Job Description

REMOTE POSITION: Lead DevOps Data Engineer with experience designing and building data pipelines (Python & AWS)

Open to remote; the candidate must adhere to an EST work schedule.

ROLE: Data and Analytics is an evolving space that increasingly draws on software engineering, distributed systems, and cloud skills.
  • On this team, you'll develop, maintain, and enhance data platform capabilities in an open and collaborative environment to build the central platform.
  • You will collaborate with internal data customers across IT and the Business to minimize the time from idea inception to analytical insight.
Some of the job responsibilities will be:
  • Leading data infrastructure design efforts; collaborating with other platform teams to integrate infrastructure into the client's systems; and researching the feasibility and effectiveness of various technology options and making recommendations
  • Designing complex tools and solutions to manage a wide variety of data and to implement machine learning solutions, which includes the orchestration, data pipeline, and infrastructure-as-code solutions the Data Engineering team builds
Required skills:
  • Proven experience designing, building, and supporting complex machine learning pipelines
  • Software engineering practices, including version control
  • Advanced programming experience in languages used in analytics and data science (e.g., Python, Java, Scala); comfortable with Linux environments and shell scripting
  • Experience with cloud-based infrastructure (AWS) and infrastructure as code (Terraform is used at the client; Ansible is also acceptable)
  • Skilled in Linux, Bash, and SQL
  • Experienced with Kubernetes
  • Experience deploying machine learning models for real-time use cases
  • Understanding of machine learning algorithms
  • Experience designing and evaluating approaches to high-volume, real-time data streams
  • Foundation in computer science, system architecture, and statistical/quantitative modeling, with the ability to process large volumes of structured and unstructured data
  • Verbal communication
Preferred skills and experiences:
  • Analysis
  • API Development
  • CI/CD
  • Distributed Systems
  • Domain Knowledge
  • Visual Communication
The education and/or experience listed below is the minimum requirement for job entry.
  • Bachelor's Degree or higher in an Information Technology discipline or related field of study, and a minimum of five years of work experience designing, programming, and supporting software programs or applications.
  • In lieu of a degree, a minimum of six years of related work experience designing, programming, and supporting software programs or applications may be accepted.