Sr. Big Data Engineer

Posted: 13 days ago · Location: United States · Salary: undisclosed

Job Description

The Big Data Engineer will create and manage the uninterrupted flow of information by designing and maintaining pipelines that make data easily accessible across the enterprise. You will build automated data pipelines to ingest, store, process, and analyze our data and data systems. This includes building and maintaining the data structures and architectures for data ingestion, processing, and deployment for large-scale, data-intensive applications. The Big Data Engineer must ensure that optimal ETL/ELT solutions are developed by applying best practices to data modeling, code development, and automation.

Key Responsibilities

• Design, develop, and maintain an optimal data pipeline architecture using both structured data sources and big data, for both on-premises and cloud-based environments, in both streaming and real time.
• Develop and automate ETL code using scripting languages, ETL tools, and job-scheduling software to support all reporting and analytical data needs.
• Design and build dimensional data models to support data warehouse initiatives.
• Assemble large, complex data sets that meet the analytical needs of the data science team.
• Assess new data sources to better understand the availability and quality of data.
• Identify, design, and implement internal process improvements: automating manual processes, optimizing data pipeline performance, and re-designing infrastructure for greater scalability and access to information.
• Participate in requirements-gathering sessions to distill technical requirements from business requests.
• Collaborate with business partners to productionize, optimize, and scale enterprise analytics.
• Collaborate with data architects and modelers on data store designs and best practices.

Education/Certifications

• Bachelor's degree in Computer Science, Engineering, Information Science, Math, or a related discipline
• Data engineering, data management, or cloud certification is a plus

Experience/Minimum Requirements

• Five (5)+ years' experience with traditional and modern Big Data technologies (HDFS, Hadoop, Hive, Pig, Sqoop, Kafka, Apache Spark, HBase, Oozie, NoSQL databases, PostgreSQL, Git, Python, REST APIs, Snowflake, etc.)
• Two (2)+ years' experience building data platforms using the Azure stack (Azure Data Factory, Azure Databricks, etc.)
• Experience with object-oriented/object-function scripting languages: Python, Java, C++, Scala
• Experience extracting/querying/joining large data sets at scale
• Experience using Snowflake to build data marts with the data residing in Azure storage is a plus

Other Skills/Abilities

• Thorough understanding of relational, columnar, and NoSQL database architectures and industry best practices for development
• Understanding of dimensional data modeling for designing and building data warehouses
• Excellent advanced SQL coding and performance-tuning skills
• Experience parsing data formats such as XML/JSON and leveraging external APIs
• Understanding of agile development methodologies
• Ability to work in a team-oriented, collaborative environment; good interpersonal skills
• Strong analytical and problem-solving skills; ability to weigh various suggested technical solutions against the original business needs and choose the most cost-effective solution
• Keen attention to detail and ability to assess the impact of design changes prior to implementation
• Self-driven, highly motivated, and able to learn quickly
• Ability to effectively prioritize and execute tasks in a high-pressure environment
• Strong customer service orientation
• Ability to present and explain technical information to diverse audiences in a way that establishes rapport and gains understanding
• Work experience with geospatial data and spatial analytics is preferred

Working Conditions

Works in a normal office setting with no exposure to adverse environmental conditions. During COVID-19 conditions, this role will work remotely to mitigate risk. Provides off-hours support for all developed data pipelines in an on-call rotation.