JLA

Big Data/Azure Engineer/Contractor

Posted: 9 days ago | Location: United States | Salary: undisclosed

Job Description

Seeking a candidate with strong skills in core areas of Big Data in Azure cloud environments: Data Lake file systems, Apache Hadoop and its ancillary distributions, ELK, data movement and transformation tools such as Databricks and Azure Data Factory, and exposure to Machine Learning and AI tool sets. The candidate should possess strong communication skills, strong analytical aptitude with critical thinking, and a solid understanding of reporting and dashboarding capabilities and the tools and platforms that support them. The role requires advanced skills that enable the individual to deliver a high level of quality in ingesting, persisting, and archiving vast amounts of enterprise data, to meet the expectations of the other teams within Global Data Analytics (GDA), and to support the broader Global Technology Services (GTS) organization for Data Management.

Responsibilities

Role-Specific Responsibilities
· Build, test, and run data assets tied to tasks and user stories from the Azure DevOps instance of GDA
· Bring a level of technical expertise in the Big Data space that contributes to the strategic roadmaps for Enterprise Data Architecture, Global Data Cloud Architecture, and Global Business Intelligence Architecture, as well as to the development of the broader GDA Engineering community
· Actively participate in regularly scheduled contact calls with the GDA management team to transparently review the status of in-flight projects, the priorities of backlog projects, and the adoption of previous GDA deliveries
· Work to resolve any issues with the portfolio of deliveries that are part of the Big Data team's work pipeline
· Act as the last line of defense in ensuring deliveries are of the appropriate level of quality for downstream work to continue and for users' expectations to be met
Knowledge Sharing & Documentation
· Contribute to, produce, and maintain processes, procedures, and operational and architectural documentation
· Change Control – ensure compliance with processes and adherence to standards and documentation
· Assist in mentoring other members of the GDA team
· Adoption – lead efforts to communicate and support overall adoption of new and/or enhanced Big Data capabilities

Qualifications

Education (degree): Bachelor's Degree, or College Diploma in Computer Science, or equivalent industry experience

Years of Experience
· 5+ years of demonstrated delivery experience and technical knowledge of Big Data environments and tool sets, particularly Microsoft Azure, ELK, and Apache Hadoop

Outcomes

First Month Critical Outcomes
· Absorb strategic projects from the backlog and complete the related Big Data engineering work
· Inspect existing run-state Big Data assets and identify optimizations for any potential development
· Deliver new Big Data assets as assigned