"Big Data Engineer"

Posted: 4 days ago | Location: United States | Salary: undisclosed

Job Description

Hello,

Please find the JD below and let me know if you have any matching resources.

Big Data Engineer
Duration: 12-month contract, with potential to extend and/or convert
Location: Remote (client site located in Nashville, TN)
Required Skills: Azure Data Factory, Databricks, Logic Apps, SQL Server

Overview
Seeking a candidate with strong skills in the core areas of Big Data in Azure cloud environments: Data Lake file systems, Apache Hadoop and its ancillary distributions, ELK, and data movement and transformation tools such as Databricks and Azure Data Factory, with exposure to Machine Learning and AI tool sets. The candidate should possess strong communication skills, strong analytical aptitude with critical thinking, and a solid understanding of reporting and dashboarding capabilities and of the tools and platforms that support them. The role requires advanced skills in ingesting, persisting, and archiving vast amounts of enterprise data at a high level of quality, meeting the expectations of the other teams within Global Data Analytics (GDA) and supporting the broader Global Technology Services (GTS) organization for Data Management.

Role-Specific Responsibilities
- Build, test, and run data assets tied to tasks and user stories from the GDA Azure DevOps instance.
- Bring a level of technical expertise in the Big Data space that contributes to the strategic roadmaps for Enterprise Data Architecture, Global Data Cloud Architecture, and Global Business Intelligence Architecture, as well as to the development of the broader GDA engineering community.
- Actively participate in regularly scheduled contact calls with the GDA management team to transparently review the status of in-flight projects, the priorities of backlog projects, and the adoption of previous GDA deliveries.
- Work to resolve any issues with the portfolio of deliveries in the Big Data team's work pipeline.
- Act as the last line of defense in ensuring that deliveries meet the level of quality needed for downstream work to continue and for users' expectations to be met.

Knowledge Sharing and Documentation
- Contribute to, produce, and maintain process, procedure, operational, and architectural documentation.
- Change control: ensure compliance with processes and adherence to standards and documentation.
- Assist in mentoring other members of the GDA team.
- Adoption: lead efforts to communicate and support the overall adoption of new and/or enhanced Big Data capabilities.

Qualifications
- Education: Bachelor's degree, college diploma in Computer Science, or equivalent industry experience.
- Experience: 7+ years of demonstrated delivery experience with technical knowledge of Big Data environments and tool sets, particularly Microsoft Azure, ELK, and Apache Hadoop.

First-Month Critical Outcomes
- Absorb strategic projects from the backlog and complete the related Big Data engineering work.
- Inspect existing run-state Big Data assets and identify optimizations and potential development work.
- Deliver new Big Data assets as assigned.

Thanks & Regards,
Raviteja
Veritis Group, Inc.
1231 Greenway Drive, Suite 1040, Irving, TX 75038
www.veritis.com
A Certified MBE | AWS Select Partner | HashiCorp Partner | Docker Partner
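For candidates unfamiliar with the stack named above, here is a minimal sketch of the kind of ingest-transform-persist work the role describes, assuming PySpark on Databricks; the storage account, container, path, and table name are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch of a Databricks ingestion job (illustrative only).
# The ADLS Gen2 path and table name below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-ingest").getOrCreate()

# Read raw landing-zone CSV files from a (hypothetical) Data Lake path.
raw = (spark.read
       .option("header", "true")
       .csv("abfss://landing@examplelake.dfs.core.windows.net/sales/2024/"))

# Light transformation: normalize column names and stamp ingestion time.
cleaned = (raw.toDF(*[c.strip().lower().replace(" ", "_") for c in raw.columns])
              .withColumn("ingested_at", F.current_timestamp()))

# Persist as a Delta table (Delta is bundled with the Databricks runtime)
# so downstream analytics and dashboarding teams can consume it.
(cleaned.write
        .format("delta")
        .mode("append")
        .saveAsTable("gda.sales_raw"))
```

In practice, a job like this would typically be scheduled or triggered from an Azure Data Factory pipeline, which handles orchestration while Databricks handles the transformation work.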