Please find below an opportunity with our client, and let me know if you are interested.
Position: Big Data / Cloud Data Engineer
Location: Chicago, IL (remote until COVID ends)
Duration: 12 Months
100% remote until COVID subsides; after that, relocation to Chicago is a MUST.
We are looking for strong Hadoop data engineering talent to join the company's data integration team. We are building a new application integration platform that will enable bi-directional, real-time integration between SAP, MDM, and Salesforce, and will also bring new data into the current Enterprise Data Lake. The role offers the opportunity to be a key part of a challenging, fast-paced environment, to build core data integration offerings from the ground up, and to shape the technology roadmap in our high-growth, knowledge-driven team.
Responsibilities:
- Designs and develops multiple, diversified applications on a big data platform leveraging hybrid clouds. These applications are primarily used for parsing, analyzing, discovering, and visualizing potential business insights.
- Works with key data stakeholders and technical teams to deliver optimal solutions and services. Provides strategic and tactical direction in the delivery of big data solutions.
- Works with various cross-functional teams, such as infrastructure, data, and enterprise architects, to build scalable, optimal, self-service analytics solutions, both on premises and in the cloud.
- Conducts data profiling, cataloging, and mapping for the technical design and construction of data flows.
Qualifications:
- Bachelor's degree in Computer Science, Mathematics, or Engineering
- 7+ years of experience in data, analytics, and data integration
- 5+ years of experience with big data stacks and solutions, including cloud technologies
- 5+ years of designing, implementing, and successfully operationalizing large-scale data lake solutions in production environments using a big data stack (on-prem and Azure)
- 3+ years of experience architecting and implementing end-to-end Azure cloud big data solutions
- 3+ years of experience implementing real-time solutions and data integrations
- Experience with a big data management tool (Zaloni) is a plus
- Hands-on experience building and optimizing data pipelines, including CI/CD, integrated build and deployment automation, configuration management, and test automation solutions
- Professional training and certifications in various big data solutions (preferred)
- Solid understanding of the Azure cloud stack, including ADF Data Flows, Event Hubs, Databricks, HDInsight, and Azure DevOps
- Deep hands-on experience with Hadoop, Hive, HBase, Spark, Kafka, Snowflake, Python, R, SQL, Java, Scala, Zeppelin, RStudio, Spark RDDs and DataFrames, Ambari, Ranger, Kerberos, Atlas, Collibra, etc.
- Informatica MDM, BDQ, BDM, and ETL architecture experience is a plus
- Communication and presentation skills, to articulate functionality, issues, and risks to business communities
Shravan Kallem | IT Recruitment - US Staffing
Charter Global Technologies |
One Glenlake Parkway | Suite 525 | Atlanta, GA 30328
- provided by Dice