Overview

Responsibilities:

Basic Qualifications:
- Strong Python scripting skills (4+ years minimum)
- Hands-on experience implementing an AWS big data lake using EMR and Spark
- Strong experience with Snowflake database architecture, SQL, and database performance tuning/optimization
- Experience with Airflow, including DAG creation and job orchestration
- Media AdSales experience is a plus
- Experience leveraging open source big data processing frameworks, such as Apache Spark
- Experience developing and deploying data pipelines within a cloud-native infrastructure, preferably AWS
- Experience using CI/CD pipelines (GitLab)
- Experience implementing code quality standards with tools such as PEP 8/Pylint or any other code quality tool
- Experience with Python plugins/operators such as FTP Sensor, Oracle Operator, etc.
- Ability to implement industry standards and best practices
- Excellent analytical and problem-solving skills
- Excellent verbal and written communication skills

Preferred Qualifications:

Additional Information:
- 5+ years of experience working as an Oracle / Snowflake database developer (Oracle 11g or greater)
- 5+ years of experience working in a data warehousing / big data environment
- SQL ETL/ELT development and performance tuning
- Ability to develop, implement, and maintain standards established by the architecture and development teams
- Robust data analysis and root cause analysis skills
- Self-motivated, independent thinker and collaborative team member