Data Architect - Snowflake

Posted: 12 days ago | Location: United States | Salary: undisclosed

Job Description

Hi,

Please find the new requirement for Architect – Snowflake.

Location: Plano, TX
Duration: Long Term

Job Summary

As part of the Data Engineering team, you will architect and deliver highly scalable, high-performance data integration and transformation platforms. The solutions you work on will span cloud, hybrid, and legacy environments, requiring a broad and deep stack of data engineering skills. You must have leadership skills and be able to guide and mentor the team. You will use core cloud data warehouse tools, Hadoop, Spark, event streaming platforms, and other data management technologies. You will also engage in requirements and solution concept development, which calls for strong analytical and communication skills.

Skills and Qualifications

- Bachelor's Degree or foreign equivalent in Computer Science, Electrical Engineering, Mathematics, Computer Applications, Information Systems, or Engineering is required
- Experience building high-performance, scalable distributed systems
- Strong experience with the Snowflake database
- AWS cloud experience (EC2, S3, Lambda, EMR, RDS, Redshift)
- Experience in ETL and ELT workflow management
- Familiarity with AWS data and analytics technologies such as Glue, Athena, Spectrum, and Data Pipeline
- Experience building cloud-to-cloud integrations is ideal
- Experience with streaming technologies (e.g., Spark Streaming) or message brokers such as Kafka is a plus
- Data management experience
- Batch ETL tool experience (DataStage, Informatica, Talend)
- Experience developing, deploying, and supporting scalable, high-performance data pipelines (leveraging distributed data movement technologies and approaches, including but not limited to ETL and streaming ingestion and processing)
- Experience with the Hadoop ecosystem (HDFS/S3, Hive, Spark)
- Software engineering experience leveraging Java, Python, Scala, etc.
- Advanced distributed schema and SQL development skills, including partitioning for performance of ingestion and consumption patterns
- Experience with distributed NoSQL databases (Apache Cassandra, graph databases, document store databases)
- Experience in the banking industry

Sincerely,
HR Manager
nFolks Data Solutions LLC
Phone: 425-999-4933
Email: arun(AT)nfolksdata.com