Data Engineering Intern - Remote U.S.

Posted 3 days ago · United States · Salary undisclosed

Job Description

Graduating in 2022 with your Master's in Data Analytics or Data Engineering and looking for an internship opportunity before you graduate? Join our team in January 2022 for six months as a Data Engineering Intern on our innovative digital team. This is a remote, paid, 40-hour-per-week internship.

When you join TTEC as an intern, you'll experience a highly knowledgeable and collaborative team, and cutting-edge work. You'll do real work that matters for some of the most iconic brands in the world and make a tangible impact with your team. This isn't a "grab me coffee and make some copies" kind of internship!

**About TTEC**

We help global brands provide a great experience to their customers, build customer loyalty, and grow their business. We were founded on one guiding principle: customer experiences that are simple, inspired, and more human deliver lasting value for everyone. Your role brings that principle to life.

**What you'll be doing:**

Looking to use your experience with programming languages to manipulate large amounts of data? Ready to solve data analysis problems? If you're a highly motivated self-starter, you'll like this internship opportunity. On a typical day you will:

- Aid in the management and manipulation of multiple large data sets, including defining populations and variables, performing calculations and summarizations, and creating solutions to address client business questions
- Help ingest data from outside sources into TTEC solutions
- Convert existing SAS code into Python for use in a new system
- Assist senior team members with portions of large analytic deliverables
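To give a feel for the SAS-to-Python conversion work mentioned above, here is a minimal, hypothetical sketch: a simple SAS DATA step (invented for illustration; the dataset and column names `orders`, `amount`, and `quantity` are assumptions, not from the posting) translated into equivalent pandas code.

```python
import pandas as pd

# Hypothetical SAS DATA step being translated:
#   data work.high_value;
#     set work.orders;
#     where amount > 100;
#     total = amount * quantity;
#   run;

def high_value_orders(orders: pd.DataFrame) -> pd.DataFrame:
    """pandas equivalent of the SAS DATA step sketched above."""
    # WHERE clause -> boolean-mask filter
    result = orders[orders["amount"] > 100].copy()
    # Computed variable -> new column from existing columns
    result["total"] = result["amount"] * result["quantity"]
    return result
```

Real conversions are rarely this mechanical (SAS macros, BY-group processing, and missing-value semantics all need care), but the general pattern of mapping DATA-step filters and computed variables onto DataFrame operations holds.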

**What you'll bring:**

- Undergraduate degree in a STEM field; currently working towards your Master's degree
- Experience working with large-scale relational databases, and the ability to manipulate/analyze/maintain relational databases
- Proficiency in Python, Apache Spark or PySpark, bash scripting, and SQL is a must
- Ability to work independently
- Strong attention to detail
- Strong verbal and written communication skills
- Comfortable working in a team-oriented, deadline-driven environment