
As a Sr Specialist Solutions Engineer (Sr SSE) - Data Warehouse, you will guide customers in building big data solutions on Databricks that span a wide variety of use cases. This customer-facing role, in which you will work with and support Solution Architects, requires hands-on production experience with Apache Spark™ and expertise in other data technologies. You will help customers through the design and successful implementation of essential workloads while aligning on a technical roadmap for expanding their usage of the Databricks Intelligence Platform. As a go-to expert reporting to the Specialist Field Engineering Manager, you will continue to strengthen your technical skills through mentorship, learning, and internal training programs, and you will establish yourself in an area of specialty - whether that be streaming, performance tuning, industry expertise, or more. #LI-Onsite

The impact you will have:

  • Provide technical leadership to guide strategic customers to successful cloud transformations on large-scale data warehousing workloads - ranging from evaluation to architecture design to production deployment
  • Prove the value of the Databricks Intelligence Platform for customer workloads by architecting production workloads, including end-to-end pipeline load performance testing and optimization
  • Become a technical expert in an area such as data warehousing evaluations or setting up successful workload migrations
  • Assist Solution Architects with more advanced aspects of the technical sale including custom proof of concept content, estimating workload sizing and performance, and tuning workloads for production
  • Provide tutorials and training to improve community adoption (including hackathons and conference presentations)
  • Contribute to the Databricks Community

What we look for:

  • Pre-sales or post-sales experience working with external clients across a variety of industry markets
  • Ability to design performant, scalable, secure, and cost-effective cloud-based data solutions, and to articulate architectures and design choices to customers' senior stakeholders
  • Experience with the design and implementation of a broad range of data technologies such as Hadoop, Apache Spark™, NoSQL, OLTP, OLAP, and ETL/ELT
  • Hands-on experience working with MPP data warehouse appliances (Oracle Exadata, Teradata, IBM Netezza) or cloud data warehouses (Amazon Redshift, Azure Synapse, Snowflake)
  • Ability to advise customers on data warehousing architecture, including anticipating blockers and addressing them before they become issues
  • Familiarity with common data modeling methodologies such as dimensional modeling, Data Vault, Inmon
  • Experience with SQL or any SQL dialect (PL/SQL, Transact-SQL, or others)
  • Experience with BI tools such as Power BI, Tableau, Qlik, or others
  • Knowledge of development tools and best practices for data engineers including CI/CD, unit and integration testing, plus automation and orchestration
  • Production programming experience in one of the following languages - Python, Scala, or R
  • Bachelor's degree in Computer Science, Information Systems, Engineering, or equivalent work experience
  • This role can be remote and may require up to 30% travel when needed

Benefits

  • Private medical insurance
  • Life, accident & disability insurance
  • Pension plan
  • Vision reimbursement
  • Equity awards
  • Enhanced parental leave
  • Fitness reimbursement
  • Annual career development fund
  • Home office & work headphones reimbursement
  • Business travel accident insurance
  • Mental wellness resources
  • Employee referral bonus
