Data Engineer

📅  Posted 11 days ago 📍 United Kingdom

Job Description

About Us:

We're a vibrant and ambitious team solving big problems in real estate data and analytics. Our technology department is made up of cross-functional teams, and we pride ourselves on bringing enthusiasm, creativity and intelligence to the big challenges within the real estate industry. We are looking for someone to help build on our success by scaling and transforming our datasets for our upcoming new product lines, whilst ensuring the current product remains fully stable. We offer a challenging yet supportive environment where you can clearly see your contribution to the company's success.

The Job:

We're looking for an ambitious and self-motivated Data Engineer to join our team. You will work with our expanding data engineering team, which is responsible for the millions of datasets ingested from a variety of sources. The role covers the development and architecture of our data pipelines, automation, monitoring and alerting, and the adoption of new technologies as we build the real estate industry's first real-time intelligent data analytics platform. You will work closely with our data science and platform technology teams to ensure our product meets market demands through our state-of-the-art data visualisations.

About You:

  • You care about getting the best possible outcome and are passionate about what you do.
  • You have an eye for detail and order, and can spot problems in code or data that others might miss or take longer to find.
  • You have a strong sense of responsibility, and the ability to break down, estimate and manage workflows with stakeholders and team members.
  • You have a keen interest in data engineering and automation.

Basic salary will be based simply on what you bring to the business. We are a fair employer that believes strongly in a close relationship between performance and reward. However, an indication of basic salary is always useful, so our expectations are in the region of £40–60k, depending on experience.

Our generous benefits package includes:

- Personal performance bonus
- Employer pension contribution
- 25 days annual leave plus generous parental leave
- Regular social events, meetups and game nights
- International and diverse team
- Employee equity scheme
- Educational resources and training available
- Fun & motivated team

Technical Requirements:

The ideal candidate is an experienced data engineer who enjoys working in a fast-paced and interesting environment, at the cutting edge of what can be done with our datasets. You will be self-directed and comfortable supporting the data pipelines required for our platforms. The right candidate will be excited by the prospect of supporting our product team and clients alike.

What we’re looking for:

  • Bachelor's degree or higher in an applicable field such as Computer Science, Statistics, Maths or a similar Science or Engineering discipline
  • Strong knowledge of SQL databases (PostgreSQL)
  • Strong knowledge of AWS tools such as Glue, S3, CloudFormation, RDS, EC2
  • Good coding skills in Python
  • Experience with Spark (Scala) is preferred
  • Experience with Go is preferred
  • Experience with version control tools such as Git
  • Experience working with geospatial data
  • Deployment and release cycle experience using GitHub
  • Ability to ensure software development aligns with our best practices, overall architecture and acceptance criteria.

Nice-to-haves (or excited to learn):

  • Extensive experience of AWS
  • Experience of big data tools such as Apache Spark and/or Hadoop (Python or Scala flavours)
  • Experience of Geospatial tools such as PostGIS/GeoSpark
  • Experience of Data Modelling (Kimball Methodology).