Do you think that insurance is complicated, rigid and old-fashioned? We do too. That's why we, at Gocleer, want to make insurance simple, fair and transparent.
We believe that mobility and mobility habits are changing, and we want to offer the best experience to our customers while they choose the products that best fit their lifestyle. We are committed to providing the best experience not only to our customers but also to our employees and our community: we support local and international social and environmental projects and are in the process of becoming a certified B Corp. The Gocleer movement is growing very fast, and we are looking for motivated and passionate individuals who share our values and want to contribute to changing the insurance world.
We are a data-centered company, and data is at the core of product development. In this role, you'll work with key business and product teams, analysts and data leaders to understand the business domain and how data can empower them. Your day-to-day will involve engaging with fellow engineers to develop a robust and scalable platform that makes the process of producing data and deriving insights efficient. You are passionate about the quality of the data you produce and take pride in having your data drive our business.
Passionate and motivated, you are convinced that things can change
You love the power of data
You think big and develop nimble dashboards to support the analytical needs of the business
You have a collaborative mindset with many internal stakeholders: marketing, actuarial, pricing, ops and business directors
You are half creative, half analytical and communicative
Your main tasks:
Be an ambassador of our values and beliefs
Design, implement and maintain pipelines that produce business-critical data reliably and efficiently using cloud technologies
Build a data architecture for ingestion, processing, and surfacing of data for large-scale applications
Collect, process, and clean data from different sources using SQL, Python, or other scripting languages
Improve data discovery and literacy: create exploration and visualisation interfaces in BI tools and promote the use of these sources across the company
Your profile:
2 to 3 years of experience working in business intelligence, analytics, data engineering, or a similar role
Experience designing and building scalable and robust data pipelines to enable data-driven business decisions
Experience in collecting requirements and creating data modeling designs
Knowledge of modern data warehouses (Snowflake, Redshift, BigQuery) and big data structures
Experience implementing enterprise dashboarding tools
Strong proficiency and experience with SQL and Python
Experience working with modern BI tools like Google Data Studio, Tableau, Looker, Qlikview
Good understanding of software development and agile methodologies
Passion for analyzing large, complex datasets and converting them into information that drives business decisions
Excellent spoken and written English
Nice to have:
Experience working with AWS EMR, AWS Glue, Databricks
Experience with Docker, Kubernetes
Experience in building data lakes
Orchestration of Machine Learning pipelines
Benefits & perks
We work remotely most of the time, but we do enjoy sharing time together in the office on a regular basis
Competitive salary
Regular team events
Multiple discounts with our partners