Senior Data Engineer - Remote Working (UK)

Remote - London, England, United Kingdom

Full Time, Mid-level / Intermediate


We are Zego - a commercial motor insurance provider that powers opportunities for businesses, from entire fleets of vehicles to self-employed drivers and riders. We combine best-in-class technology with sophisticated data sources to offer insurance products that save businesses time and money.

Since our inception, we have believed that the problem with traditional insurance is that it holds businesses back. It’s too expensive and time-consuming, and it no longer suits businesses who use vehicles to earn money. Our products represent a solution to this problem for businesses based across the UK, Europe and beyond.

So far, we have raised over $200 million in funding and we were the first UK insurtech to be valued at over $1 billion. We were also the first to be awarded our own insurance licence and recently won Tech Company of the Year 2020.

At Zego, we are proud to say we have a diverse and inclusive team, unified by our shared values and mission. Our people are the most important part of our story and everybody at Zego, no matter their role, has an integral part to play.

Purpose of the Role:

PLEASE NOTE: This is a distributed role, so applicants can be based anywhere in the UK. You may, on occasion, be required to visit the London office, so we can only accept applications from UK residents.

We are looking for a Senior Data Engineer to help us build and improve our data analytics platform.

At Zego, the Data Engineering team is integral to our data platform, working closely with Software Engineers, Data Scientists and Data Analysts, along with other areas of the business. We use a variety of internal and external tooling to maintain our data repositories. We are looking for people who have worked with both the ETL and ELT paradigms, have a background in Python and SQL, apply good software engineering and data infrastructure principles, have spent time working with complex and fast-growing datasets, and communicate well.

Our stack includes, but is not limited to, Airflow, dbt (Data Build Tool), a multitude of AWS services, Stitch and Fivetran. As a Senior Data Engineer you will have the opportunity to champion emerging technologies where they can add value to the business, and to promote better ways of working.

It is an exciting time to join, and you’ll partner with world-class engineers, analysts and product managers to help make Zego the best-loved insurtech in the world.

About the role:

Over the next 12 months you will:

  • Assist in developing and maintaining our ETL and ELT pipelines.
  • Support our data scientists in the development and implementation of our ML pricing models and experiments.
  • Help evolve the architecture of our data ecosystem to support our long term vision by continuously iterating on our data infrastructure.
  • Work with the data team and the rest of engineering to develop and implement a clear data strategy.
  • Collaborate with product managers and across teams to bring new products and features to the market.
  • Own data as a product, building a data platform focusing on data structure, quality, usage and efficiency.
  • Build tailored data replication pipelines as our backend application is decomposed into microservices.

About you:

We are looking for somebody with a strong working knowledge of building data pipelines and the underlying infrastructure. You should have participated in data warehouse design projects and followed best practices during implementation, and you will have worked closely with Data Analysts, Data Scientists and Software Engineers.

Practical knowledge of the following or similar technologies:

  • Python
  • Airflow
  • Data Warehousing (Redshift / Snowflake)
  • SQL (we use dbt for modelling data in the warehouse)
  • Docker

Otherwise, an interest in learning these, with the support of the team, is essential. We're looking for people committed to building, nurturing and iterating on an ever-evolving data ecosystem.

Other beneficial skills include:

  • Familiarity with Kubernetes (EKS)
  • Implementation / Contribution to building a Data Lake or Data Mesh
  • Having worked with a wide variety of AWS services
  • Experience in using infrastructure as code tools (e.g. Terraform)

Job tags: Airflow AWS Data Analytics Data Warehousing Engineering ETL Kubernetes ML Python Redshift SQL
Job region(s): Europe Remote/Anywhere