Senior Data Engineer

London, England, United Kingdom

Applications have closed

Nutmeg

Nutmeg is an online investment management service. Invest money using our General Investment Account, ISA, Pension, Lifetime ISA or Junior ISA.


Who we are:

Nutmeg is Europe’s leading Digital Wealth Manager, but we don’t want to stop there. We’re continuing to build our platform to help us achieve our mission of being the most trusted Digital Wealth Manager in the world.

Since being founded in 2011, we've:

  • Grown to 160+ employees
  • Raised over £100M in funding
  • Launched 4 amazing products including JISA and Lifetime ISA
  • Won multiple awards including Best Online Stocks & Shares ISA Provider for the fifth year in a row!

We hit the 150,000 investor milestone in 2021 and now manage over £4 billion AUM.

*We offer flexible working*

Job in a nutshell:

We run a pure AWS-based cloud environment and deliver features using a continuous delivery approach. Our Data platform comprises a mix of services and open-source products running entirely in Kubernetes and utilising AWS-native Data solutions.

Nutmeg's Data solution is a mix of batch and streaming processes leveraging Airflow, Apache Kafka and AWS Data tools. Its key characteristic is enabling a self-service experience for all Data stakeholders.

Nutmeg products are served by a polyglot mix of microservices designed following Domain-Driven Design principles and forming an Event-Driven Architecture powered by Apache Kafka.

As a Senior Data Engineer, you will closely collaborate with technical and non-technical teams to deliver Data solutions supporting Nutmeg’s Data strategy.

We are looking for someone with previous experience as a senior engineer and a strong passion for Data challenges.

Requirements

Your skills:

  • Following Data engineering industry best practices
  • Full ownership of end-to-end Data pipelines
  • Designing, implementing, and maintaining Data models
  • Writing automated tests around Data models
  • Understanding of CI/CD principles
  • Experience with cloud platforms for Data (ideally AWS)
  • Experience in converting business requirements into technical deliverables
  • Previous experience with two or more of the following: Airflow, dbt, Kafka Connect, Looker, Python, and Redshift

You might also have:

  • DataOps best practices
  • Experience in collaborating with BI and Data Science teams
  • Use of agile/lean methodologies for continuous delivery and improvement
  • Knowledge of monitoring, metrics or Site Reliability Engineering
  • Understanding of Data governance and security standards
