Senior Data Engineer

United Kingdom - Remote

Oxwash

Discover the world’s first truly sustainable commercial laundry service provider, delivering a high-quality clean with minimal impact on the planet. B-Corp™ Certified.


About Oxwash

💧 Our mission at Oxwash is to permanently eliminate the impact of washing on our planet and on us. We are driving towards net-zero carbon emissions and revolutionising the laundry industry from our locations in Swindon, Oxford, Cambridge and London (Battersea).

The textile and garment washing industries have been largely untouched by modern development and production practices (like just-in-time and agile manufacturing). Likewise, Oxwash is one of the very few companies aiming to build world-class bespoke software in this market. The Platform team at Oxwash care about building valuable software in a sustainable way, to make linen and textile care as efficient as it can be.

We build the software that supports our daily work across operations and HQ functions. The result is an increasing ability to use data to navigate conversations and decisions within the business, especially between our Operations, Platform, and Commercial teams. Our internal tooling helps our operatives (laundry and delivery staff) manage their workload and model our operational capacities, among other things.

If you like the look of this role and identify with our mission, then apply to work with us. You could be a part of the first Certified B Corporation® laundry and wet-cleaning company in the world!

About The Role

Your core responsibilities centre on evolving our current data architecture through well-structured ELT processes that ensure data security, quality, and scalability. Your expertise will be vital in architecting and implementing efficient pipelines for analytics data, ensuring crucial insights are available across Oxwash.

In this role you will lead the expansion of Oxwash’s data integrations, both with internal teams and with external data sources, with the goal of creating a unified data estate. By understanding the key data requirements of the business, you will empower Oxwash to be data-driven in its decision making.

Key Responsibilities:

  • Architect, implement and maintain data pipelines: Design and orchestrate efficient data pipelines, facilitating the data flow that is crucial for decision-making across various business segments.
  • Deploy Data Infrastructure: Ship data pipelines and analytics objects to the data estate while applying access best practices.
  • Empower analytics: Curate an easy-to-use SQL self-service layer for downstream analytics and business self-service users, and support analytics SQL work with advice and reviews (a minimal sketch follows this list).
  • Contribute to Engineering Practices: Work alongside the engineering team to improve the codebase, refine architectural standards, and hone engineering practices in alignment with data engineering goals.
  • Remote Collaboration and Problem-Solving: Engage in remote collaboration with the team, addressing and solving challenges that emerge as operations scale, ensuring data infrastructure is equipped to support growing business needs.
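
As a purely illustrative example of the self-service layer mentioned above, the sketch below publishes a curated BigQuery view using the Python client from the stack listed later in this posting. The project, dataset, and table names are hypothetical assumptions, not a description of Oxwash’s actual data estate.

    # Minimal sketch: expose a curated, self-service view over a raw table.
    # All project, dataset and table names here are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")

    sql = """
    CREATE OR REPLACE VIEW `example-project.analytics.orders_daily` AS
    SELECT
      DATE(created_at) AS order_date,
      site,
      COUNT(*) AS orders
    FROM `example-project.raw.orders`
    GROUP BY order_date, site
    """

    # Run the DDL statement and wait for the job to finish.
    client.query(sql).result()

In practice a tool like Dataform would typically own views like this; the snippet only shows the shape of the layer.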

What you'll need to be successful in the role

  • You care about quality in data engineering, and see it as a craft
  • You are able to clearly vocalise your opinions, and are willing to listen to others
  • You can take a first-principles approach to problem solving and debugging
  • You like to work in agile, flow-based working environments
  • You see testing as essential to the process of building software
  • You have experience designing and building in existing codebases.

Key Relationships:

  1. VP Platform - You will report to the VP of Platform. Together you will define data goals and targets, track progress, and deliver the data infrastructure and products the company needs
  2. Senior Data Analyst - You will work closely with the Senior Data Analyst, preparing the data infrastructure so that they have more efficient and more accessible ways of working with data and presenting it to the rest of the company and its stakeholders
  3. Operations and Commercial stakeholders - You will work collaboratively with the Operations and Commercial teams to gather their data requirements and communicate high-level roadmap plans to them

Requirements

Key Qualities and Competencies:

These are some of the key qualities and competencies required in this role (categorised under our 4 performance pillars):

Technical competency:

  • Knowledge of data structures and pipeline architectures
  • Advanced knowledge and experience of at least one programming language - ideally SQL, SQLX, Terraform, Python
  • A functional knowledge of cloud platforms (ideally GCP) and cloud-native data architecture components, such as queues, big data stores and query engines
  • A good level of fluency in design ideas, especially lean and agile methodologies (we are pragmatic, not dogmatic, about this)
  • You can take a first-principles approach to problem solving and debugging

Technical Stack (We do not expect you to know all of these, but you should be happy to adapt to them):

  • Infrastructure management: Terraform
  • Language: Python 3
  • Analytics tooling: SQL and Dataform
  • Cloud: GCP (key components: BigQuery, Cloud Run, GCS, Pub/Sub, IAM, Workflows)
  • Dashboarding: Looker Studio
  • Deployment tools: Docker, GitHub, GitHub Actions, CircleCI
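
To make this stack a little more concrete, here is a minimal, hypothetical sketch of the extract-and-load half of an ELT step, using the Python BigQuery client against the GCS and BigQuery components named above. The bucket, project, dataset and table names are illustrative assumptions rather than real Oxwash infrastructure.

    # Minimal sketch: load a newline-delimited JSON export from GCS into a
    # raw BigQuery table, ready for downstream SQL/Dataform transformations.
    # Bucket, project, dataset and table names are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
        autodetect=True,  # infer the schema from the incoming file
    )

    load_job = client.load_table_from_uri(
        "gs://example-bucket/exports/orders/2024-01-01.jsonl",
        "example-project.raw.orders",
        job_config=job_config,
    )
    load_job.result()  # block until the load job completes

In a production setup a step like this would more likely be triggered from Cloud Run or Workflows and wired up with Pub/Sub and IAM, as listed in the stack above; the snippet only shows the core load call.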

Leadership and collaboration:

  • Excellent communication skills, working with team members across the technical spectrum, technical and non-technical alike. You are able to clearly vocalise your opinions, and are willing to listen to others
  • You have experience building and delivering data products and infrastructure in a variety of ways and with a diversity of teams, and can use that experience to help us find the right lean/agile principles and practices to establish and maintain productivity as an engineering team.

Sustainability and values:

  • You are expected to be a champion for sustainability initiatives and improvements.  This includes human, economic, and environmental aspects. 
  • You are expected to embody our company-wide values of creativity, excellence and integrity. 

Please note that this is a remote-first role for UK-based employees. You will be required to attend quarterly company offsites at our Big Blue I site in Swindon.

Benefits

💷 A salary of £70,000-85,000 per year

🧩 Generous EMI share options, following successful probationary period

☀️ 31 days’ holiday + the option to buy more and take your birthday off!

🧢 Uniform and equipment provided by Oxwash. You’ll also get an allowance to buy work-related equipment.

💧 Employee discount on personal laundry and wet cleaning

🌻 Free, confidential mental health support + 1 Mental Health Day off

🏥 A funded health cash plan and enhanced sick pay

💚 Cycle to work scheme

Oxwash cares deeply about leading the way as an employer and providing a more inclusive and diverse workplace for all. We are an equal opportunity employer and value diversity at Oxwash. We do not discriminate on the basis of race, religion, colour, national origin, gender, sexual orientation, age, pregnancy and maternity, marital status or disability status. Please let us know if you would benefit from any reasonable adjustments in order to participate fully in our recruitment process.
