Data Engineer

London, England, United Kingdom

LiveScore Group

Hybrid working: 2 days in the office per week (Tuesday to Thursday)

The Role

We are looking for an experienced Data Engineer to take on the day-to-day responsibilities of designing, building, and maintaining scalable and efficient data pipelines and architectures within the Google Cloud Platform (GCP) ecosystem.

Candidates for this position typically need a strong background in computer science, data engineering, or a related field, along with experience in dimensional data modelling, data warehousing solutions, ETL tools, SQL, and programming languages such as Python or Java. Familiarity with GCP-specific tools and services is crucial, along with a solid understanding of cloud architecture principles and data security protocols.

LiveScore Group is home to LiveScore, LiveScore Bet and Virgin Bet, three of the most exciting brands in the sports and gaming industries today. We are passionate and excited about what we do in the Sports Media, Gaming and Betting sector and are proud of the high ratings we have achieved. We live our values, aiming to enhance and increase the enjoyment of sport through innovative products and immersive experiences. Trust is important in how we work, ensuring we are a collaborative unit that holds each other accountable. Relentlessly ambitious in everything we do, we embrace change and understand that failure is part of success.

We use our platform for positive social change and work to build a company our employees are proud to work for. We have a long-term commitment to diversity and inclusion. Our I.D.E.A (Inclusion, Diversity, Equality, Awareness) Forum is critical to our business and is embedded in our organisational structure to support a sense of belonging for everyone and to create a culture that allows all employees to thrive.

We know a job description can often deter an applicant who does not believe they match 100% of our requirements. If you are interested in this position and have relevant experience, then we would still love to hear from you.

Key Responsibilities 

  • Develop and deploy robust data pipelines using GCP services like BigQuery, Dataflow, Pub/Sub, and Cloud Storage. This includes integrating data from various sources, ensuring data quality, and enabling real-time processing capabilities.
  • Design data models and manage databases effectively to support business analytics and intelligence using dimensional data modelling principles. Utilise GCP’s database technologies such as Bigtable, Spanner, or Firestore to optimise performance and scalability.
  • Automate data workflows and optimise data storage and processing costs within GCP. Use tools like Cloud Composer for orchestration and employ best practices in performance tuning.
  • Ensure data security by implementing GCP security practices. Manage access controls, comply with data governance policies, and safeguard sensitive information.
  • Work closely with other data engineers, business analysts, and IT teams to define data requirements and deliver comprehensive data solutions. Provide technical guidance and support to team members.
  • Stay updated with the latest GCP features and data engineering technologies. Explore innovative ways to enhance data infrastructure and contribute to the development of new data products.
  • Provide second-line support for critical pipeline failures or data quality incidents and perform fault analysis to quickly resolve issues.

Skills, Knowledge and Experience

  • Experience in data engineering / warehousing with large-scale, complex datasets
  • Knowledge of dimensional data modelling principles
  • Knowledge of ETL and data warehousing concepts
  • Experience writing complex SQL queries
  • Experience with Python or a similar programming language
  • Logical mindset – problem-solving through methodical techniques
  • A structured, self-driven and logical thinker and communicator who brings clarity and organisation to complex problem spaces
  • Passion for innovation, problem-solving, and driving positive change in the gambling industry
  • Willingness and ability to participate in a rotating on-call schedule for after-hours support

What can we offer? 

  • Company performance bonus
  • Hybrid working for all staff with flexible working opportunities
  • Private Healthcare scheme + Employee Enhanced Assistance
  • Enhanced Family Leave - Maternity, Shared Parental & Adoption Leave: up to 6 months at full pay and 6 months at half pay. Paternity leave: up to 4 weeks at full pay
  • Subsidised gym membership
  • Ride to Work & Annual Travel Card loan scheme
  • Life Assurance (x3 salary)
  • Contributory Pension Plan
  • Thursday drinks in the office and regular socials