Data Engineer

London, England, United Kingdom - Remote


Kody

Your one-stop-shop for in-person payments. From terminals and order & pay to instant access to takings, we’ve got you covered.


Kody, a fast-growing fintech specialising in bringing online payments and better financial services to brick-and-mortar businesses, is seeking a talented Data Engineer to join its dynamic team.

In this role, you will be responsible for designing, building, and maintaining the data architecture of Kody's platform. You will work closely with cross-functional teams to design and implement data solutions that enable data-driven decision-making and support business operations. This role requires a deep understanding of data integration and data modelling, and experience working with data pipelines and large volumes of data.

Kody's data stack for storage and processing currently consists of several tools and technologies, including Postgres, InfluxDB, Kafka, S3, BigQuery, Fivetran, Datadog, and Looker.

The ideal candidate will have strong problem-solving skills, be a team player, and be proactive in their approach to achieving company goals.

Responsibilities

  • Design, develop, and maintain data architectures and infrastructure
  • Collaborate with cross-functional teams to understand data needs and develop solutions
  • Implement ETL processes to extract, transform, and load data from various sources
  • Create and optimize SQL queries for data analysis and reporting
  • Implement and maintain data pipelines and workflows
  • Perform data validation and quality assurance tasks
  • Monitor and optimize database performance
  • Stay up to date with emerging trends and technologies in the data engineering field

Requirements

  • At least 3 years of experience in data engineering or a related field
  • Strong SQL skills and experience with relational databases (PostgreSQL preferred)
  • Experience with ETL tools and processes
  • Working knowledge of data modelling and database design principles
  • Experience with scripting languages (Python, Bash)
  • Experience with cloud-based data platforms (AWS and GCP; we use both)
  • Proficient in data warehousing concepts and methodologies
  • Experience with data visualisation tools (Tableau, Power BI, Looker; we use Looker, so it would be fantastic if you do too)
  • Experience with version control systems (Git)
  • Strong analytical and problem-solving skills
  • Excellent communication and collaboration skills

Benefits

  • We're a fast-growing, exciting company within a rapidly developing industry. You'll be surrounded by super-ambitious individuals who think really big.
  • Remote only, but we meet a few times a year to get together as a team.
  • Annual learning and development allowance for training related to your role
  • Allowance to spend on your home office set-up (excluding your laptop, monitor, etc., which are already provided as part of the job; this allowance is for making your workspace comfy)
  • Every year you'll get 25 days of holiday leave plus an extra day of leave for life events like your birthday
  • Enhanced parental leave

