Data Engineer

Iași, Iași County, Romania - Remote

Applications have closed

You will:

Join our agile, distributed team of software engineers responsible for setting the data engineering strategy and developing guidelines across our streaming and data warehouse technologies.
The vision: making zero-carbon happen!

Your key responsibilities will include:

  • Engaging with the data engineering community and the engineers working in product teams, providing advice, guidance, and tool recommendations
  • Building and maintaining integration pipelines that power our automated journeys
  • Transforming streaming data to meet target schemas (a brief sketch of what this can look like follows this list)
  • Being part of an agile engineering team where you will have the opportunity to influence technology selection
  • Establishing good data engineering practices including using infrastructure as code; contributing to automated testing strategies; setting up monitoring and alerting tools; employing CI/CD best practices to deploy regularly to production
  • Working with key stakeholders to understand their data needs and helping deliver solutions that give teams the high-quality data they need to realise their objectives.
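
To give a concrete flavour of the schema-shaping work mentioned above, here is a minimal sketch using Spark Structured Streaming with Kafka. The broker address, topic, field names, and Parquet sink are illustrative assumptions, not details taken from this posting.

# A minimal sketch, assuming PySpark Structured Streaming and the
# spark-sql-kafka connector package; all names below are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("meter-readings-etl").getOrCreate()

# Hypothetical schema of the raw JSON events arriving on a Kafka topic.
source_schema = StructType([
    StructField("meter_id", StringType()),
    StructField("reading_kwh", StringType()),   # arrives as text, needs casting
    StructField("recorded_at", StringType()),
])

raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder address
    .option("subscribe", "meter-readings")             # placeholder topic
    .load()
)

# Parse the Kafka value and reshape it to the target (warehouse) schema:
# typed columns, a proper timestamp, and a derived partition date.
target = (
    raw.select(F.from_json(F.col("value").cast("string"), source_schema).alias("e"))
    .select(
        F.col("e.meter_id").alias("meter_id"),
        F.col("e.reading_kwh").cast(DoubleType()).alias("reading_kwh"),
        F.to_timestamp("e.recorded_at").alias("recorded_at"),
    )
    .withColumn("reading_date", F.to_date("recorded_at"))
)

query = (
    target.writeStream
    .format("parquet")                                  # placeholder sink
    .option("path", "/tmp/warehouse/meter_readings")
    .option("checkpointLocation", "/tmp/checkpoints/meter_readings")
    .start()
)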

Requirements

Who we are looking for:

Our future colleague should be a passionate data engineer and a trustworthy professional, with a sense of ownership, a commitment to delivery, a desire to make an impact, and a collaborative mindset. We welcome teammates who enjoy experimenting with data engineering technologies.

The following key skills and experience are required:

  • 4+ years of experience in designing, building, monitoring and managing large-scale data products, pipelines, tooling and platforms
  • A proven track record as a Data Engineer, setting strategy and defining ways of working
  • Experience working on streaming ETL solutions utilising streaming data processing tools (e.g. Kafka Streams, Kinesis, Spark or similar)
  • Experience developing cloud-based solutions on GCP (preferably), AWS or Azure using Infrastructure as code tools such as Terraform
  • Excellent knowledge of at least one programming language, e.g. Scala, Python, TypeScript, Java, Kotlin
  • Experience designing and building data pipelines using DataFusion/Dataform or similar
  • An understanding that building quality software is essential, and a commitment to automation and continuous delivery (a small testing sketch follows this list)
  • A love for building scalable, resilient solutions, and an appetite for influencing the team’s technology selection and architectural direction
  • Comfort working in an agile software development environment, with experience of CI/CD and deployment strategies
  • Adaptability and flexibility when dealing with change and ambiguity
  • Ability to interpret and communicate information clearly and concisely to both technical and non-technical team members
  • An open and friendly personality, excellent interpersonal and teamwork skills, a problem-solving mindset, and a drive for self-improvement
  • Excellent verbal and written communication skills in English
  • A bachelor’s or higher degree in computer science (or equivalent).
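
The automation and continuous-delivery points above are easiest to picture with a test. Below is a small, hypothetical example of the kind of unit test we would expect around a pipeline transform; the transform, field names, and values are illustrative only and not taken from this posting.

# A minimal sketch of an automated test around a pipeline transform,
# runnable with pytest; the schema and values are illustrative assumptions.
from datetime import datetime, timezone

def normalise_reading(event: dict) -> dict:
    """Map a raw event onto the target schema: typed value, UTC timestamp."""
    return {
        "meter_id": str(event["meter_id"]),
        "reading_kwh": float(event["reading_kwh"]),
        "recorded_at": datetime.fromisoformat(event["recorded_at"]).astimezone(timezone.utc),
    }

def test_normalise_reading_casts_types():
    event = {"meter_id": 42, "reading_kwh": "3.5", "recorded_at": "2024-05-01T10:00:00+02:00"}
    out = normalise_reading(event)
    assert out["meter_id"] == "42"
    assert out["reading_kwh"] == 3.5
    assert out["recorded_at"] == datetime(2024, 5, 1, 8, 0, tzinfo=timezone.utc)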

Benefits

  • Challenging projects in a highly professional yet collaborative and supportive environment
  • Working in small, highly skilled teams
  • Opportunity for long-term professional growth within our development center
  • Competitive compensation depending on experience and skills
  • Respect and support for your professional, family and personal goals.

Tags: Agile AWS Azure CI/CD Computer Science Data pipelines Data warehouse Engineering ETL GCP Java Kafka Kinesis Pipelines Python Scala Spark Streaming Terraform Testing TypeScript

Perks/benefits: Career development Competitive pay

Regions: Remote/Anywhere Europe
Country: Romania
Category: Engineering Jobs