Data Engineer (Google Cloud Platform)

Madrid, Community of Madrid, Spain

Ryanair Labs is the technology brand of Ryanair. Labs is a state-of-the-art digital & IT innovation hub creating Europe's Leading Travel Experience for our customers. The Ryanair platform has over 1 billion visits per year. By joining Ryanair Labs, you will develop cutting-edge tech solutions that transform aviation for Pilots, Cabin Crew & Ground Ops, and drive the tech experience for our customers on Europe's largest travel website!

Ryanair Labs has more than 550 employees across our offices in Dublin, Madrid, Poland, and Portugal. We plan to keep growing our IT Labs team, so we are always on the lookout for the best talent. Apply today for more information.

The Role

We are looking for a Data Engineer to join a multidisciplinary team, working alongside Developers, Designers and Product Owners, who own their work from the initial idea to the final implementation. We continually gather information on travel-related events and are looking for someone to help us make the most of a cloud-based data system. The role offers opportunities to work across a variety of areas such as commercial, marketing, engineering and logistics, as well as anything else that comes your way.

The Data Engineering team runs the majority of its projects on AWS, but there is also the chance to work with other clouds, such as GCP and Azure.

 

Requirements

Your responsibilities will include:

  • Designing, developing and maintaining data processing jobs in the cloud (a minimal sketch follows this list).
  • Making the solutions fault tolerant.
  • Monitoring, performance-tuning and troubleshooting existing jobs.
  • Supporting data scientists and analysts in deploying their solutions.
  • Making data available and reusable within the organization.
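Purely as an illustration of the kind of job described above (not taken from the posting itself), here is a minimal PySpark sketch that reads raw events from object storage, applies a simple data quality filter and writes a partitioned, reusable output; the bucket paths, column names and aggregation are hypothetical:

    from pyspark.sql import SparkSession, functions as F

    spark = (
        SparkSession.builder
        .appName("bookings-daily-aggregation")  # hypothetical job name
        .getOrCreate()
    )

    # Read raw booking events from object storage (path is hypothetical).
    events = spark.read.parquet("s3a://example-bucket/raw/booking_events/")

    # Basic data quality gate: drop records missing a booking id or timestamp.
    clean = events.dropna(subset=["booking_id", "event_ts"])

    # Aggregate bookings per route and day.
    daily = (
        clean
        .withColumn("event_date", F.to_date("event_ts"))
        .groupBy("route", "event_date")
        .agg(F.count("booking_id").alias("bookings"))
    )

    # Write partitioned output so analysts and data scientists can reuse it.
    (
        daily.write
        .mode("overwrite")
        .partitionBy("event_date")
        .parquet("s3a://example-bucket/curated/bookings_daily/")
    )

    spark.stop()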

Experience Required:

The Data Engineer we are looking for should have the following skills:

  • Good knowledge of general computing systems (operating systems, networking, memory layers, processes, etc.).
  • Proficient in data analysis.
  • Good intuition for exploring and interpreting data.
  • A solid grounding in mathematics and statistics, and the ability to apply them to data quality processes.
  • Good knowledge of Data Warehouse concepts.
  • Strong SQL skills covering both DDL and DML, including the ability to optimize queries and understand the processing generated behind them.
  • Good knowledge of Big Data technologies, especially HDFS and Spark (2.x minimum).
  • Strong data processing skills in either Python or Scala, with at least some knowledge of both.
  • Strong coding skills, good code style, and solid testing and documentation habits.
  • Containerization with Docker; experience with Kubernetes is also a plus.
  • Job orchestration with Airflow or similar (a minimal DAG sketch follows this list).
  • Some knowledge of IaC and CI/CD principles.
  • Good data visualization skills through dashboards.
  • Good communication skills.
  • Proactivity and good team spirit.
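As a hedged illustration of the orchestration skills listed above (again, not part of the posting itself), here is a minimal Airflow DAG that chains the Spark aggregation with a follow-up data quality check, using retries for basic fault tolerance; the DAG id, commands and schedule are hypothetical:

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    # Retries provide basic fault tolerance against transient failures.
    default_args = {
        "owner": "data-engineering",  # hypothetical owner
        "retries": 2,
        "retry_delay": timedelta(minutes=10),
    }

    with DAG(
        dag_id="bookings_daily_pipeline",  # hypothetical DAG id
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
        default_args=default_args,
    ) as dag:
        # Submit the Spark aggregation job (command is illustrative).
        aggregate = BashOperator(
            task_id="aggregate_bookings",
            bash_command="spark-submit jobs/bookings_daily_aggregation.py",
        )

        # Validate the output before exposing it to the rest of the organization.
        quality_check = BashOperator(
            task_id="quality_checks",
            bash_command="python checks/validate_bookings_daily.py",
        )

        aggregate >> quality_check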

Benefits

  • A competitive but flexible career plan.
  • A relocation package for people joining us from another country.
  • Travel discounts (of course!).
  • Hybrid work model (3 days remote / 2 days in the office).

Category: Engineering Jobs

Tags: Airflow AWS Azure Big Data Data analysis Data quality Data visualization Data warehouse DDL Docker Engineering GCP Google Cloud HDFS Kubernetes Mathematics Python Scala Spark SQL Statistics Testing

Perks/benefits: Career development Flex hours Relocation support Team events

Region: Europe
Country: Spain
