Data Engineer (Allegro Pay)

Warszawa, Poland

Applications have closed

Allegro

Allegro - The best prices and a guarantee of safe shopping!

Job Description

The salary range for this position (mid level) is 12 300 - 17 600 PLN gross

Allegro Pay is Central Europe’s largest and fastest-growing fintech – the only place where financial solutions are created on such a large scale, using state-of-the-art technology. We work on purchase financing and payment methods used daily by customers of Allegro, the most popular shopping platform in Poland and one of the largest e-commerce companies in Europe. This scale gives us the opportunity to work on some of the most exciting purchase financing and payment solutions in Poland. We are looking for talented people who want to build powerful, stable solutions supporting a product with more than ten million active users.

As a Data Engineer, you will contribute to the Allegro Pay Data Platform, working on strategic data-driven processes and the infrastructure that keeps them running. In your daily work, you will also come into contact with other streams in the data area, such as Data Mesh, data warehousing and complex analytical models. More precisely, you will:

  • design, implement and productionize processes that operate on large data sets
  • work on processes from various areas, ranging from pure business processes to advanced Data Governance
  • develop in-house tools responsible for data quality and process monitoring
  • work with various formats, e.g. JSON, Avro, Parquet, tables in relational databases, geographic data, graph databases (a short illustrative sketch follows this list)
  • optimize the cost of running the solution on Azure and GCP
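
For illustration only (not part of the original posting): a minimal sketch of the kind of format conversion such a role involves, turning newline-delimited JSON into Parquet. It assumes the pyarrow library is available; the file names and the helper function are hypothetical.

    # Illustrative only: convert a batch of newline-delimited JSON records to Parquet.
    # Assumes the pyarrow library; file names below are made up for the example.
    import json
    import pyarrow as pa
    import pyarrow.parquet as pq

    def json_lines_to_parquet(src_path: str, dst_path: str) -> None:
        # Read newline-delimited JSON records into a list of dicts.
        with open(src_path, encoding="utf-8") as src:
            records = [json.loads(line) for line in src if line.strip()]
        # Build an Arrow table (schema inferred from the records) and write it as Parquet.
        table = pa.Table.from_pylist(records)
        pq.write_table(table, dst_path)

    if __name__ == "__main__":
        json_lines_to_parquet("events.jsonl", "events.parquet")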

Why is it worth working with us?

  • You will work with an experienced team that carries out complex and demanding projects involving real-time processing of data produced by back-end and front-end systems. We approach building data processes as an engineering discipline, drawing on methodologies developed for classic software
  • You will work on a project in the finance domain, where scale, the sophistication of the algorithms, business impact and technical requirements are the key challenges
  • You will have a direct influence on data processes that shape, in real time, how millions of users use Allegro
  • Our employees regularly attend conferences in Poland and abroad (Europe & US), and each team has its own budget for training and study aids. If you want to keep growing and share your knowledge, we will always support you

We offer:

  • Support from experienced data engineers as you develop
  • Interesting and ambitious challenges that develop both technical and business competences
  • A hybrid work model that you will agree on with your leader and the team. We have well-located offices (with fully equipped kitchens and bicycle parking facilities) and excellent working tools (height-adjustable desks, interactive conference rooms)
  • An informal working atmosphere in a professional team
  • A wide selection of fringe benefits in a cafeteria plan – you choose what you like (e.g. medical, sports or lunch packages, insurance, purchase vouchers)
  • English classes, paid for by us, tailored to the specific nature of your job
  • Hackathons, team trips, a training budget and an internal educational platform, MindUp (including training courses on work organization, communication, motivation and various technologies and subject-matter issues)

Our offer is for people who:

  • Graduated in computer science, mathematics or another technical field
  • Have 1+ years of experience in building data-driven solutions
  • Have practical knowledge of building efficient data-processing applications
  • Write advanced SQL queries that use the full functionality of the language (window functions, the HAVING clause, EXISTS, OUTER APPLY) – see the illustrative sketch after this list
  • Program in Python and are comfortable using basic development tools
  • Have experience working with large data sets and understand database algorithms and data structures (e.g. know the difference between a merge join and a hash join)
  • Can make independent decisions in the areas entrusted to them and take responsibility for the code they create
  • Are not afraid of new technologies and want to expand their range of skills
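
For illustration only (not part of the original posting): a small, self-contained sketch of the SQL features named above – a window function, HAVING and EXISTS – run against an in-memory SQLite database via Python's built-in sqlite3 module (window functions require SQLite 3.25+). OUTER APPLY is left out because it is specific to engines such as SQL Server; the table and column names are invented for the example.

    # Illustrative only: window function, HAVING and EXISTS exercised
    # against an in-memory SQLite database. Requires SQLite 3.25+ for
    # window functions; table and column names are made up.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE payments (id INTEGER PRIMARY KEY, user_id INTEGER, amount REAL);
        CREATE TABLE blocked_users (user_id INTEGER PRIMARY KEY);
        INSERT INTO payments (user_id, amount) VALUES (1, 120.0), (1, 80.0), (2, 300.0), (3, 45.0);
        INSERT INTO blocked_users (user_id) VALUES (3);
    """)

    query = """
        SELECT user_id,
               SUM(amount)                             AS total_spent,
               RANK() OVER (ORDER BY SUM(amount) DESC) AS spending_rank   -- window function
        FROM payments p
        WHERE NOT EXISTS (                             -- EXISTS as an anti-join
            SELECT 1 FROM blocked_users b WHERE b.user_id = p.user_id
        )
        GROUP BY user_id
        HAVING SUM(amount) > 100                       -- HAVING filters aggregated groups
        ORDER BY spending_rank
    """

    for row in conn.execute(query):
        print(row)   # e.g. (2, 300.0, 1) then (1, 200.0, 2)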

Additional assets (nice to have):

  • Practical knowledge of Big Data technologies (e.g. Apache Spark, BigQuery, Snowflake)
  • Experience working with large data sets (tables > 1TB in size)
  • Experience working with a statically typed language (e.g. Java, C#)
  • Experience working with CI/CD tools

Apply and see why it is #dobrzetubyć (#goodtobehere)

* Salary range is an estimate based on our AI, ML, Data Science Salary Index 💰

Tags: Avro Azure Big Data BigQuery CI/CD Computer Science Data governance Data quality Data Warehousing E-commerce Engineering Finance FinTech GCP Java JSON Mathematics Parquet Python RDBMS Snowflake Spark SQL

Perks/benefits: Career development Conferences Lunch / meals

Region: Europe
Country: Poland
Category: Engineering Jobs
