Senior Data Engineer for a mobility fintech project

Poland - Remote

Applications have closed

Netguru

Europe’s finest custom software development company. More than 10 years of experience, over 630 developers and designers specializing in software development, mobile development and product design.

Join Netguru Talent Marketplace, a proven partner for tech-minded freelancers and experts. Thanks to us, you will have access to various project-based opportunities and can collaborate with different companies and industries. As a result, you will not only gain more experience but also develop a variety of skills you didn’t even know you had. Work the way you like, on your terms, with no strings attached.


We are looking for a Senior Data Engineer to join one of our key projects in the area of mobility financing and build a lasting relationship with our client - a startup whose platform is revolutionizing vehicle financing in countries with limited or no access to such services.

Project description:

  • You will join a team of experts building a large product for millions of users that is already live in 6 countries, with the potential to expand to other continents,
  • You will co-create cutting-edge solutions with clients, partners, fintechs, and other technology leaders,
  • You will contribute to the development of several innovation projects.

Requirements

Apply if you:

  • Have 3+ years of experience in Data Engineering or 4+ years in Software Engineering,
  • Are advanced in Python programming (good knowledge of OOP, iterators, generators, lambda functions, data structures, and Python ecosystem tools for code quality),
  • Have experience with DataOps - it is essential for everyday tasks (observability, data quality, and infrastructure monitoring),
  • Have strong skills in GCP (with a focus on data services) and know AWS well,
  • Have practical experience with GCP BigQuery (writing and optimizing advanced SQL queries, partitioning, and understanding the cost model),
  • Have at least 1 year of experience with Apache Airflow.
  • We'd also expect the following from you:

    • Experience with the Apache Beam framework (Python SDK; the Pipeline, PCollection, and PTransform concepts - see the sketch after this list),
    • Knowledge of GCP Dataflow (ready to switch from Airflow to Dataflow),
    • Good understanding of the GCP Pub/Sub service (connecting from Apache Beam) and experience with the Apache Kafka ecosystem,
    • Experience with the Change Data Capture (CDC) concept,
    • Knowledge of building and consuming data APIs,
    • Experience with the Great Expectations Python library will be a plus.
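
For context, here is a minimal sketch of the Pipeline, PCollection, and PTransform concepts mentioned above, using the Beam Python SDK with its default local runner. The event fields and transform names are purely illustrative assumptions, not details of the project:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # A toy PTransform: normalizes raw financing events before loading them downstream.
    class NormalizeEvents(beam.PTransform):
        def expand(self, events):  # 'events' is a PCollection of dicts
            return (
                events
                | "ParseAmount" >> beam.Map(lambda e: {**e, "amount": float(e["amount"])})
                | "DropTestRows" >> beam.Filter(lambda e: not e.get("is_test", False))
            )

    if __name__ == "__main__":
        with beam.Pipeline(options=PipelineOptions()) as pipeline:
            (
                pipeline
                | "CreateSample" >> beam.Create([
                    {"amount": "120.50", "country": "NG"},  # illustrative sample events
                    {"amount": "99.99", "country": "KE", "is_test": True},
                ])
                | "Normalize" >> NormalizeEvents()
                | "Print" >> beam.Map(print)
            )

The same pipeline shape could be submitted to GCP Dataflow by changing the runner in the pipeline options, which is where the Dataflow expectation above comes in.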


Joining Netguru as a Senior Data Engineer on this project means:

  • Developing a clear understanding of the data use cases on the client's side and of the critical user journeys in order to design data systems,
  • Defining the database structure and how the data is stored (e.g., creating Entity Relationship Diagrams (ERDs)),
  • Selecting database technology for both relational and non-relational use cases (in particular, time-series databases),
  • Defining APIs and services for accessing data,
  • Defining data validation and constraints to meet business needs,
  • Integrating the data systems into an event-based architecture (especially with Kafka) - a small sketch follows this list,
  • Working with an experienced team in a 100% remote work culture, with processes based on the Scrum and Agile methodologies.
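
To make the event-based integration point above concrete, here is a small sketch of consuming and validating financing events from a Kafka topic with the kafka-python client. The topic name, broker address, and event fields are illustrative assumptions, not details of the client's platform:

    import json

    from kafka import KafkaConsumer  # kafka-python client

    REQUIRED_FIELDS = {"vehicle_id", "amount", "currency"}

    def is_valid(event: dict) -> bool:
        # A simple constraint check before the event reaches downstream storage.
        return REQUIRED_FIELDS.issubset(event) and float(event["amount"]) > 0

    consumer = KafkaConsumer(
        "vehicle-financing-events",          # hypothetical topic name
        bootstrap_servers="localhost:9092",  # hypothetical broker address
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
        auto_offset_reset="earliest",
    )

    for message in consumer:
        event = message.value
        if is_valid(event):
            print("accept", event)  # in practice: load to BigQuery or hand off to a pipeline
        else:
            print("reject", event)  # in practice: route to a dead-letter topic

In a real setup, accepted events would typically feed a Beam/Dataflow or Airflow-orchestrated pipeline, with a data-quality layer (e.g., Great Expectations) enforcing richer constraints than this single check.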

Benefits

In return, we offer:

  • work with an experienced team and continuous development of your hard and soft skills;
  • dev-friendly processes such as Continuous Integration, Continuous Delivery, Code Review and bug bashes;
  • long-term collaboration on challenging products (FinTech, B2B software, E-commerce and more).


Looking for a full-time job? Check out our Career Page and find out more about our open recruitment processes.

If you need any disability-related adaptation at any step of the recruitment process – simply let the recruiter know! We'd be happy to help.

