Data Engineer (GCP, Python)

Poland - Remote


Sonalake is a trusted partner for global technology brands, telcos and scale-up disruptors.


Sonalake is a software partnering company that helps our clients realise their product roadmaps. Product design and engineering are at the heart of our business. Our engineering teams work with clients right across the stack: UX, UI design, frontend, backend, analytics, infrastructure, operations - and everything else that goes into delivering great products.

We thrive on variety and are highly adaptable. Our teams are exposed to domains as varied as telecom billing, ad tech, securities-based lending, travel tech analytics, and many more.

These are exciting times for the data team of one of our clients, as they continue to evolve their data environment on Google Cloud, providing the business and retailers with fast, data-driven management decisions and the ability to discover new insights in a growing market.


Responsibilities

  • You should be committed to delivering quickly, delivering quality, and have a passion for data and the value it can bring.
  • Work with the rest of the data team on GCP to build a dedicated data architecture to support the needs of ESW and its partner retailers.
  • Create and maintain optimal data pipeline architecture.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using Python, SQL and GCP data technologies.
  • Concept and business-case building, and interaction with product teams and business units.
  • Analysis of operational problems and requests for change.
  • Production of technical specifications and estimations of work.
  • Development of solutions and delivery into production.
  • Generation of documentation to support solutions delivered.
  • Peer reviews of work with other team members.
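To give a flavour of the extraction, transformation, and loading work described above, here is a minimal, hypothetical ETL sketch in plain Python. All function names, the record schema, and the in-memory source/sink are illustrative assumptions, not from the posting; a production pipeline would typically use GCP services such as Dataflow (Apache Beam) and BigQuery instead.

```python
# Minimal, hypothetical ETL sketch. The schema (order_id, currency, amount)
# and all names are illustrative only; in-memory lists stand in for real
# sources and sinks (e.g. Pub/Sub and BigQuery).

def extract(source):
    """Pull raw records from a source (here, an in-memory list)."""
    return list(source)

def transform(records):
    """Normalise currency codes, coerce amounts, and drop records
    that are missing an order id."""
    cleaned = []
    for rec in records:
        if not rec.get("order_id"):
            continue  # skip unusable records
        cleaned.append({
            "order_id": rec["order_id"],
            "currency": rec.get("currency", "EUR").upper(),
            "amount": round(float(rec.get("amount", 0)), 2),
        })
    return cleaned

def load(records, sink):
    """Write transformed records to a sink (a list standing in for a table);
    return the number of rows loaded."""
    sink.extend(records)
    return len(records)

raw = [
    {"order_id": "A1", "currency": "eur", "amount": "10.5"},
    {"order_id": None, "currency": "usd", "amount": "3"},
    {"order_id": "A2", "amount": 7},
]
table = []
loaded = load(transform(extract(raw)), table)
```

In a Beam/Dataflow pipeline the same three stages would map onto a read transform, one or more `ParDo`/`Map` steps, and a BigQuery write, with the orchestration handled by a scheduler such as Airflow.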

Desired experience

  • Strong programming and architectural experience, ideally in Python or Java, and SQL
  • Google Cloud Platform (BigQuery, Google Storage, DataFlow, Pub/Sub, Cloud Functions, etc.)
  • Apache Beam, Airflow, Kafka desirable
  • Experience with Microsoft Azure is desirable.
  • Experience building scalable, high-performance code.
  • Experience in producing tested, resilient, and well documented applications.
  • The ability to take ownership of problems end to end and find creative solutions.
  • Experience in architecting, building, maintaining, and troubleshooting cloud infrastructure.
  • Excellent interpersonal skills, verbal, and written communication skills; a team player and keen learner who loves building great things together.
  • Knowledge of eCommerce would also be an advantage.

We take pride in being a people-oriented company. Openness and opportunity are really important to us. We build teams that span from experienced leaders to bright graduates and work to develop all of us within our coaching culture.



