Data Engineer

Remote - Poland


Codete


Codete is not just a software company, it’s a place where tech-enthusiasts can grow by doing what they love and feel valued for what they are. We’re experienced, agile and versatile: we work with a wide range of technologies in projects from many different industries, and the majority of our team are senior-level specialists. At Codete, there’s always something new to learn!

We are searching for a Data Engineer who gets excited about building data systems and data architecture and wants to solve data-centric problems unique to the field of social media-driven e-commerce.

Your partner is a Berlin-based sustainable energy startup. The company is developing artificial intelligence-based algo-trading technology for the renewable energy era to help improve the efficiency of the power grid. The team speaks more than 20 languages and consists of highly skilled and passionate experts in software development, energy trading, AI, mechanical engineering and physics. The goal is to change the way we trade energy, and they need your help :)

Your role will be to ensure that the Data Science team always has access to high-quality, concise data for developing its core algo-trading models. Large amounts of multivariate inputs, such as raw weather data, solar/wind production forecasts, and grid balance indices, will be your bread and butter. As a member of our Backend team, you will be part of the foundation on which the company is built.
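To give a flavour of the day-to-day work (a purely illustrative sketch, not code from the project: the weather_raw table, its columns, and the connection string are hypothetical), loading raw weather observations into a TimescaleDB hypertable with Python looks roughly like this:

    import psycopg2  # assumes the psycopg2 driver is installed

    # Hypothetical connection string and schema, for illustration only.
    DSN = "postgresql://user:password@localhost:5432/trading"

    DDL = """
    CREATE TABLE IF NOT EXISTS weather_raw (
        ts         TIMESTAMPTZ NOT NULL,
        site_id    TEXT        NOT NULL,
        wind_speed DOUBLE PRECISION,
        irradiance DOUBLE PRECISION
    );
    SELECT create_hypertable('weather_raw', 'ts', if_not_exists => TRUE);
    """

    def load_observations(rows):
        """Insert (ts, site_id, wind_speed, irradiance) tuples into the hypertable."""
        with psycopg2.connect(DSN) as conn, conn.cursor() as cur:
            cur.execute(DDL)  # create the table and register it as a hypertable
            cur.executemany(
                "INSERT INTO weather_raw (ts, site_id, wind_speed, irradiance) "
                "VALUES (%s, %s, %s, %s)",
                rows,
            )

The create_hypertable call is what turns a plain PostgreSQL table into a time-partitioned TimescaleDB hypertable, which suits the high-volume time-series inputs described above.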


Tech stack: Python, SQL, PostgreSQL, TimescaleDB, AWS, PySpark or Dask

Location: Kraków, Lublin, Remote

Salary:

B2B: 24 000 - 28 000 PLN Net + VAT

CoE: 19 000 - 23 000 PLN Gross

Requirements

  • Bachelor's degree in computer science or a related technical field
  • Experience working with Python and SQL
  • Hands-on experience working with PostgreSQL, TimescaleDB and AWS S3
  • A detailed understanding of ETL and data virtualization
  • Comfortable with frameworks such as PySpark and Dask
  • Applied experience managing databases in the cloud
  • Experience with data lakes, AWS, and Airflow/Dagster is a plus


Responsibilities

  • Analyze, develop, and maintain datasets in close cooperation with our Data Science team
  • Create efficient data pipelines for machine learning models
  • Help set up a data lake and ETL pipeline to handle data management (a minimal sketch follows this list)
  • Improve data quality and efficiency
  • Continuously monitor and visualize data
  • Act as the company's database administrator, ensuring state-of-the-art database management
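
As a sketch of the data lake and ETL responsibility above (illustrative only; the bucket paths and the forecast_mw column are hypothetical), a small Dask job could aggregate raw per-site production forecasts stored as Parquet on S3 into an hourly feature table for the Data Science team:

    import dask.dataframe as dd  # assumes dask[dataframe] and s3fs are installed

    # Hypothetical data-lake layout, for illustration only.
    RAW = "s3://example-data-lake/raw/production_forecasts/*.parquet"
    OUT = "s3://example-data-lake/curated/production_forecasts_hourly/"

    def build_hourly_features():
        """Aggregate raw per-site forecasts to hourly means and write them back to the lake."""
        df = dd.read_parquet(RAW)
        hourly = (
            df.assign(hour=df["ts"].dt.floor("1h"))      # bucket timestamps by hour
              .groupby(["site_id", "hour"])[["forecast_mw"]]
              .mean()                                    # hourly mean per site
              .reset_index()
        )
        hourly.to_parquet(OUT, write_index=False)

    if __name__ == "__main__":
        build_hourly_features()

The same code runs unchanged on a laptop or a cluster, which is one reason frameworks like Dask or PySpark appear in the stack above.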

Benefits

Values & Atmosphere

  • flexible attitude (including working hours)
  • international business trips
  • social events & awaydays
  • support for your ideas

Personal development

  • external conferences
  • technical & soft skills training
  • switching between projects/technology
  • English classes
  • internal library

Health & Relax

  • Employee Wellbeing Platform
  • private health care
  • multisport card
  • sports events
  • chill-out room
  • fresh fruits & juicer

Knowledge & Culture

  • open source initiatives
  • CodeteCON
  • R&D department


