Data Engineer with Cloud

Kraków, PL, 30-302

GFT Technologies

We support our clients with state-of-the-art IT solutions, technologies and consulting in their digital transformation.


Why join GFT?


You will work with and learn from top IT experts. You will join a crew of experienced engineers: 60% of our employees are senior level.


Interested in the cloud? You will enjoy our full support in developing your skills: training programs, certifications and our internal community of experts. We have strong partnerships with the top cloud providers: Google, Amazon and Microsoft; we are number one in Poland in the number of GCP certifications. Apart from GCP, you can also develop your skills in AWS or Azure.


We are focused on development and knowledge sharing. Internal expert communities provide a comfortable environment where you can develop your skillset in areas such as blockchain, Big Data, cloud computing or artificial intelligence.


You will work in a stable company (32 years on the market) on demanding and challenging projects for the biggest financial institutions in the world.


What will you do?


  • Design, build, test and deploy cloud and on-premise data models and transformations in a cloud-native or dedicated toolset
  • Develop ETL processes
  • Optimize data views for specific visualization use cases, using schema design, partitions, indexes, down-sampling, archiving, etc. to manage trade-offs such as performance versus flexibility (a minimal sketch follows this list)
  • Review, refine, interpret and implement business and technical requirements
  • Contribute to ongoing productivity and prioritization by refining User Stories, Epics and Backlogs in Jira
  • Onboard new data sources; design, build, test and deploy cloud data ingestion, pipelines, warehouses and data models/products
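To give a flavour of this kind of work, here is a minimal, hypothetical PySpark sketch of such a pipeline: it down-samples raw events into a date-partitioned daily aggregate suitable for a dashboard view. All table names, column names and paths are invented for illustration; they are not from an actual GFT project.

    # Hypothetical sketch only: down-sample raw events into a date-partitioned
    # daily aggregate for a dashboard view. Paths and column names
    # ("raw_events", "event_timestamp", ...) are invented for illustration.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("daily-events-etl").getOrCreate()

    # Ingest: read the raw event data (placeholder path).
    raw = spark.read.parquet("s3://example-bucket/raw_events/")

    # Transform: collapse event-level rows to one row per day and event type.
    daily = (
        raw.withColumn("event_date", F.to_date("event_timestamp"))
           .groupBy("event_date", "event_type")
           .agg(F.count("*").alias("event_count"))
    )

    # Load: partitioning by date speeds up date-filtered dashboard queries at
    # the cost of a more rigid layout, one of the trade-offs mentioned above.
    (daily.write
          .mode("overwrite")
          .partitionBy("event_date")
          .parquet("s3://example-bucket/daily_events/"))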

Your skills:


  • Openness to working in a hybrid model (2 days per week from the office)
  • At least 2 years of commercial experience as a Data Engineer
  • Strong SQL, Python and Spark/PySpark skills
  • Experience with Data Modelling
  • Understanding of key concepts around Data Warehousing, Data Lakes and Data Lakehouses
  • Experience with at least one of the major commercial clouds (GCP, AWS, Azure)
  • Experience with a cloud data engineering toolset

Nice to have:


  • Experience with Business Intelligence tools
  • Snowflake
  • Databricks

We offer you:


  • Working in a highly experienced and dedicated team
  • Competitive salary and an extra benefits package that can be tailored to your personal needs (private medical coverage, sport & recreation package, lunch subsidy, life insurance, etc.)
  • Permanent contract or B2B contract
  • Online training and certifications fitted to your career path
  • Free online foreign language lessons
  • Regular social events
  • Access to an e-learning platform
  • Ergonomic and functional working space with 2 monitors

Category: Engineering Jobs

Tags: AWS Azure Big Data Blockchain Business Intelligence Databricks Data Warehousing Engineering ETL GCP Jira Pipelines PySpark Python Snowflake Spark SQL

Perks/benefits: Career development Competitive pay Health care Team events

Region: Europe
Country: Poland
