Data Engineer

Krakow, Lesser Poland, Poland

Ocado Technology


 

Job Purpose 

As a Data Engineer at Ocado Technology, you implement data processing workflows. In this position, your responsibilities will include designing and building data marts that serve specific business lines. This gives you a unique opportunity to work with, and influence how, top Google Cloud Platform technologies (BigQuery, Dataflow, Cloud Storage) evolve - the same technologies Google uses internally to process extreme amounts of data.


You are responsible not only for the development, maintenance and support of scalable, well-monitored batch data processing workflows, but also for driving the change towards streaming data processing.


Like any member of the team, you will also be expected to work closely with the product manager to identify, scope and plan projects, and to communicate advanced technical issues and findings effectively to a range of technical and non-technical internal audiences. You are also an advocate of the data-driven company paradigm and will be able to evangelise data processing best practices to others.

 

Roles & Responsibilities 

You have responsibility for: 

  • Designing and building data marts for business stakeholders
  • Building streaming data transformation pipelines
  • Developing tools that help:
    • create data marts
    • build streaming data transformations
    • orchestrate data monitoring
  • Creating company-wide best practices in data transformation
  • Evangelising a data-driven / data-informed organisation
  • Constantly looking for improvements and learning from our mistakes
  • Working closely with team leaders to set the technical direction within the team and to ensure the software developed is of a consistently high standard, cohesive, and meets business objectives
  • Assisting team leaders to ensure projects are accurately estimated and delivered to schedule
  • Working closely with the Team Leader on issues related to requirements and design
  • Mentoring colleagues and taking an active role in their development
  • Research, development & optimisation
  • Participating effectively in all stages of systems development, including:
    • requirements gathering
    • design of systems and data
    • software construction (coding)
    • deployment of production-ready code
    • maintenance and support of existing systems
    • investigation and resolution of production issues
    • performance tuning
    • code review and change control

You may be asked to perform other tasks reasonably requested by management. This job description is a summary of the typical functions of the role, not an exhaustive or comprehensive list of possible responsibilities, tasks and duties, and is subject to review. The responsibilities, tasks and duties of the jobholder might differ from those outlined here, and other duties, as assigned, might form part of the job.

 

Knowledge, Skills and Experience

ESSENTIAL  
  • Knowledge of Java 
  • Ability to write and optimise SQL queries 
  • Knowledge of databases and best engineering practices 
  • Data structuring and design skills 
  • Strong knowledge of distributed computing 
DESIRABLE
  • Previous experience working with data sets measured in terabytes 
  • Knowledge of data warehousing principles 
  • Passion for Big Data 
  • Knowledge of Google Cloud Platform (Compute Engine, BigQuery, Dataflow, Dataproc, Cloud Storage) 
  • Good understanding of data protection issues 
REQUIRED COMPETENCIES 
  • Technical Excellence: demonstrates intellectual rigour, possesses relevant abilities & is able to pick up new skills quickly
  • Innovation & Problem Solving: able to solve complex problems, participates in continuous improvement, adapts the ideas of others
  • Productivity, Drive & Achievement: proactive approach, gets things done, demonstrates accountability & ownership, prioritises own workload
  • Business awareness: ability to apply learned skills, awareness beyond immediate area/role
  • Adaptability: working under pressure, flexible, positive & focused during times of change
  • Communication & Impact: strong verbal and written communication in English and Polish. Robust interaction with internal clients
  • Teamwork: works well with others & actively contributes towards team objectives

 

Tags: Big Data BigQuery Dataflow Dataproc Data Warehousing Engineering GCP Google Cloud Pipelines Research SQL Streaming

Perks/benefits: Flex hours

Region: Europe
Country: Poland
Category: Engineering Jobs
