Google Pillar | Mid/Senior Data Engineer

Lisbon, Portugal

Devoteam

Company Description

At Devoteam, we believe that technology with strong human values can actively drive change for the better. Discover how Tech for People unlocks the future, creating a positive impact on the people and the world around us.

We are a leading global player in Digital Transformation for major organisations across EMEA, with a revenue of €652M. We believe in transforming technology to create value for our clients, partners and employees in a world where technology is developed for people. We are proud of the culture we have built together, of our people at the service of technology, and of our diverse environment. Because we are #TechforPeople.

Join our multidisciplinary team of Cloud experts, Designers, Business consultants, Security experts, Engineers, Developers and other extraordinary talents, spread across more than 18 EMEA countries. Become one of our more than 8,000 tech and business leaders in cloud, data and cybersecurity. Let's fuse creativity with technology and build innovative solutions that actively change things for the better.

Job Description

Our Devoteam G Cloud team is looking for Google Cloud Data Engineers to join our Google Cloud Platform specialists.

  • Delivering data projects, with a focus on the engineering component;
  • Working with GCP data services such as BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub and Dataplex;
  • Writing efficient SQL queries;
  • Developing data processing pipelines using programming frameworks such as Apache Beam;
  • Automating data engineering tasks;
  • Building and managing data pipelines, with a deep understanding of workflow orchestration, task scheduling and dependency management;
  • Integrating and streaming data, including ingestion from various sources (such as databases, APIs or logs) into GCP.
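To illustrate the dependency-management side of the role, the sketch below resolves a small task graph into a valid execution order using only Python's standard library. The task names and graph are hypothetical examples; in practice an orchestrator such as Apache Airflow or Cloud Composer expresses and runs these dependencies.

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical daily-load task graph: each task maps to the set of
# tasks that must complete before it can run.
tasks = {
    "extract_api": set(),
    "extract_db": set(),
    "transform": {"extract_api", "extract_db"},
    "load_bigquery": {"transform"},
    "data_quality_check": {"load_bigquery"},
}

def run_order(graph):
    """Return one execution order that respects all task dependencies."""
    return list(TopologicalSorter(graph).static_order())

if __name__ == "__main__":
    # Both extracts appear before transform, which precedes the load
    # and the final quality check.
    print(run_order(tasks))
```

An orchestrator does the same resolution, then adds scheduling, retries and backfills on top of the ordering.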

Qualifications

  • Bachelor's degree in IT or a similar field;
  • More than 3 years of professional experience, with expertise in delivering Data Engineering projects;
  • Experience with GCP data services: BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub and Dataplex;
  • Knowledge of programming languages: Python, Java, or SQL;
  • Experience with orchestration tools such as Apache Airflow, Google Cloud Composer, or Cloud Data Fusion;
  • Knowledge of streaming data processing using tools like Apache Kafka;
  • GCP certifications (nice to have): Professional Data Engineer, Professional Cloud Database Engineer and/or Associate Cloud Engineer;
  • Proficiency in English (written and spoken).

Tags: Airflow APIs BigQuery Dataflow Data pipelines Dataproc Engineering GCP Google Cloud Java Kafka Pipelines Python Security SQL Streaming

Region: Europe
Country: Portugal
Category: Engineering Jobs
