Data Engineer

Prague, Czech Republic

Applications have closed

dentsu international

At dentsu, innovation is our strength, and your growth is our mission. We help you keep up with technological changes in the digital economy.


Company Description

We Dream. We Do. We Deliver.

As a full-service, data-driven customer experience transformation agency, we partner with Top 500 companies in the DACH region and in Eastern Europe. Originally from Switzerland, Merkle DACH was created out of a merger of Namics and Isobar, two leading full-service digital agencies.

Our 1200+ digital enthusiasts are innovating the way brands are built by providing expertise in Digital Transformation strategy, MarTech platforms, Creativity, UX, CRM, Data, Commerce, Mobile, Social Media, Intranet and CMS. We are part of the global Merkle brand, the largest brand within the dentsu group, which shares with us a network of over 66,000 passionate individuals in 146 countries.

Job Description

  • Design and implement data ingestion and processing of various data sources using public cloud (MS Azure, AWS, GCP) big data technologies like Databricks, AWS Glue, Azure Data Factory, Redshift, Kafka, Azure Event Hubs, AWS Step Functions, AWS Lambda, Azure Functions etc. (a minimal ingestion sketch follows this list).
  • Collaborate with Business Intelligence consultants and assemble large and complex data sets that meet functional / non-functional business requirements for the data lakehouse.
  • Support data scientist / analyst teams in deployment and optimization of AI / Machine Learning models and other data algorithms in services like AWS SageMaker or Azure ML.
  • Develop data pipelines to provide actionable insights into marketing automation, customer acquisition and other key business areas.
  • Develop DevOps automation of continuous development / test / deployment processes.
  • Document implemented data pipelines and logic in a structured manner using Confluence, plan your activities using Agile methodology in Jira.
  • Work with stakeholders to assist with data-related technical issues and support their data infrastructure needs, like optimizing existing data delivery, re-designing infrastructure for greater scalability, etc.
  • Support pre-sales by proposing technical solutions and accurate effort estimates.
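
To give a flavour of the ingestion work described above (the storage path, table and column names below are purely illustrative and not part of the role), a minimal PySpark sketch for landing raw events in a lakehouse table could look like this:

```python
# Minimal PySpark sketch: batch-ingest raw JSON events from a cloud landing
# zone into a Delta Lake table. Paths, table and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest_events").getOrCreate()

# Read raw JSON files from a (hypothetical) landing zone
raw = spark.read.json("abfss://landing@examplestorage.dfs.core.windows.net/events/")

cleaned = (
    raw
    .withColumn("event_ts", F.to_timestamp("event_ts"))  # normalise timestamps
    .withColumn("ingest_date", F.current_date())          # partition column
    .dropDuplicates(["event_id"])                          # basic de-duplication
)

# Append to a bronze-layer Delta table, partitioned by ingest date
(
    cleaned.write
    .format("delta")
    .mode("append")
    .partitionBy("ingest_date")
    .saveAsTable("lakehouse.bronze_events")
)
```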

Qualifications

  • Experience in building and productionizing big data architectures, pipelines and data sets.
  • Understanding of data concepts and patterns such as big data, data lake, lambda architecture, stream processing, DWH, BI & reporting.
  • 2+ years of experience in a Data Engineer role, with experience using the following software/tools:
    • Experience with big data tools like Hadoop, Spark, Kafka, etc. (an illustrative streaming sketch follows this list).
    • Experience with object-oriented / functional / scripting languages like Python, Scala, Java, R, C++, Bash, PowerShell etc.
    • Experience with MS Azure (Databricks, Data Factory, Data Lake, Azure SQL, Event Hub, etc.) or AWS (Glue, EC2, EMR, RDS, Redshift, SageMaker, etc.) cloud services.
    • Experience implementing large-scale data/event-oriented pipelines/workflows using ETL tools.
    • Extensive working experience with relational (MS SQL, Oracle, Postgres, Snowflake, etc.) and NoSQL databases (Cassandra, MongoDB, Elasticsearch, Redis, etc.).
  • Strong analytical skills related to working with structured and unstructured datasets.
  • Experience building processes supporting data transformation, data structures, metadata, dependency and workload management.
  • Experience in setting up and using CI/CD automation tools like Azure DevOps, AWS CodePipeline, etc.
  • A person who is precise, well organized, has good communication skills, can adapt to changing circumstances and is not afraid to take responsibility for their work will do great in this role.
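
As a rough illustration of the stream-processing experience mentioned above (the broker address, topic, schema and table names are hypothetical), a minimal Spark Structured Streaming job that consumes events from Kafka into a Delta table could look like this:

```python
# Minimal Spark Structured Streaming sketch: consume JSON events from Kafka
# and append them to a Delta table. All names and paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("stream_events").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("user_id", StringType()),
    StructField("event_ts", TimestampType()),
])

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "web-events")                  # hypothetical topic
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/web_events")  # hypothetical path
    .outputMode("append")
    .toTable("lakehouse.silver_web_events")                        # hypothetical table
)
query.awaitTermination()
```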

Preferred Skills

  • Deep hands-on development experience in MS Azure or AWS environments
  • Past experience delivering business intelligence projects using tools like Power BI, Tableau, Qlik Sense, Keboola…
  • Working knowledge of message queuing, stream processing, and highly scalable real-time data processing using technologies like Storm, Spark Streaming, etc.
  • Experience with data pipeline / workflow management tools like AWS Glue, Azure Data Factory, Airflow, AWS Step Functions, NiFi, etc. (an illustrative Airflow sketch follows this list).
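
For a flavour of the workflow-management tooling listed above (the DAG id and task logic are placeholders only), a minimal Airflow DAG wiring an extract / transform / load chain could look like this:

```python
# Minimal Airflow sketch: a daily extract -> transform -> load DAG.
# The DAG id and the task bodies are placeholders for illustration only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull raw data from the source system")


def transform():
    print("clean and enrich the extracted data")


def load():
    print("write the result to the warehouse / lakehouse")


with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```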

Additional Information

Why should you work with us?

  • We cooperate with clients of all types and sizes from all around the world, which allows us to offer our people a very diverse portfolio of projects and technologies to choose from.
  • You will be part of a startup-minded branch of a company with the very strong and stable backing of our Swiss/German headquarters.
  • Your contributions will have a distinct impact on our clients.

And we offer even more:

  • ⛺ 5 weeks of vacation + 3 wellness days
  • ❤️ 2 volunteering days to share the kindness of your heart with others
  • ⏰ Flexible working hours and home office
  • 🎯 Fully covered certifications in Salesforce, Adobe, Microsoft, etc.
  • 🎓 Full access to Dentsu Academy, LinkedIn Learning, on-site learning sessions
  • 🐶 Pet-friendly offices
  • 💌 Edenred meal and cafeteria points
  • 🍹 Team events: company parties, monthly breakfasts, and pub quizzes
  • 🥪 Snacks and drinks at the office
  • 💸 Referral bonus programme
  • 💻 Laptop + equipment
  • 📞 Corporate mobile phone subscription

Tags: Agile Airflow Architecture AWS Azure Big Data Business Intelligence Cassandra CI/CD Databricks Data pipelines DevOps EC2 Elasticsearch ETL GCP Hadoop Jira Kafka Lambda Machine Learning ML models MongoDB MS SQL NoSQL Oracle Pipelines PostgreSQL Power BI Python Qlik R Redshift SageMaker Scala Snowflake Spark SQL Streaming Tableau UX

Perks/benefits: Career development Flex hours Flex vacation Gear Pet friendly Salary bonus Startup environment Team events

Region: Europe
Country: Czechia
Category: Engineering Jobs
