SRE Data Engineer

Thane, India

Applications have closed

dentsu international

At dentsu, innovation is our strength, and your growth is our mission. We help you keep up with technological changes in the digital economy.


Company Description

We Dream. We Do. We Deliver.

As a full-service, data-driven customer experience transformation agency, we partner with Top 500 companies in the DACH region and in Eastern Europe. Originally from Switzerland, Merkle DACH was created out of a merger of Namics and Isobar - two leading full-service digital agencies.

Our 1200+ digital enthusiasts are innovating the way brands are built, through providing expertise in Digital Transformation strategy, MarTech platforms, Creativity, UX, CRM, Data, Commerce, Mobile, Social Media, Intranet and CMS. We are part of the global Merkle brand, the largest brand within the dentsu group, who shares with us a network of over 66,000 passionate individuals in 146 countries.

Job Description

Total Experience : 5 to 8 Years

Required Skills:

  • Knowledge of defining and monitoring system quality measures, including SLOs/SLIs and SLAs
  • Expert knowledge in designing, developing, and managing large real-time systems
  • Project and process management
  • Prior successful experience as a systems performance or site/systems reliability engineer
  • Mastery of Hadoop: managing HDFS, YARN, and Hive
  • Experience in managing Spark jobs, Airflow DAGs, and Kafka-based data pipelines
  • Experience in managing and deploying containerized applications with Kubernetes or similar
  • Experience with at least one tooling language: Python, Java, or another JVM language
  • Excellent troubleshooting, problem-solving, critical-thinking, and communication skills
  • Good understanding of Unix/Linux-based operating systems; proficient with Linux, command-line tools, and general system debugging
  • Knowledge of microservice architecture
  • Comfortable working with global teams across different time zones

Key Responsibilities:

  • Background in development or DevOps.
  • At least 5 years of hands-on experience with Hadoop and Kubernetes.
  • Good understanding of object-oriented programming concepts.
  • Hands-on coding experience with Python 3 and Linux.
  • Open to learning new tools and technologies on the job.
  • Leadership acumen.
  • Agile and Lean thinking mindset.
  • Rational approach to solving problems.
  • Comfortable writing code in at least one programming language.

Qualifications

  • Bachelor's degree or work experience that demonstrates comparable abilities

Tags: Agile Airflow Architecture DevOps Hadoop HDFS Kafka Kubernetes Linux OOP Pipelines Python Spark UX

Region: Asia/Pacific
Country: India
Category: Engineering Jobs
