Senior Associate Data Engineering L2_DE-Big Data_GCP

Bengaluru, India

Applications have closed

Publicis Groupe

Company Description

Publicis Sapient is a digital transformation partner helping established organizations get to their future, digitally-enabled state, both in the way they work and the way they serve their customers. We help unlock value through a start-up mindset and modern methods, fusing strategy, consulting and customer experience with agile engineering and problem-solving creativity. United by our core values and our purpose of helping people thrive in the brave pursuit of next, our 20,000+ people in 53 offices around the world combine experience across technology, data sciences, consulting and customer obsession to accelerate our clients’ businesses through designing the products and services their customers truly value.

Job Description

As a Senior Associate L2 in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions, applying a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions.

  • You will independently drive design discussions to ensure the overall health of the solution.
  • The role requires a hands-on technologist with a strong programming background in Java, Scala, or Python, experience in data ingestion, integration, wrangling, computation and analytics pipelines, and exposure to Hadoop ecosystem components.
  • You are also required to have hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms.

Role & Responsibilities:

  • Your role is focused on the design, development and delivery of solutions involving:
  • Data Integration, Processing & Governance
  • Data Storage and Computation Frameworks, Performance Optimizations, Analytics & Visualizations
  • Infrastructure & Cloud Computing
  • Data Management Platforms
  • Implement scalable architectural models for data processing and storage
  • Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time mode
  • Build functionality for data analytics, search and aggregation

Qualifications

  • Overall 5+ years of IT experience with 3+ years in data-related technologies
  • Minimum 2.5 years of experience in Big Data technologies and working exposure to related data services on at least one cloud platform (AWS / Azure / GCP)
  • Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow and other components required to build end-to-end data pipelines
  • Strong experience in at least one of the programming languages Java, Scala, or Python; Java preferred
  • Hands-on working knowledge of NoSQL and MPP data platforms like HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.
  • Working knowledge of data platform-related services on at least one cloud platform, including IAM and data security

Competencies

1. Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres), with hands-on experience

2. Knowledge of data governance processes (security, lineage, catalog) and tools such as Collibra, Alation, etc.

3. Knowledge of distributed messaging frameworks like ActiveMQ / RabbitMQ / Solace, search and indexing, and microservices architectures

4. Performance tuning and optimization of data pipelines

5. CI/CD – infrastructure provisioning on the cloud, automated build and deployment pipelines, code quality

6. Cloud data specialty and other related Big Data technology certifications

Personal Attributes:

  • Strong written and verbal communication skills
  • Articulation skills
  • Good team player
  • Self-starter who requires minimal oversight
  • Ability to prioritize and manage multiple tasks
  • Process orientation and the ability to define and set up processes

Additional Information

  • Gender Neutral Policy
  • 18 paid holidays throughout the year for NCR/BLR (22 for Mumbai)
  • Generous parental leave and new parent transition program
  • Flexible work arrangements
  • Employee Assistance Programs to support your wellness and well-being


Tags: Agile Airflow Architecture AWS Azure Big Data BigQuery Cassandra CI/CD Consulting Data Analytics Data governance Data management Data pipelines Engineering ETL Flink GCP Hadoop HBase HDFS Informatica Kafka MongoDB MPP MySQL NoSQL Oozie Oracle Pipelines PostgreSQL Pulsar Python Redshift Scala Security Spark SQL Streaming Talend

Perks/benefits: Career development Flex hours Flex vacation Health care Parental leave Wellness

Region: Asia/Pacific
Country: India
