Senior Data Engineer

Bengaluru, Delhi, Gurgaon, Kolkata, Chennai, Hyderabad, Pune, Indore, Jaipur and Ahmedabad

Applications have closed

Srijan Technologies

Srijan is a digital experience services company that helps organizations, from global Fortune 500s to nonprofits, build transformative digital paths to a better future.

Location: Bengaluru, Delhi, Gurgaon, Kolkata, Chennai, Hyderabad, Pune, Indore, Jaipur and Ahmedabad

About Material

Material is a global strategy, insights, design, and technology partner to companies striving for true customer centricity and ongoing relevance in a digital-first, customer-led world. By leveraging proprietary, science-based tools that enable human understanding, we inform and create customer-centric business models and experiences, and deploy measurement systems, to build transformational relationships between businesses and the people they serve.

About Srijan

Srijan is a global engineering firm that builds transformative digital paths to better futures for organizations ranging from Fortune 500 enterprises to nonprofits all over the world. Srijan brings advanced engineering capabilities and agile practices to some of the biggest names across FMCG, aviation, telecom, technology, and other industries. We help businesses embrace the digital future with cloud, data, API, and platform-centric technologies and adapt to changing business models and market demands. Srijan leads in Drupal with 350+ Drupal engineers and 80+ Acquia certifications. Srijan is also a Drupal Enterprise Partner and Diamond Certified.

Job description

The Senior Data Engineer will design, develop, monitor, and maintain a robust and scalable data platform used by data analyst and engineering teams to deliver powerful insights to both internal and external stakeholders. This role will create abstractions to speed the platform's adoption and build reliable pipelines to support growing data processing and analytics needs. Ideally, this person will be a self-starter, detail- and quality-oriented, and excited about the prospect of having a big impact with data.

Responsibilities

  • Design, build and maintain data infrastructure that powers both batch and real-time processing of billions of records a day.
  • Improve the data quality and reliability of data pipelines through monitoring, validation and failure detection.
  • Design, build and maintain a central data cataloging system to ease integration and discovery of datasets.
  • Develop data pipelines that provide fast, optimized, and robust end-to-end solutions.
  • Bring strong skills in data modeling, data integration, data cleansing, and data mapping: design effective data models to support the storage and processing of data, extract data from a variety of sources, transform it as needed, load it into a target system, and identify and correct errors along the way (a brief sketch follows this list).
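
For illustration only, the following is a minimal sketch of the kind of extract-transform-load flow these responsibilities describe, assuming a CSV source and a SQLite target; the file, table, and column names are hypothetical and do not reflect the team's actual stack.

```python
import csv
import sqlite3

# Hypothetical source file, target database, and table names -- illustration only.
SOURCE_CSV = "orders.csv"      # assumed columns: order_id, customer_id, amount
TARGET_DB = "warehouse.db"
TARGET_TABLE = "orders_clean"


def extract(path):
    """Read raw rows from the CSV source."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)


def transform(rows):
    """Cleanse and map rows: trim text, drop incomplete records, coerce types."""
    for row in rows:
        order_id = (row.get("order_id") or "").strip()
        customer_id = (row.get("customer_id") or "").strip()
        amount_raw = (row.get("amount") or "").strip()
        if not (order_id and customer_id and amount_raw):
            continue  # drop records with missing required fields
        try:
            amount = float(amount_raw)
        except ValueError:
            continue  # drop records with malformed amounts
        yield (order_id, customer_id, amount)


def load(records, db_path):
    """Load cleansed records into the target table, replacing duplicates by key."""
    conn = sqlite3.connect(db_path)
    try:
        conn.execute(
            f"CREATE TABLE IF NOT EXISTS {TARGET_TABLE} "
            "(order_id TEXT PRIMARY KEY, customer_id TEXT, amount REAL)"
        )
        cur = conn.executemany(
            f"INSERT OR REPLACE INTO {TARGET_TABLE} VALUES (?, ?, ?)", records
        )
        conn.commit()
        return cur.rowcount
    finally:
        conn.close()


if __name__ == "__main__":
    loaded = load(transform(extract(SOURCE_CSV)), TARGET_DB)
    print(f"Loaded {loaded} cleansed records into {TARGET_TABLE}")
```

In a production pipeline the same extract, transform, and load boundaries would typically be orchestrated and monitored so that validation failures surface as alerts rather than silently dropped rows.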

Skills and Experience

  • 4+ years of relevant industry experience in Data Engineering, working with large-scale, data-driven systems.
  • Expert-level experience designing data schemas and fine-tuning SQL queries over large, complex data sets (a small illustrative example follows this list).
  • 3+ years of programming experience with Python and SQL, along with AWS infrastructure, Git, and CI/CD best practices.
  • Experience as an ETL developer responsible for designing and implementing efficient data pipelines to extract, transform, and load data from various sources into a data warehouse.
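
As a toy illustration of the schema design and SQL tuning expectation above, the sketch below uses SQLite's built-in EXPLAIN QUERY PLAN to check that a query hits an index rather than scanning the whole table; the table, columns, and index are hypothetical and only stand in for a much larger warehouse schema.

```python
import sqlite3

# In-memory database for illustration; table, columns, and index are hypothetical.
conn = sqlite3.connect(":memory:")

conn.executescript("""
    CREATE TABLE events (
        event_id   INTEGER PRIMARY KEY,
        user_id    INTEGER NOT NULL,
        event_type TEXT    NOT NULL,
        created_at TEXT    NOT NULL
    );
    -- Composite index chosen to match the most common filter-and-sort pattern.
    CREATE INDEX idx_events_user_time ON events (user_id, created_at);
""")

# EXPLAIN QUERY PLAN reports whether the query can use the index instead of a
# full table scan -- a quick sanity check when tuning queries over large data sets.
plan = conn.execute("""
    EXPLAIN QUERY PLAN
    SELECT event_type, created_at
    FROM events
    WHERE user_id = ?
    ORDER BY created_at DESC
    LIMIT 100
""", (42,)).fetchall()

for row in plan:
    print(row)

conn.close()
```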

What you will get:

  1. Competitive salaries with flexi benefits
  2. Group Mediclaim Insurance and Personal Accident Policy
  3. 30+ paid leaves in a year
  4. Quarterly Learning and Development budget for certifications

Tags: Agile APIs AWS CI/CD Data pipelines Data quality Data warehouse Engineering ETL Git Pipelines Python SQL

Perks/benefits: Career development

Region: Asia/Pacific
Country: India
Category: Engineering Jobs
