ETL & DB Engineer

Hyderabad

The CDP ETL & Database Engineer will specialize in architecting, designing, and implementing solutions that are 
sustainable and scalable. The ideal candidate will understand CRM methodologies, bring an analytical mindset, and 
have a background in relational modeling within a hybrid architecture. 
The candidate will help drive the business towards specific technical initiatives and will work closely with the 
Solutions Management, Delivery, and Product Engineering teams. The candidate will join a team of developers across 
the US and India. 
Responsibilities:
• ETL Development – The CDP ETL & Database Engineer will be responsible for building pipelines to feed 
downstream data processes. They will be able to analyze data, interpret business requirements, and establish 
relationships between data sets. The ideal candidate will be familiar with different encoding formats and file 
layouts such as JSON and XML (see the sketch after this list).
• Implementations & Onboarding – The engineer will work with the team to onboard new clients onto the ZMP/CDP+ 
platform. The candidate will solidify business requirements, perform ETL file validation, establish users, perform 
complex aggregations, and syndicate data across platforms. The hands-on engineer will take a test-driven 
approach towards development and will be able to document processes and workflows.
• Incremental Change Requests – The CDP ETL & Database Engineer will be responsible for analyzing change 
requests and determining the best approach towards implementation and execution of the request. This 
requires the engineer to have a deep understanding of the overall architecture of the platform. Change requests 
will be implemented and tested in a development environment to ensure the change will not have a negative 
impact on downstream processes.
• Change Data Management – The candidate will adhere to change data management procedures and actively 
participate in CAB meetings where change requests are presented and approved. Prior to introducing a 
change, the engineer will ensure that processes are running in a development environment. The engineer will 
be asked to perform peer-to-peer code reviews as well as solution reviews prior to production code deployment.
• Collaboration & Process Improvement – The engineer will participate in knowledge-share sessions, engaging 
with peers to discuss solutions, best practices, overall approach, and process. The candidate will look for 
opportunities to streamline processes, with an eye towards building a repeatable model that reduces 
implementation duration.
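
As a purely illustrative aside on the ETL Development responsibility above, the following minimal Python sketch 
shows one way to normalize customer records arriving as either JSON or XML into a common row layout. The file 
paths, element names, and field names are assumptions for the example, not details of the actual ZMP/CDP+ feeds.

import json
import xml.etree.ElementTree as ET
from pathlib import Path

def rows_from_json(path: Path) -> list[dict]:
    # Parse a JSON array of customer objects into flat rows (hypothetical fields).
    with path.open() as f:
        records = json.load(f)
    return [
        {"customer_id": r.get("id"), "email": r.get("email"), "source": "json"}
        for r in records
    ]

def rows_from_xml(path: Path) -> list[dict]:
    # Parse <customer id="..."><email>...</email></customer> elements into flat rows.
    tree = ET.parse(path)
    return [
        {"customer_id": c.get("id"), "email": c.findtext("email"), "source": "xml"}
        for c in tree.getroot().iter("customer")
    ]

def load_file(path: Path) -> list[dict]:
    # Dispatch on file extension so downstream steps always see one layout.
    return rows_from_json(path) if path.suffix == ".json" else rows_from_xml(path)

In practice this kind of dispatch would also cover delimited layouts and perform the ETL file validation mentioned 
under Implementations & Onboarding.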
Job Requirements:
• The CDP ETL & Database Engineer will be well versed in the following areas:
o Relational data modeling
o ETL and FTP concepts 
o Advanced analytics using SQL functions 
o Cloud technologies – AWS, Snowflake (see the sketch after this list) 
• Able to decipher requirements, provide recommendations, and implement solutions within predefined 
timeframes. 
• Able to work independently, while also contributing effectively in a team setting. 
• The engineer will be able to confidently communicate status, raise exceptions, and voice concerns to their direct 
manager.
• Participate in internal client project status meetings with the Solution/Delivery management teams.
• When required, collaborate with the Business Solutions Analyst (BSA) to solidify requirements. 
• Ability to work in a fast-paced, agile environment; the individual will work with a sense of urgency when 
escalated issues arise. 
• Strong communication and interpersonal skills, with the ability to multitask and prioritize workload based on 
client demand. 
• Familiarity with Jira for workflow management and time allocation.
• Familiarity with the Scrum framework: backlog, planning, sprints, story points, retrospectives, etc. 
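
To make the "advanced analytics using SQL functions" and Snowflake items above concrete, here is a small, 
hypothetical Python sketch that deduplicates a staging table with a window function through the Snowflake Python 
connector. The account, credentials, and table/column names are placeholders, not details taken from this role.

import snowflake.connector  # pip install snowflake-connector-python

# Placeholder connection details; in practice these would come from a secrets
# store such as AWS Systems Manager Parameter Store (listed under Required Skills).
conn = snowflake.connector.connect(
    account="my_account",   # hypothetical
    user="etl_user",        # hypothetical
    password="***",         # hypothetical
    warehouse="ETL_WH",
    database="CDP_DB",
    schema="STAGING",
)

# Keep only the most recent record per customer using ROW_NUMBER().
DEDUPE_SQL = """
SELECT customer_id, email, updated_at
FROM (
    SELECT customer_id, email, updated_at,
           ROW_NUMBER() OVER (
               PARTITION BY customer_id
               ORDER BY updated_at DESC
           ) AS rn
    FROM customer_staging
)
WHERE rn = 1
"""

cur = conn.cursor()
try:
    cur.execute(DEDUPE_SQL)
    latest_rows = cur.fetchall()
finally:
    cur.close()
    conn.close()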
Required Skills:
• ETL – ETL tools such as 
o Talend (Preferred, not required) 
o DMExpress – Nice to have
o Informatica – Nice to have
• Database – Hands-on experience with the following database technologies:
o Snowflake (Required) 
o Oracle 11 or higher – Nice to have
o SQL Server SSIS – Nice to have
o MySQL/PostgreSQL – Nice to have 
o Familiar with NoSQL DB methodologies (Nice to have) 
• Programming Languages – Can demonstrate knowledge of any of the following:
o PL/SQL 
o JavaScript – Strong plus 
o Python - Nice to have
o Scala - Nice to have
• AWS – Knowledge of the following AWS services:
o S3 
o EMR (Concepts) 
o EC2 (Concepts) 
o Systems Manager / Parameter Store 
• Understands JSON data structures and key-value pairs.
• Working knowledge of code repositories such as Git, WinCVS, SVN. 
• Workflow management tools such as Apache Airflow, Automic/Appworx (see the sketch after this list) 
• Jira
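
As an illustration of the workflow-management tooling mentioned above, here is a minimal, hypothetical Airflow 2.x 
DAG wiring an extract/transform/load sequence. The DAG name, schedule, and task bodies are assumptions for the 
example rather than an actual CDP pipeline.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator  # Airflow 2.x import path

def extract(**_):
    # Placeholder: pull the day's client files (e.g. from S3) into staging.
    print("extracting files")

def transform(**_):
    # Placeholder: validate layouts and apply the required aggregations.
    print("transforming records")

def load(**_):
    # Placeholder: write curated rows to the warehouse (e.g. Snowflake).
    print("loading warehouse tables")

with DAG(
    dag_id="cdp_etl_example",        # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task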
Minimum Qualifications: 
• Bachelor's degree or equivalent
• 2–4 years' experience 
• Excellent verbal & written communications skills
• Self-Starter, highly motivated 
• Analytical mindset 
