Associate engineer, Data platform

Hyderabad, India

Applications have closed

Vialto Partners

Learn how our global mobility solutions can help you with immigration, tax, business travel, remote work, risk management, and compliance needs.


Company Description

At Vialto we create the connections that build global careers and societies, bridging the gap between people and their aspirations. As the world of work continues to evolve, we are bringing together individuals, ideas, and technology to unlock business and human growth.

Job Description

Expertise in integrations with various data sources for ingestion (lakes, warehouses, REST APIs, and files), transformation patterns, and Python

  • Demonstrated experience with transformation, scripting, and integrations with ingestion sources and data warehouse systems.

  • Design, define, and continuously implement source-to-target mappings

  • Experience with the Azure environment (Azure Data Lake and Azure Data Factory) and Snowflake

  • Knowledge of database development and SQL

  • Knowledge of virtual machines is an added advantage.

  • Collaborate with product owners, developers, cloud engineers, other DevOps engineers, and operations to plan, design, test, and deliver pipelines and infrastructure using the Continuous Integration/Continuous Delivery (CI/CD) model.

  • Experience with Azure DevOps and Agile/Kanban practices.

  • Mandatory experience working with data platforms

  • Design and implement enterprise-wide, scalable, highly available, and performant solutions, covering data migration, implementation, assessment, and troubleshooting of various integration patterns

  • Highly self-motivated and directed, with keen attention to detail

  • Develop professional documentation, operating guides, and high-level and low-level design documents

  • Conduct testing and analysis of all data platform components and maintain uptime consistent with business goals and metrics

  • Explore best practices and enable models faster with low-code approaches

  • Must be good at troubleshooting: logging, instrumentation and observability, performance, and security

  • MuleSoft experience is an added advantage.

  • A minimum of 5 years of experience leading data platform design and integrations

  • Data retention, disaster recovery, and caching methods and tools

  • Integration best practices with APIs, analytics platforms, and various products, applying security practices (e.g., authorization, RBAC)

  • DevOps practices, CI/CD Pipelines, and related tools (e.g., Azure DevOps, Ansible, Terraform, Docker)

Qualifications

This role requires a wide variety of strengths and expertise, including:

  • Experience in data engineering and data management

  • Bachelor’s degree in Computer Science or equivalent

  • 3-5 years of enterprise experience, including large-scale global implementations

  • 3+ years of data platform design and integration experience

  • 6-8 years of total IT experience

Additional Information

  • Excellent written and verbal communication skills and the ability to work in large, collaborative, and geographically distributed teams to achieve organizational goals.
  • Develop new skills outside of your comfort zone.
  • Act to resolve issues that prevent the team from working effectively.
  • Work Location: Hyderabad

Tags: Agile Ansible APIs Azure CI/CD Computer Science Data management Data warehouse DevOps Docker Engineering Kanban Pipelines Python Security Snowflake SQL Terraform Testing

Perks/benefits: Startup environment

Region: Asia/Pacific
Country: India
Category: Engineering Jobs
