Automation Engineer - Data Engineering (Hyderabad)

Hyderabad

Astronomer

Take Apache Airflow™ to the next level. From AI and Large Language Models to modern ETL, deliver data on time with the speed and scale your applications demand.


Astronomer designed Astro, a modern data orchestration platform powered by Apache Airflow™. Astro enables companies to place Apache Airflow at the core of their data operations, providing ease of use, scalability, and enterprise-grade security to ensure the reliable delivery of mission-critical data pipelines.

We’re a globally distributed and rapidly growing venture-backed team of learners, innovators, and collaborators. Our mission is to build an enterprise-grade product that makes it easy for data teams at Fortune 500s and startups alike to adopt Apache Airflow. As a member of our team, you will be at the forefront of the industry as we strive to deliver the world's data.

Your background may be unconventional; as long as you have the essential qualifications, we encourage you to apply. While having "bonus" qualifications makes for a strong candidate, Astronomer values diverse experiences. Many of us at Astronomer haven't followed traditional career paths, and we welcome it if yours hasn't either.

About the role

As an Automation Engineer - Data Engineering, you will work from Hyderabad across our customer base to create, improve, and maintain tools, CLIs, and scripts that assist our customers through their journey with Apache Airflow implementations and Astronomer's best-in-class managed service. Our customers range from seed-stage startups to Fortune 500 enterprises.

Hybrid Work Model: For this role you will embrace a flexible hybrid work model with at least 3 days per week at our office in Hyderabad, while delivering a seamless experience that is digitally and physically connected.

What You Get To Do

  • Create and maintain tooling, scripts, and automation to assist our customers through their Astronomer journey and ease the lives of the folks assisting them (a minimal sketch follows this list)

  • Build tooling to migrate customers quickly and easily from other platforms

  • Improve upon our existing stack of tooling and automation
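
For context on the kind of pipelines this tooling supports, below is a minimal sketch of an Apache Airflow DAG written with the TaskFlow API (assuming Airflow 2.x). The DAG id, schedule, and task logic are hypothetical examples for illustration only, not part of any Astronomer product or customer workload.

```python
# Minimal, illustrative Airflow DAG using the TaskFlow API.
# The DAG id, schedule, and task logic are hypothetical examples only.
import pendulum
from airflow.decorators import dag, task


@dag(
    dag_id="example_customer_etl",  # hypothetical name
    schedule="@daily",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,
    tags=["example"],
)
def example_customer_etl():
    @task
    def extract() -> list:
        # Stand-in for pulling records from a source system.
        return [{"id": 1, "value": 10}, {"id": 2, "value": 20}]

    @task
    def transform(records: list) -> list:
        # Simple per-record transformation.
        return [{**r, "value": r["value"] * 2} for r in records]

    @task
    def load(records: list) -> None:
        # A real pipeline would write to a warehouse; here we just log.
        print(f"Loading {len(records)} records")

    load(transform(extract()))


example_customer_etl()
```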

What You Bring To The Role

  • Strong Python experience

  • Strong data engineering familiarity and SQL knowledge

  • 5-8 years of industry experience

  • When you see a problem, you are passionate about finding a fast and elegant solution for it

  • Comfortable working collaboratively, but able to accomplish goals with indirect oversight

  • Communication skills that allow you to know what to say and how to say it regardless of the medium (Slack, email, webinars, pair programming sessions)

  • Ability to explain highly technical issues to a non-technical audience

  • Strong analytical skills with high attention to detail and accuracy

Bonus points if you have:

  • Control-M, Autosys, or Automic experience

  • Some experience working with Apache Airflow in a production environment

  • Experience with related market products such as Cloudera, Databricks, Snowflake, dbt, ETL technologies, and analytics solutions

  • Familiarity with Go, Rust, or ReactJS/JavaScript

  • Excitement about the challenge of delivering an awesome Astronomer experience to hundreds of customers.

At Astronomer, we value diversity. We are an equal opportunity employer: we do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.  Astronomer is a remote-first company.


Tags: Airflow Databricks DataOps Data pipelines dbt Engineering ETL JavaScript Pipelines Python Rust Security Snowflake SQL

Perks/benefits: Flex hours

Region: Asia/Pacific
Country: India
Category: Engineering Jobs
