Staff, Data Engineer- AWS Cloud Ops, L4

Bengaluru, Karnataka, India

Applications have closed

Twilio

Connect with customers on their preferred channels—anywhere in the world. Quickly integrate powerful communication APIs to start building solutions for SMS and WhatsApp messaging, voice, video, and email.


Because you belong at Twilio

The Who, What, Why and Where

Twilio is growing rapidly and seeking a Data Engineer to be a key member of the Enterprise Data & Business Intelligence organization, with a focus on data engineering services based in Bengaluru, India. You will be joining one of the first teams of engineers in our new Bengaluru office, and as one of the early engineers in the Enterprise Data & BI organization in Bengaluru, you will have an opportunity to help define our technical and team culture in India. You will design and build data platforms and services, while managing our data infrastructure in cloud environments that fuels strategic business decisions across Twilio. You will partner with other engineers and product managers to translate data needs into critical information that can be used to implement scalable data platforms and self-service tools. We are looking for someone who is passionate about solving problems using engineering and data, thrives in an evolving environment, brings an enthusiastic and collaborative attitude, and delights in making a difference. As a successful candidate, you must have a deep background in data engineering and a proven record of solving data problems at scale leveraging distributed data systems. You are a self-starter, embody a growth mindset, and can collaborate effectively across the entire Twilio organization.

Who?

Twilio is looking for an exceptional individual who lives Twilio Magic and has a demonstrated track record in conceiving and delivering Data Warehouse solutions at company-wide scale. 

  • 8+ years of data engineering experience in a fast-paced company that delivers software 
  • Knowledge of all phases of software development including requirements analysis, design, coding, testing, debugging, implementation, and support. 
  • A good understanding of designing and building highly scalable data warehouses and data pipelines. Hands-on experience in PL/SQL, Scala, PySpark, and Python.
  • Deep understanding of Apache Spark RDDs and DataFrames, and experience creating Scala/Spark jobs for data transformation and aggregation (a minimal sketch follows this list).
  • Experience building distributed environments using any of Kafka, Spark, Hive, Presto, Hadoop, etc. Deep understanding of the architecture and functioning of distributed database systems such as Snowflake, Presto, or Redshift.
  • Experience working with file and table formats such as Hudi, Parquet, Avro, and ORC for large volumes of data.
  • Hands-on experience with AWS cloud services and designing architectures for sizing, scalability, and performance. AWS certification would be a huge plus.
  • Experience with one or more NoSQL databases such as Cassandra, MongoDB, or DynamoDB is a plus. 
  • Strong understanding of engineering best practices and design principles
  • Experience working in an agile environment with iterative development 
  • Collaborative mindset and ability to work with distributed, cross-functional teams
  • Solid communication skills and the ability to clearly articulate your point of view
  • Bachelor's or Master's degree in Computer Science, or equivalent experience
  • Excellent communication and presentation skills, along with strong analytical and problem-solving skills
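
For context on the kind of Scala/Spark work this role calls for, here is a minimal, hypothetical sketch of a transformation-and-aggregation job; the dataset, S3 paths, and column names are illustrative assumptions only, not Twilio specifics.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.{col, count, to_date}

    // Hypothetical sketch: aggregate daily event counts per account from a
    // Parquet dataset. Paths and column names are illustrative assumptions.
    object DailyEventCounts {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("daily-event-counts")
          .getOrCreate()

        // Read the raw events (assumed S3 location and schema).
        val events = spark.read.parquet("s3://example-bucket/events/")

        // Transform: derive the event date, then aggregate counts per account and day.
        val dailyCounts = events
          .withColumn("event_date", to_date(col("event_timestamp")))
          .groupBy("account_id", "event_date")
          .agg(count("*").as("event_count"))

        // Write the aggregate back out, partitioned by date (assumed output path).
        dailyCounts.write
          .mode("overwrite")
          .partitionBy("event_date")
          .parquet("s3://example-bucket/daily_event_counts/")

        spark.stop()
      }
    }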

Roles and responsibilities

  • Plan, implement, and grow the AWS cloud infrastructure while keeping costs low.
  • Gather technical requirements, assess client capabilities, and analyze findings to offer relevant cloud solutions and adoption strategies.
  • Conduct research, analyze, recommend, and select technical solutions to difficult and complex development and integration issues.
  • Improve performance, automation, and scalability by learning and implementing new tools and methodologies.
  • Work alongside architecture and engineering teams to design and implement scalable software services
  • Ensure necessary system security by using best-in-class cloud security solutions
  • Implement continuous integration/continuous delivery (CI/CD) pipelines when necessary
  • Recommend process and architecture improvements
  • Troubleshoot the system and solve problems across all platform and application domains
  • Oversee pre-production acceptance testing to ensure the high quality of the company's services and products

What?

As a Staff Data Engineer, you will live the Twilio Magic and:

BE AN OWNER 

  • Design and implement data management services for data trust, data compliance, data access, and metadata management in the form of scalable and configurable systems, while clearly articulating the technical rationale behind your design and implementation choices
  • Participate in Agile/Scrum activities, including planning, standups, and retrospectives; provide a point of view on user stories. 

WEAR THE CUSTOMER’S SHOES

  • Partner with data architects, product managers, and other engineers to ensure they have the right information about our services and platforms, while keeping customers happy.
  • Listen to your customers’ challenges, identify opportunities, craft solutions, and deliver the right value at the right time.

WRITE IT DOWN

  • Demonstrate excellent verbal and written communication - ensure that complex ideas, thoughts, and vision can be communicated simply and effectively. You are expected to thrive in a highly collaborative environment.

DRAW THE OWL 

  • You’ll build highly scalable platforms and services that support rapidly growing data needs in Twilio. There’s no instruction book, it’s yours to write. You’ll figure it out, ship it, and iterate. You’ll invent the future, but you won’t wing it.

Why?

Enterprise Data and Business Intelligence is a central organization within Twilio that provides data infrastructure and related services, in the form of a data lake, data warehouse, business intelligence, and data governance, that support long-term growth and sustainability. Our mission is to enable fact-based decision making by providing clean, governed, accurate data in scalable and easy-to-use systems in a timely manner. We play an integral role in shaping the business decisions that enable the company's growth and success, and we are the backbone of Twilio's data-driven culture. 

Twilio is a company that is empowering the world’s developers with modern communication in order to build better applications. Twilio is truly unique; we are a company committed to your growth, your learning, your development, and your entire employee experience. We only win when our employees succeed and we're dedicated to helping you develop your strengths. We have a cultural foundation built on diversity, inclusion, and innovation and we want you and your ideas to thrive at Twilio.

Where?

This position will be located in our office in Bengaluru, India. You will enjoy our office perks: catered meals, snacks, game room, ergonomic desks, massages, bi-weekly All Hands, and more. What you will also get to experience is a company that believes in small teams for maximum impact; seeks well-rounded talent to ensure a full perspective on our customers' experience; understands that this is a marathon, not a sprint; and continuously and purposefully builds an inclusive culture where everyone is able to do and be the best version of themselves.

About Us

Millions of developers around the world have used Twilio to unlock the magic of communications to improve any human experience. Twilio has democratized communications channels like voice, text, chat, video and email by virtualizing the world’s communications infrastructure through APIs that are simple enough for any developer to use, yet robust enough to power the world’s most demanding applications. By making communications a part of every software developer’s toolkit, Twilio is enabling innovators across every industry — from emerging leaders to the world’s largest organizations — to reinvent how companies engage with their customers.

Tags: Agile APIs Avro AWS Business Intelligence Cassandra CI/CD Computer Science Data management Data pipelines DynamoDB Engineering Hadoop Kafka MongoDB NoSQL Parquet Pipelines PySpark Python Redshift Research Scala Scrum Security Snowflake Spark SQL Testing

Perks/benefits: Career development Lunch / meals Snacks / Drinks Startup environment Team events

Region: Asia/Pacific
Country: India
