Data Engineer
Bengaluru, Karnataka, India
Twilio
Connect with customers on their preferred channels—anywhere in the world. Quickly integrate powerful communication APIs to start building solutions for SMS and WhatsApp messaging, voice, video, and email.
Because you belong at Twilio
The Who, What, Why and Where
Twilio is growing rapidly and seeking a Data Engineer to be a key member of the Enterprise Data & Business Intelligence organization, focused on data engineering services and based in Bengaluru, India. You will join one of the first engineering teams in our new Bengaluru office, and as one of the early engineers in the Enterprise Data & BI organization there, you will have the opportunity to help define our technical and team culture in India. You will design and build data platforms and services while managing the cloud-based data infrastructure that fuels strategic business decisions across Twilio. You will partner with other engineers and product managers to translate data needs into scalable data platforms and self-service tools. We are looking for someone who is passionate about solving problems with engineering and data, thrives in an evolving environment, brings an enthusiastic and collaborative attitude, and delights in making a difference. As a successful candidate, you must have a deep background in data engineering and a demonstrable record of solving data problems at scale using distributed data systems. You are a self-starter, embody a growth mindset, and can collaborate successfully across the entire Twilio organization.
Who?
Twilio is looking for an exceptional individual who lives Twilio Magic and has a proven track record in conceiving and delivering Data Warehouse solutions at company-wide scale.
- 6+ years of technical experience supporting highly available data warehouse systems and ETL pipelines.
- Hands-on, demonstrable experience with SQL and NoSQL databases, and a good understanding of building and operating highly scalable data warehouses and data pipelines.
- Hands-on experience with SQL and at least one of Python, Java, or Scala.
- Experience in supporting distributed environments using Spark, Hive, Presto, Hadoop etc.
- Deep understanding of the architecture and workings of distributed database systems such as Snowflake, Presto, or Redshift.
- Experience working with file formats such as Parquet, Avro, and ORC for large volumes of data.
- Experience with one or more NoSQL databases such as Cassandra, MongoDB, DynamoDB is a plus.
- Solid understanding of engineering best practices and design principles.
- Experience working in an agile environment with iterative development.
- Collaborative attitude and ability to work with distributed, multi-functional teams
- Excellent communication skills and the ability to clearly articulate your point of view
- Bachelor's or Master's degree in Computer Science, or equivalent experience.
- Excellent communication and presentation abilities, along with strong analytical and problem-solving skills.
What?
As a Data Engineer, you will live the Twilio Magic and:
Be an owner
- Design and implement data management services for data trust, data compliance, data access, and metadata management in the form of scalable and configurable services, clearly articulating the technical rationale behind your design and implementation choices.
- Participate in Agile/Scrum activities including planning, standups, and retrospectives, and provide your point of view on user stories.
Wear the customer’s shoes
- Partner with data architects, product managers, and other engineers to ensure they have the right information about our services and platforms, keeping customers happy.
- Listen to your customers’ challenges, find opportunities, craft solutions, and deliver the right value at the right time.
Write it down
- Demonstrate excellent verbal and written communication: ensure that complex ideas, thoughts, and vision are communicated simply and effectively. You are expected to thrive in a highly collaborative environment.
Draw the owl
- You’ll build highly scalable platforms and services that support rapidly growing data needs in Twilio. There’s no instruction book, it’s yours to write. You’ll figure it out, ship it, and iterate. You’ll invent the future, but you won’t wing it.
Why?
The Enterprise Data and Business Intelligence organization is a central team within Twilio that provides data infrastructure and related services, including a data lake, data warehouse, business intelligence, and data governance, to support long-term growth and sustainability. Our mission is to enable fact-based decision making by providing clean, governed, accurate data in scalable, easy-to-use systems in a timely manner. We play an integral role in shaping the business decisions that drive the company's growth and success, and we are the backbone of Twilio's data-driven culture.
Twilio is a company that is empowering the world’s developers with modern communication in order to build better applications. Twilio is truly unique; we are a company committed to your growth, your learning, your development, and your entire employee experience. We only win when our employees succeed, and we're dedicated to helping you develop your strengths. We have a cultural foundation built on diversity, inclusion, and innovation, and we want you and your ideas to thrive at Twilio.
Where?
This position will be located in our office in Bengaluru, India. You will enjoy our office perks: catered meals, snacks, a game room, ergonomic desks, massages, bi-weekly All Hands, and more. You will also experience a company that believes in small teams for maximum impact; seeks well-rounded talent to ensure a full perspective on our customers' experience; understands that this is a marathon, not a sprint; and continuously and purposefully builds an inclusive culture where everyone is able to do and be the best version of themselves.
About Us
Millions of developers around the world have used Twilio to unlock the magic of communications to improve any human experience. Twilio has democratized communications channels like voice, text, chat, video and email by virtualizing the world’s communications infrastructure through APIs that are simple enough for any developer to use, yet robust enough to power the world’s most demanding applications. By making communications a part of every software developer’s toolkit, Twilio is enabling innovators across every industry — from emerging leaders to the world’s largest organizations — to reinvent how companies engage with their customers.