Data Engineer (L3)
Bengaluru, Karnataka, India
Twilio
Connect with customers on their preferred channels—anywhere in the world. Quickly integrate powerful communication APIs to start building solutions for SMS and WhatsApp messaging, voice, video, and email.
Join the team as our next Data Engineer (L3)
Who we are & why we’re hiring
Twilio powers real-time business communications and data solutions that help companies and developers worldwide build better applications and customer experiences.
Although we're headquartered in San Francisco, we're on a journey to becoming a globally antiracist company that supports diversity, equity & inclusion wherever we do business. We employ thousands of Twilions worldwide, and we're looking for more builders, creators, and visionaries to help fuel our growth momentum.
About the job
This position fills the critical role of a Data Engineer building data products that deliver actionable intelligence in support of operational excellence.
Twilio is growing rapidly and seeking a Data Engineer to be a key member of the Reliability & Response organization, with a focus on data engineering services, based in Bengaluru, India. You will help us build solutions that deliver actionable intelligence from a number of critical systems, ensuring that Twilio remains the leader in trusted communications. As a successful candidate, you must have a deep background in data engineering and a consistent track record of solving data problems at scale. You are a self-starter, embody a growth mindset, and collaborate effectively across the entire Twilio organization.
Responsibilities
In this role, you’ll:
- BE AN OWNER:
- Design, develop, and maintain data products that perform robust analytics to support Twilio’s product engineering operational data: incidents, deployments, performance, utilization, defects, change failure rate, test data, infrastructure costs.
- Build scalable solutions and self-serve platforms that will provide data, Key Performance Indicators, anomaly alerts, and forecasts to inform business decision making.
- WRITE IT DOWN: Develop and maintain documentation for all assigned systems and projects.
- WEAR THE CUSTOMER’S SHOES:
- Develop high-trust relationships and processes with partner teams and stakeholders to identify and address their requirements.
- Interface with various product development teams to better understand their data, and build data pipelines for ingestion from a wide variety of data sources.
- DRAW THE OWL:
- Participate in the workstream planning process, including inception, technical design, development, testing, and delivery of BI solutions. You’ll figure it out, ship it, and iterate!
- Understand complex architectures and work comfortably with multiple teams.
- PRIORITIZE RUTHLESSLY: Adapt to prioritizing multiple issues in a high-pressure environment.
Qualifications
Not all applicants will have skills that match a job description exactly. Twilio values diverse experiences in other industries, and we encourage everyone who meets the required qualifications to apply. While having “desired” qualifications makes for a strong candidate, we encourage applicants with alternative experiences to also apply. If your career is just starting or hasn't followed a traditional path, don't let that stop you from considering Twilio. We are always looking for people who will bring something new to the table!
Required:
- Bachelor’s/Master’s degree in Computer Science, Engineering, or a related field, or equivalent work experience.
- 3+ years of relevant work experience in a role requiring design and development of highly scalable data warehouses and data pipelines.
- Hands-on experience with Java, SQL, and Python.
- Experience building stream processing applications using Apache Kafka, Apache Flink, or similar frameworks.
- Deep understanding of the architecture and functioning of distributed database systems such as Snowflake, Presto, or Redshift.
- Experience with one or more NoSQL databases such as Cassandra, MongoDB, or DynamoDB is a plus.
- Strong expertise in troubleshooting production issues.
- Excellent problem solving, critical thinking, and communication skills.
- Strong belief in automation over toil.
Desired:
- Familiarity with BI and visualization platforms such as Tableau or Looker.
Location
This role will be based in our Bengaluru office.
Approximately 10% travel is anticipated.
What We Offer
There are many benefits to working at Twilio, including, in addition to competitive pay, generous time off, ample parental and wellness leave, healthcare, a retirement savings program, and much more. Offerings vary by location.
Twilio thinks big. Do you?
We like to solve problems, take initiative, pitch in when needed, and are always up for trying new things. That's why we seek out colleagues who embody our values — something we call Twilio Magic. Additionally, we empower employees to build positive change in their communities by supporting their volunteering and donation efforts.
So, if you're ready to unleash your full potential, do your best work, and be the best version of yourself, apply now!
If this role isn't what you're looking for, please consider other open positions.