Data Engineer L2
Bengaluru, Karnataka, India
Twilio
Connect with customers on their preferred channels—anywhere in the world. Quickly integrate powerful communication APIs to start building solutions for SMS and WhatsApp messaging, voice, video, and email.
Join the team as our next Data Engineer
Who we are & why we’re hiring
Twilio powers real-time business communications and data solutions that help companies and developers worldwide build better applications and customer experiences.
Although we're headquartered in San Francisco, we're on a journey to becoming a globally antiracist company that supports diversity, equity & inclusion wherever we do business. We employ thousands of Twilions worldwide, and we're looking for more builders, creators, and visionaries to help fuel our growth momentum.
About the job
This position is for a Data Engineer who will be a key member of the CDaX (Corporate Data & Analytics & Infrastructure) organization, focused on data engineering services and based in Bengaluru, India. You will join one of the first engineering teams in our new Bengaluru office. As an early engineer, you will design and build data platforms and services while managing our data infrastructure in cloud environments, fueling strategic business decisions across Twilio.
Responsibilities
In this role, you’ll:
- Partner with other engineers and product managers to translate data needs into requirements for scalable data platforms and self-service tools.
- Design and build new data pipelines that handle petabytes of data.
- Solve problems with engineering and data, thrive in an evolving environment, and bring an enthusiastic, collaborative attitude that delights in making a difference.
- Apply a deep background in data engineering and a proven record of solving data problems at scale using distributed data systems.
- Identify and resolve critical production issues and customer requests.
Not all applicants will have skills that match a job description exactly. Twilio values diverse experiences in other industries, and we encourage everyone who meets the required qualifications to apply. While having “desired” qualifications makes for a strong candidate, we encourage applicants with alternative experiences to also apply. If your career is just starting or hasn't followed a traditional path, don't let that stop you from considering Twilio. We are always looking for people who will bring something new to the table!
Required:
- 3+ years of technical experience supporting highly available data warehouse systems and ETL pipelines.
- Strong hands-on experience with SQL/NoSQL databases and a good understanding of troubleshooting highly scalable data warehouses and data pipelines.
- Hands-on experience with the SQL, Python, and Scala programming languages.
- Experience supporting distributed environments using Spark, Hive, Presto, Hadoop, etc.
- Deep understanding of the architecture and operation of at least one distributed database system such as Snowflake, Presto, or Redshift.
- Experience working with file formats such as Parquet, Avro, and ORC for large volumes of data.
- Experience with one or more NoSQL databases such as Cassandra, MongoDB, or DynamoDB is a plus.
- Strong understanding of engineering best practices and design principles.
- Experience working in an agile environment with iterative development.
Desired:
- Excellent communication skills and the ability to clearly articulate your point of view.
- Bachelor's or Master's degree in Computer Science, or equivalent experience.
- Strong analytical, problem-solving, and presentation skills.
Location
This role will be based in our Bengaluru, India office.
What We Offer
There are many benefits to working at Twilio. In addition to competitive pay, these include generous time off, ample parental and wellness leave, healthcare, a retirement savings program, and much more. Offerings vary by location.
Twilio thinks big. Do you?
We like to solve problems, take initiative, pitch in when needed, and are always up for trying new things. That's why we seek out colleagues who embody our values — something we call Twilio Magic. Additionally, we empower employees to build positive change in their communities by supporting their volunteering and donation efforts.
So, if you're ready to unleash your full potential, do your best work, and be the best version of yourself, apply now!
If this role isn't what you're looking for, please consider other open positions.