Senior Data Engineer

Remote - US

Applications have closed

Twilio

Connect with customers on their preferred channels—anywhere in the world. Quickly integrate powerful communication APIs to start building solutions for SMS and WhatsApp messaging, voice, video, and email.

See yourself at Twilio

Join the team as our next Staff Software Engineer, CDAX

Who we are & why we’re hiring

Twilio powers real-time business communications and data solutions that help companies and developers worldwide build better applications and customer experiences.

Although we're headquartered in San Francisco, we're on a journey to becoming a globally antiracist company that supports diversity, equity & inclusion wherever we do business. We employ thousands of Twilions worldwide, and we're looking for more builders, creators, and visionaries to help fuel our growth momentum.

About the job

Twilio is seeking a Staff Software Engineer for the CDAX (Corporate Data Analytics & Infrastructure) team with industry experience and a deep understanding of data usage and life cycles across a SaaS organization. We are looking for someone to build the data processing and integration layer that extracts raw real-time data from our internal platforms and third-party systems, then stores, summarizes, and presents it for analysis across multiple business teams in support of both tactical and strategic initiatives, setting up and maintaining the underlying infrastructure along the way.

Responsibilities

In this role, you’ll:

  • Build and launch real-time data processing pipelines supporting various insights projects
  • Build and launch a real-time, scalable third-party integration framework
  • Implement an automated data quality framework ensuring the highest standards of data accuracy
  • Design and scale our self-service analytics platform supporting company-wide initiatives
  • Work across multiple teams in high-visibility roles and own solutions end to end
  • Organize and mine massive data sets of both structured and unstructured data, implementing appropriate infrastructure, tools, and systems to house data for reporting and analytics

Qualifications 

Not all applicants will have skills that match a job description exactly. Twilio values diverse experiences in other industries, and we encourage everyone who meets the required qualifications to apply. While having "desired" qualifications makes for a strong candidate, we encourage applicants with alternative experiences to also apply. If your career is just starting or hasn't followed a traditional path, don't let that stop you from considering Twilio. We are always looking for people who will bring something new to the table!

Required:

  • 7+ years of diverse experience building data processing frameworks and services
  • Experience with distributed analytics platforms such as Hive and Presto
  • Experience with real-time data processing technologies such as Kafka and Spark Streaming
  • Experience with NoSQL databases such as Cassandra and DynamoDB
  • Experience building scalable services and related frameworks such as Akka
  • Experience with DAG-based open-source workflow management solutions such as Airflow
  • Expert coding skills in multiple languages, including Scala, Java, and Python
  • Experience processing various file formats such as Parquet, Avro, and JSON

Location 

This role will be remote or based in Denver, CO.

What We Offer

There are many benefits to working at Twilio, including competitive pay, generous time off, ample parental and wellness leave, healthcare, a retirement savings program, and much more. Offerings vary by location.

Twilio thinks big. Do you?

We like to solve problems, take initiative, pitch in when needed, and are always up for trying new things. That's why we seek out colleagues who embody our values — something we call Twilio Magic. Additionally, we empower employees to build positive change in their communities by supporting their volunteering and donation efforts.

So, if you're ready to unleash your full potential, do your best work, and be the best version of yourself, apply now!

If this role isn't what you're looking for, please consider other open positions.

*Please note this role is open to candidates outside of Colorado as well. The information below is provided for those hired in Colorado only.

*If you are a Colorado applicant:

  • The estimated pay range for this role, based in Colorado, is $132,928 - $166,160
  • Non-Sales: Additionally, this role is eligible to participate in Twilio's equity plan.

The successful candidate's starting salary will be determined based on permissible, non-discriminatory factors such as skills, experience, and geographic location within the state. This role is also eligible for the following benefits: health care insurance, a 401(k) retirement account, paid sick time, paid personal time off, and paid parental leave.

 

Tags: Airflow Avro Cassandra Data Analytics DynamoDB JSON Kafka NoSQL Open Source Parquet Pipelines Python Scala Spark Streaming Unstructured data

Perks/benefits: 401(k) matching Career development Competitive pay Equity Health care Parental leave Startup environment Wellness

Regions: Remote/Anywhere North America
Country: United States
Category: Engineering Jobs
