Senior Data Engineer, Telemetry
San Francisco, CA or Washington, DC
Mapbox is building a live location platform. Our maps represent the ever-evolving world, accessing, aggregating, and adapting anonymous data from millions of sensors and phones in real-time. Mapbox has the exciting opportunity to power devices and products across the next frontier in location-based data. Whether you’re watching the delivery of your grocery order on Instacart, tracking your personal best mile on Strava, sending snaps on Snapchat, monitoring your gas budget on Metromile, or checking today’s forecast on The Weather Channel, Mapbox is the location and maps within those apps. We’re changing how people move by live-mapping the world.
What We Do
The Telemetry team collects location telemetry from mobile Mapbox SDKs and uses it to improve our map, directions, and travel times, as well as to provide insights into human mobility patterns. To do so, we build and deploy pipelines to process, anonymize, and analyze billions of location probes per day. The customers of our unique, privacy-centric datasets on traffic patterns and human activity range from major corporations to local governments and NGOs, but we are also committed to making data accessible to smaller developers, and we work on a number of community-driven and open-source projects. We're a diverse team of backend engineers, data engineers, and data scientists located around the world, but we work closely together, support each other's growth and learning, and share an enthusiasm for deriving interesting and useful insights out of geospatial data.
To see examples of our work, check out some of our blog posts from the past year:
- Tracking how COVID changed mobility patterns throughout the world
- Shopping and sunbathing: how America responds to quarantine
- Tracking retail recovery using Movement data
- Using Movement data to analyze airport traffic
- How location data becomes traffic data
What You’ll Do
You will be responsible for architecting, building, and maintaining production systems for Mapbox’s Movement product. Your technical contributions will include:
- Designing and building new data pipelines, as well as optimizing existing data pipelines in Python and Spark/PySpark.
- Building tools for automated quality assurance and data exploration.
- Designing and building APIs for delivering data to customers.
- Leveraging geospatial algorithms to optimize pipelines, as well as creating automated tuning and outlier detection algorithms.
Outside of your direct technical contributions you will:
- Collaborate with other teams within Mapbox to expand use of Movement data throughout the company.
- Mentor colleagues, especially with respect to Spark optimization and best practices.
- Participate in code and design reviews.
There are also ample opportunities to act as an evangelist in the larger community by writing blog posts, engaging with our community partners and journalists to help them make the best use of our data, and collaborating with our academic partners on white papers.
What We Believe are Important Traits for This Role
- Ability to write approachable, performant code with comprehensive tests and thorough documentation.
- Excellent communication skills, both written and verbal.
- Experience building and maintaining data pipelines with large datasets.
- Experience designing and building systems at scale using AWS infrastructure.
- A highly autonomous and self-driven approach.
- A solid engineering foundation with an interest in geospatial data and big data analysis.
- Comfort being vulnerable when you don’t know the answer, and knowing how to ask for help.
- Ability to explain how complex systems work to engineers as well as to customers and business stakeholders.
- Experience with, and enthusiasm for, mentoring teammates, providing technical feedback, and sharing knowledge with others.
You will need strong skills in at least two of the following areas, which form the foundation of our technical stack. We don’t expect anyone to be an expert in all areas. However, familiarity with more than two is a bonus, and you should be ready to learn any of them. In the bigger picture, curiosity about spatial data and our planet is highly encouraged, as is mentoring aptitude or experience.
- Python (NumPy, pandas)
- Experience with distributed processing pipelines (Hadoop, Spark, Airflow, Dask)
- AWS infrastructure (e.g., S3, EMR, ECS, Athena, Lambda, DynamoDB, IAM)
- Building secure and stable APIs
- Experience working with very large datasets: statistical analysis, data quality control, and storage optimization
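To give a flavor of the kind of work this stack supports, here is a minimal, hypothetical sketch (not Mapbox code) of aggregating anonymized location probes into coarse grid-cell activity counts with pandas. The column names, the 0.01-degree cell size, and the minimum-count threshold are all assumptions made for this example:

```python
# Illustrative sketch only: snap anonymized location probes to a coarse
# lat/lon grid and count probes per cell, suppressing sparse cells in the
# spirit of privacy-centric aggregation. All names here are hypothetical.
import numpy as np
import pandas as pd

def probes_to_cell_counts(probes: pd.DataFrame, cell_deg: float = 0.01) -> pd.DataFrame:
    """Assign each probe to an integer grid cell and count probes per cell."""
    indexed = probes.assign(
        cell_y=np.floor(probes["lat"] / cell_deg).astype(int),
        cell_x=np.floor(probes["lon"] / cell_deg).astype(int),
    )
    counts = (
        indexed.groupby(["cell_y", "cell_x"], as_index=False)
        .size()
        .rename(columns={"size": "probe_count"})
    )
    # Drop sparsely observed cells to reduce re-identification risk.
    return counts[counts["probe_count"] >= 2].reset_index(drop=True)

# Four example probes: three cluster in one cell, one falls elsewhere.
probes = pd.DataFrame({
    "lat": [37.7750, 37.7751, 37.7899, 37.7752],
    "lon": [-122.4194, -122.4195, -122.4000, -122.4196],
})
print(probes_to_cell_counts(probes))
```

In production this kind of logic would run as a distributed PySpark job rather than on a single pandas DataFrame, but the shape of the transformation is the same.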
What We Value
In addition to our core values:
- We value creative individuals who dig into problems and opportunities.
- We believe in individuals being their whole selves at work. We commit to this through supportive health care, parental leave, flexibility for the things that come up in life, and innovating on how we think about supporting our people.
- We emphasize an environment of teaching and learning to equip employees with the tools needed to be successful in their function and the company.
- We strongly believe in the value of growing a diverse team and encourage people of all backgrounds, genders, ethnicities, abilities, and sexual orientations to apply.