Data Engineer

San Francisco, CA or Austin, TX

Before you read on, take a look around you. Chances are, pretty much everything you see has been shipped, often multiple times, to get where it is. E-commerce is exploding, and with it, parcel shipping is becoming a meaningful factor in a business's ability to succeed. Creating a compelling shipping experience for customers is hard but necessary.
At Shippo, our goal is to level the playing field by providing businesses access to shipping tools and terms that would not be available to them otherwise.
Shippo lowers the barriers to shipping for businesses around the world. As free and fast shipping becomes the norm, better access to shipping is a competitive advantage for businesses. Through Shippo, e-commerce businesses, marketplaces, and platforms are able to connect to multiple shipping carriers around the world from one API and dashboard. Businesses can get shipping rates, print labels, automate international documents, track shipments, and facilitate returns.
Internally, we think of Shippo as the building blocks of shipping. Shippos are a diverse set of individuals. We look for cultural and skill fit in every new person. Join us to build the foundations of something great, roll up your sleeves, and get important work done every day. Founded in 2013, we are a proud team based out of San Francisco. Shippo’s investors include D1 Capital Partners, Bessemer Venture Partners, Union Square Ventures, Uncork Capital, VersionOne Ventures, FundersClub, and others.
We are seeking a new Data Engineer! You will be responsible for building systems that collect and process events at massive scale to gain operational and business insight into the performance and optimization of shipping services. The data engineer will work closely with product, engineering, and business leads to generate customer-facing and internal dashboards, ad hoc reports, and models that provide insights and affect platform behavior. This will also include building and maintaining the infrastructure to collect and transform raw data.

Responsibilities and Impact

  • Design, build, scale, and evolve our large-scale data infrastructure and processing workflows to support our business intelligence, data analytics, and data science processes
  • Build robust, efficient, and reliable data pipelines and data integrations across diverse data sources and transformation techniques, and ensure the consistency and availability of data insights
  • Collaborate with product, engineering, and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-informed decision making across the organization
  • Articulate and present findings and recommendations at different levels, with a clear bias toward impactful learning and results
  • Develop clean, well-designed, reusable, scalable code following TDD practices
  • Champion the engineering organization’s adoption and ongoing use of the data infrastructure
  • Embody Shippo’s cultural values in your everyday work and interactions


Requirements

  • 3+ years of experience in software development
  • Experience designing, building, and maintaining data pipeline systems
  • Coding experience in server-side programming languages (e.g., Python, Scala, Go, Java) as well as database languages (SQL)
  • Experience with data technologies and concepts such as Airflow, Kafka, Hadoop, Hive, Spark, MapReduce, RDBMS, NoSQL, and Columnar databases
  • Exceptional verbal, written, and interpersonal communication skills
  • Deep understanding of customer needs and passion for customer success
  • Exhibit core behaviors focused on craftsmanship, continuous improvement, and team success
  • BS or MS degree in Computer Science or equivalent experience

Bonus Points

  • Experience implementing ETL processes
  • Experience with Big Data frameworks such as Hadoop, MapReduce and associated tools
  • Experience building stream-processing systems using solutions such as Kinesis Streams, Kafka, or Spark Streaming
  • Experience integrating with APIs that use REST, gRPC, SOAP, and other technologies
  • Experience with cloud environments and DevOps tools; working experience with AWS and its associated products a plus
  • Experience with machine learning a plus

Benefits, Perks, and More

  • Medical, dental, and vision healthcare coverage for you and your dependents. Pet coverage is also available!
  • Flexible PTO and work arrangement policies
  • 3 VTO days for ShippoCares volunteering events
  • $2,500 annual learning stipend for your personal and professional growth
  • Charity donation match up to $100
  • Free daily catered lunch, drinks, and snacks
  • Fun team events outside of work hours: happy hours, “escape room” adventures, hikes, and more!