Data Engineer II

Denver, Colorado, United States


About Parsyl

Parsyl is a data-powered insurance and risk management provider for essential supply chains in food and health. Our mission is to end the days of “ship and pray” and build a world where everyone, everywhere can trust the quality of the goods they rely on, from the foods they eat to the medicines they need.

We are working to achieve this by combining smart sensors, data insights, and data-driven insurance to improve risk resiliency and safeguard goods in transit and storage. This unique combination of IoT and insurance means our customers can use data to make supply chains more transparent, safe, and sustainable: better for people and the planet. Parsyl was recognized as one of the "Best Startups to Work for in Colorado" by Built In Colorado in 2022 and 2023.

What You Bring to Parsyl

We are creating a mission-driven team that aims to transform the essential supply chain industry. Data is at the core of everything we do at Parsyl. You will work with the Data Science team to ensure it has the data needed to build products that ultimately improve the quality of essential goods throughout the global supply chain. You will report to the Director of Data and Insights in this role.

You are a good fit for this position if you are confident being an early member of the data engineering function within a data-first organization. You are equally comfortable building data pipelines and communicating technical tradeoffs to non-technical audiences. You are self-motivated and thrive in a fast-paced, ambiguous environment.

In this role, you’ll get to:

  • Maintain and improve Parsyl’s data warehouse and data tools to accelerate Data Science and Data Analyst research and development.
  • Develop, deploy, and monitor data pipelines from transactional and IoT data, SaaS tools, and partner or purchased data.
  • Ensure the quality, reliability, and availability of data for consumption by the data team as well as the broader business.
  • Coordinate with the production backend team to manage data pipelines and integrations efficiently; understand and advocate for the data team's needs, liaising between the data, backend, and product teams.
  • Collaborate cross-functionally to understand and support Data Scientist, Analyst, Product, and Business users' data needs.
  • Support all aspects of data governance, including access, security, and master data management.

Requirements

Parsyl is committed to cultivating a diverse pool of candidates interested in joining a mission-driven company. We are building an inclusive team at Parsyl that welcomes different perspectives and creative ideas in order to best achieve our mission of ending the days of "ship and pray" and serving our customers.

What we're looking for:

  • 3+ years of experience in data engineering and data technologies
  • 1+ year of experience architecting cloud data solutions, preferably on AWS
  • Experience with batch and streaming frameworks, such as Hadoop, Spark or Flink
  • Experience with infrastructure as code, preferably Terraform
  • Experience with data pipeline tools, such as dbt, Airflow, or Databricks Workflows
  • Strong SQL and relational data modeling experience
  • Experience with at least one general-purpose programming language, such as Go, Python, or Scala
  • Eagerness to work in a fast-paced startup environment
  • Excellent communication skills
  • Strong project management skills
  • Parsyl requires all employees to be fully vaccinated against Covid-19, unless they qualify for a religious or medical accommodation.

It's a bonus if you also have experience with:

  • Working closely with Data Analysts and Scientists
  • BI tools, preferably Looker
  • Databricks and Apache Spark
  • IoT data
  • Insurance, Global Health, or supply chain industries

Benefits

Market competitive salary with an anticipated base compensation range of $125,000 - $145,000. Actual salaries will vary depending on a candidate’s experience, qualifications, and skills.

Additional Financial Benefits include:

  • Stock options
  • 401(k) including company match

Health and Wellness Benefits include:

  • Medical, dental, and vision insurance effective on your start date (100% of medical, dental, and vision premiums for employees and 75% of premiums for dependents based on a solid, mid-tier plan)
  • Six weeks of fully paid family and/or medical leave
  • Wellness benefit of $100 per month

Time-off and Vacation Benefits include:

  • Unlimited vacation policy
  • Company breaks: quarterly mental health days plus summer and winter breaks
  • Paid sabbatical program

Additional Work Environment Benefits include:

  • Significant career growth opportunities and continuing education stipend
  • Flexible work environment based on role requirements
  • Commuter benefit of $100 per month for public transportation or parking costs
  • Home office set-up stipend of up to $1000
  • Relocation assistance available (Denver, CO candidates or candidates willing to relocate will be considered)


