Data Engineer

Remote Canada

Applications have closed

1Password

A password manager, digital vault, form filler, and secure digital wallet. Manage everything in one secure place – 1Password remembers all your passwords and sensitive information, so you don’t have to.


Over 80,000 businesses and millions of people use 1Password to protect their most important information. We’re a kind, curious, and customer-focused team on a mission to build the world's most-loved password manager and give people more control over their data.
We’re looking for a Data Engineer to help take our data reporting and infrastructure to the next level. You’ll work with the Executive, Product, Marketing, Sales, Sales Engineering, Finance, and Customer Success teams every day – in short, your work will impact the whole company.  You’ll be part of our first team dedicated to Data and have a critical role in shaping our data foundation.  
As a Data Engineer, you’ll contribute to a variety of projects that range from designing robust and fully automated ETL processes to building tools for improving company-wide productivity with data.  
You have a passion for designing, implementing, and operating stable, scalable, and efficient solutions that move data from production systems into the data warehouse. You are curious about all the nuances of the source data systems, and you always make an effort to add data quality checks and improve documentation.
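As a rough illustration of the kind of data quality check described above, here is a minimal sketch in Python with pandas; the table, column, and plan names are hypothetical and stand in for whatever the real source systems contain.

    import pandas as pd

    def check_subscriptions(df: pd.DataFrame) -> None:
        """Assert basic quality rules on a hypothetical subscriptions extract."""
        # The primary key must be present and unique.
        assert df["subscription_id"].notna().all(), "null subscription_id found"
        assert not df["subscription_id"].duplicated().any(), "duplicate subscription_id found"
        # Monetary amounts should never be negative.
        assert (df["amount_usd"] >= 0).all(), "negative amount_usd found"
        # Referential sanity: every row should map to a known (hypothetical) plan.
        known_plans = {"individual", "families", "teams", "business"}
        assert set(df["plan"].unique()) <= known_plans, "unexpected plan value found"

    # Example usage with a tiny in-memory frame:
    sample = pd.DataFrame(
        {"subscription_id": [1, 2], "amount_usd": [2.99, 7.99], "plan": ["individual", "teams"]}
    )
    check_subscriptions(sample)

In a production pipeline, checks like these would typically live in the transformation layer (for example as dbt tests) rather than in ad hoc scripts.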
New analytics technologies are emerging every day and we’re excited about the impact they’ll have – we hope you share our enthusiasm!
This is a remote opportunity within Canada.

What we're looking for:

  • Proficiency in Python and SQL
  • Proficiency with dbt or another data transformation tool
  • 5+ years of experience with at least one relational database (MySQL, PostgreSQL, Oracle, etc.)
  • 3+ years of experience with Data Warehousing or Data Lake technologies
  • 3+ years of experience with large-scale data pipelines and ETL tooling such as Spark/Dask, AWS Kinesis/Kafka, and Airflow/Prefect (see the pipeline sketch after this list)
  • The ability to maintain confidentiality of sensitive customer data.
  • Experience developing data-pipelines to and from CRM and marketing tools (Salesforce, Marketo, Intercom, etc.).
  • Experience with BI tools (We use Looker)
  • A team player with a solution-oriented attitude and both the technical and soft skills to get things done.
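
To make the pipeline and orchestration requirements above more concrete, here is a minimal sketch of an Airflow DAG that wires extract, transform, and load steps together. The DAG name, schedule, and task bodies are hypothetical placeholders, not a description of any actual 1Password pipeline.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        # Hypothetical: pull new rows from a production database replica.
        print("extracting from the production database")

    def transform():
        # Hypothetical: clean and model the rows (in practice often a dbt run).
        print("transforming the extracted data")

    def load():
        # Hypothetical: write the modelled rows into the data warehouse.
        print("loading into the warehouse")

    with DAG(
        dag_id="example_etl",  # hypothetical DAG name
        start_date=datetime(2022, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)

        # Run the three steps in sequence once per day.
        extract_task >> transform_task >> load_task

Real pipelines would replace the print statements with connections to the relevant source systems and warehouse, and would add the kind of data quality checks sketched earlier.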

Bonus points for:

  • Experience with AWS, Terraform & Kubernetes
  • Experience with Event Sourcing.
  • Experience working with data in Python (pandas, Dask, NumPy, etc.).
  • Experience with Distributed Data Technologies.
  • You love to write & produce great documentation.

What you can expect:

  • Interface with other engineers to extract, transform, and load (ETL) data from a wide variety of in-house and third-party data sources.
  • Ensure data consistency across both production and analytical databases. You’ll own the integrity of our data end to end, and the company will make high-impact decisions based on this data.
  • Architect and build a data warehouse to provide timely data to a variety of third-party applications (Salesforce, Marketo, etc).
  • Design and build tools that make our data pipelines and data surfacing more reliable and easier to use.
  • Work closely with Application Engineers to roll out new tools and features.
  • Triage, identify, and fix scaling challenges.
  • Collaborate with internal data customers to gather requirements.
  • Help develop our data engineering function in areas of data architecture, business intuition, and insight.
When we say bring your whole self to work, we mean it. You’ll join a diverse and inclusive community, built on trust, support, and respect. Be yourself, find your people, and share the things you love. As we continue to build our global team, we welcome all individuals and do not discriminate on the basis of gender identity, race, ethnicity, disability, sexual orientation, education, languages spoken, or veteran status. Accommodation is available upon request at any point during the recruitment process; should you require any, please let us know.

Tags: Airflow AWS Data pipelines Data Warehousing Engineering ETL Finance Kafka Kinesis Kubernetes Looker MySQL NumPy Oracle Pandas Pipelines PostgreSQL Python Spark SQL Terraform

Regions: Remote/Anywhere North America
Country: Canada
Category: Engineering Jobs
