Data Engineer II

Canada

Applications have closed

HashiCorp

HashiCorp helps organizations automate multi-cloud and hybrid environments with Infrastructure Lifecycle Management and Security Lifecycle Management.


About the team

Our mission as a Data Analytics & Engineering organization is to enable HashiCorp to leverage data as a strategic asset by providing reliable, scalable, and efficient data solutions. Our ultimate goal is to empower our stakeholders to make informed, data-driven decisions and achieve critical business objectives. We are seeking a mid-level engineer to join our team!

In this role you can expect to

  • Secure financial data in the data warehouse and retrieve data from financial sources
  • Build analytics and automate transformations to support stakeholders
  • Oversee and govern the expansion of the existing data architecture and optimize data query performance using best practices, working both independently and collaboratively
  • Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity
  • Collaborate with analytics and business teams to improve the data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization
  • Implement processes and systems to monitor data quality, ensuring production data is always accurate and available for the key stakeholders and business processes that depend on it
  • Write unit and integration tests, contribute to the engineering wiki, and document your work
  • Perform the data analysis required to troubleshoot data-related issues and assist in their resolution
  • Design data integrations and a data quality framework
  • Design and evaluate open source and vendor tools for data lineage
  • Work closely with all business units and engineering teams to develop a strategy for the long-term data platform architecture
  • Develop best practices for data structures to ensure consistency within the system

You may be a good fit for our team if you have

  • Experience working with financial data sets
  • Experience in developing and deploying data pipelines, preferably in the Cloud
  • Experience with Snowflake: Snowflake SQL, Snowpipe, Streams, Stored Procedures, Tasks, hashing, Row-Level Security, Time Travel, etc.
  • Hands-on experience with Snowpark, including app development with Snowpark and Streamlit
  • Strong experience with ETL and ELT data pipelines and related concepts in pure SQL, such as SCD dimensions and delta processing
  • Experience working with AWS cloud services: S3, Lambda, Glue, Athena, IAM, CloudWatch
  • Hands-on experience in RESTful API development and maintenance with cloud technologies (e.g., AWS API Gateway, AWS Lambda)
  • Experience creating pipelines for real-time and near-real-time integration, working with different data sources: flat files, XML, JSON, Avro files, and databases
  • Experience in Python or Go, writing maintainable, reusable, and complex functions for backend data processing; front-end development with Python is nice to have but not required
  • Strong written and oral communication skills with the ability to synthesize, simplify and explain complex problems to different audiences
  • Bachelor's degree in computer engineering, computer science, or a related field

#LI-Remote

 

Individual pay within the range will be determined based on job-related factors such as skills, experience, and education or training.

The base pay range for this role is $110,500–$130,000 USD.

Tags: APIs Architecture Athena Avro AWS Business Intelligence Computer Science Data analysis Data Analytics Data pipelines Data quality Data warehouse ELT Engineering ETL JSON Lambda Open Source Pipelines Python Security Snowflake SQL XML

Regions: Remote/Anywhere North America
Country: Canada
Category: Engineering Jobs
