Sr. Data Engineer

United States (Remote)

Applications have closed

HashiCorp

HashiCorp delivers consistent workflows to provision, secure, connect, and run any infrastructure for any application.


About HashiCorp

HashiCorp is a fast-growing startup that solves development, operations, and security challenges in infrastructure so organizations can focus on business-critical tasks. We build products to give organizations a consistent way to manage their move to cloud-based IT infrastructures for running their applications. Our products enable companies large and small to mix and match AWS, Microsoft Azure, Google Cloud, and other clouds as well as on-premises environments, easing their ability to deliver new applications for their business.

About the role

Reporting to the Head of Data Engineering, this role will help lead HashiCorp’s IT data engineering effort across a wide range of data systems and tools, with the primary goal of delivering data for deeper integration, enhanced capabilities, and impactful analytics.

This role will work closely with the data integration team, product engineering, and data analysts to deliver end-to-end optimization and support of the data life cycle.

Key projects include the creation and support of an enterprise data warehouse, data preparation for prescriptive and predictive modeling, data analytics for process improvement, and data acquisition for future use cases, among others.

In this role you can expect to:

  • Lead, design, and develop secure, scalable, high-performance, reliable, and cost-effective data platform software and services in a multi-cluster environment.
  • Provide technical leadership in data engineering practices. Partner with architecture, InfoSec, and development teams across the enterprise to build the strong relationships and trust needed to prioritize and deliver projects on time and with a high degree of quality.
  • Establish patterns for reuse, standards, and best practices for data engineering and analytics so that data timeliness and quality are maximized.
  • Partner with audit, compliance, and governance leaders to ensure data is secured and conforms to the guidelines set by the organization and the industry.
  • Partner with cross-functional teams to deliver large-scale strategic projects as well as ongoing operational activities related to the big data platform. Interface with cross-functional leaders and represent points of view to drive measurable success.
  • Drive meetings and lead discussions. Prioritize projects across the team and allocate resources to meet business and team goals.

You may be a good fit for our team if you have:

  • Bachelor's or Master's degree in computer engineering, computer science, or a related field
  • 7+ years of proven experience developing and deploying data pipelines, preferably in the cloud
  • 5+ years of proven experience building data warehouse platforms, including star and snowflake schemas and slowly changing dimensions
  • 4+ years of strong experience with SQL and stored procedures
  • 4+ years of experience working with public cloud platforms such as AWS, Azure, or GCP
  • Expertise in Snowflake architecture and setting up a full environment
  • 2+ years of proven expertise building pipelines for real-time and near-real-time integration across different data sources: flat files, XML, JSON, Avro, and databases
  • 3+ years of experience with at least one programming language such as Python, Java, or Scala
  • Knowledge of big data platforms and applications is a plus
  • Knowledge of exception handling, automated reprocessing, and reconciliation
  • Passion for data quality, with the ability to build these capabilities into deliverables
  • Prior use of big data components and the ability to assess their fit for a business case
  • Proficiency with techniques for handling slowly changing dimensions
  • Knowledge of Jenkins for continuous integration and end-to-end automation of application builds and deployments
  • Ability to integrate into a project team environment and contribute to project planning activities
  • Experience developing implementation plans and schedules and preparing job documentation according to business requirements
  • Ability to turn ambiguous, complex situations into clear, measurable plans
  • Proven ability to work with people across the organization, with skill at managing cross-functional relationships and communicating with leadership across multiple organizations
  • Strong written and oral communication skills, with the ability to synthesize, simplify, and explain complex problems to different audiences
  • Experience with business-critical applications such as Salesforce, NetSuite, Anaplan, Marketo, and other marketing, sales, and SaaS platforms

HashiCorp embraces diversity and equal opportunity. We are committed to building a team that represents a variety of backgrounds, perspectives, and skills. We believe the more inclusive we are, the better our company will be.

For more information regarding how HashiCorp collects, uses, and manages personal information, please review our Privacy Policy.

 
