Security Data Engineer

United States - California (Remote)

HashiCorp

HashiCorp delivers consistent workflows to provision, secure, connect, and run any infrastructure for any application.

About Us

HashiCorp is a fast-growing startup that solves development, operations, and security challenges in infrastructure so organizations can focus on business-critical tasks. We build tools that ease these challenges by presenting solutions that span the gaps. Our tools manage both physical and virtual machines, Windows and Linux, SaaS and IaaS, and more. Our open source software is used by millions of users to provision, secure, connect, and run any infrastructure for any application. The Global 2000 uses our enterprise software to accelerate application delivery and drive innovation through software.

We're looking for talented Data Engineers to join our Threat Detection and Response (TDR) team. This team helps defend HashiCorp by enhancing strategic detection, response, and prevention patterns across all of our products and the enterprise. This person will be responsible for expanding and maturing our approach to delivering visibility across all major cloud providers, ensuring we have an accurate record of actions performed across each layer of our technology stacks. This person will also be responsible for the continued improvement of our asset management coverage, ensuring we always have an accurate inventory when investigating security events.

As a member of our Security team, you'll be responsible for ensuring security best practices are implemented across our multi-cloud environment. You will partner with engineering and other stakeholders to define and drive secure-by-default environments supporting our products and the enterprise. We're heavily invested in tooling and automation, and the ability to continually improve these areas will be key to success as we scale our environments to meet customer demand.

Engineering at HashiCorp is largely a remote team. While prior experience working remotely isn't required, we are looking for team members who perform well given a high level of independence and autonomy.

HashiCorp embraces diversity and equal opportunity. We are committed to building a team that represents a variety of backgrounds, perspectives, and skills. We believe the more inclusive we are, the better our company will be.

You may be a good fit if you have:

  • 2+ years in an engineering role focused on large-scale data collection in the cloud, using cloud-native tooling
  • Working knowledge of batch or streaming data processing pipelines
    • Collect, normalize, tag, enrich
    • Windowing and time-series transformation
  • Working knowledge of information retrieval patterns and query workload optimization
    • Developing aggregates, views, summaries, and indices to accelerate access to data
    • Profiling query workloads using query planner output or other diagnostic tooling to identify performance bottlenecks
    • Profiling resource consumption to optimize expenditure on storage and transit
    • Planning, dispatching, and monitoring query workload to ensure on-time delivery of information with optimal use of resources
  • Experience working with multiple data query models 
    • Relational, key-value, graph, document, full-text search
    • Maintaining and evolving shared query content through source code management practices
  • Natural curiosity about and interest in the Threat Detection, Incident Response, Fraud, and/or Threat Intelligence problem spaces, and a desire to develop these skill areas while serving in a development-focused role
  • Experience taking a periodic on-call rotation in a distributed team
  • Publicly released tools or modules, or open source contributions

You have experience with some or all of these:

  • Python or Go, or experience with other languages and a willingness to learn
  • Terraform, Vault, Packer
  • AWS, GCP, Azure
  • AWS EC2, Lambda, Step Functions, ECR/ECS/EKS, S3
  • Logging infrastructure and ETL pipelines - Fluentd, Logstash, Kafka, Kinesis, or similar
  • CI/CD - building pipelines involving Jenkins, CircleCI, GitHub Actions, etc.
  • Solid foundation in Linux and exposure to Linux in cloud provider environments


Note to Candidates: This is a list of items we think would define a successful candidate, but we encourage you to apply if you feel you are a great match.
