DevOps Data Engineer

Pleasanton, United States

Full Time | Senior-level / Expert
OpenVPN


OpenVPN Inc. is seeking a DevOps Data Engineer who loves the numbers.


This position involves:

  • Caring for our large horizontally scaled database clusters, processes, and custom data pipelines
  • Creating queries and ETLs and improving poorly written ones
  • Optimizing the clusters to run as efficiently as possible while maintaining data consistency
  • Helping data producers and consumers move their data efficiently and produce the results they want
  • Contributing to the planning of future growth
  • Managing revision control and migrating data forward through new versions of software and architectures
  • Implementing our technologies in cloud environments (AWS, Azure, GCP) and on bare metal
  • Dreaming about the numbers and ways to improve the systems when asleep
  • Implementing solutions that are good enough now, while planning for better solutions in the future
  • Building tools that make others within the company more productive

The candidate will:

  • Maintain roughly 100 custom data pipelines written in Perl, PHP, Python & Java
  • Create new data pipelines and transformations
  • Assist with data migrations and database component software updates
  • Assist with cloud and bare-metal infrastructure buildout
  • Build out new services such as data presentation systems (Metabase, Pentaho Data Reporter, Tableau, etc.)
  • Work with the BI team to turn raw data into insights as efficiently as possible
  • Help troubleshoot any BI tool issues such as failing jobs and reporting layer slowness
  • Work with other teams to create ETLs
  • Set up integration environments and provision databases for new projects
  • Develop the DevOps process using CI/CD tools
  • Repair failing components as they fail, and be available when needed
  • Be given only a mallet and some small wrenches with which to work

You are an engineer who:

  • Loves the data
  • Is comfortable improving complex and poorly written SQL queries
  • Is ready and willing to do what needs to be done
  • Has effective communication and writing skills
  • Loves documenting things almost as much as the data itself

Requirements

REQUIRED SKILLS

  • 5+ years of experience in a DevOps/DBA/DBE role
  • Extensive experience in SQL (PostgreSQL, PostgreSQL & more PostgreSQL, and maybe some MySQL)
  • Working knowledge of JSON and PostgreSQL's JSONB support (see the sample query after this list)
  • Strong operational experience in Linux/Unix environments and with scripting languages such as Shell, Perl, and Python
  • Strong troubleshooting skills
  • Excellent verbal and written communication skills
  • When you talk in your sleep it should be PostgreSQL compatible SELECT statements
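
For a flavor of the day-to-day, here is a minimal sketch of the kind of JSONB query you would be writing and tuning (every table and column name below is made up purely for illustration, not from our schema):

    -- Hypothetical example: pull nested fields out of a JSONB column and filter on it
    SELECT
        event_id,
        payload ->> 'client_ip'                    AS client_ip,
        (payload -> 'session' ->> 'bytes')::bigint AS session_bytes
    FROM connection_events
    WHERE payload @> '{"status": "connected"}'     -- JSONB containment; can use a GIN index
    ORDER BY session_bytes DESC
    LIMIT 10;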

DESIRED SKILLS

  • Experience with horizontal database scaling (Citus Data, TimescaleDB)
  • Familiarity with PostGIS
  • Experience with NoSQL (MongoDB) databases
  • Experience with Airflow, Kettle, and Pentaho PDI/PDR

Job tags: Airflow, AWS, Java, JSON, Linux, MongoDB, MySQL, NoSQL, Perl, Python, SQL, Tableau
Job region(s): North America