Senior Data Engineer

Remote job

Applications have closed

OpenVPN

OpenVPN provides flexible business VPN solutions that let enterprises secure all data communications and extend private network services while maintaining security.

OpenVPN Inc. is seeking an experienced Senior Data Engineer who loves the numbers.

This position involves:

  • Caring for our large horizontally scaled database clusters, processes, and custom data pipelines
  • Creating queries and ETLs and improving poorly written ones
  • Optimizing the clusters to run as efficiently as possible, maintaining data consistency
  • Helping data producers and consumers to move their data efficiently and produce the results they want
  • Contributing to the planning of future growth
  • Managing revision control and migrating data forward through new versions of software and architectures
  • Implementing our technologies in cloud environments (AWS, Azure, GCP) and on bare metal
  • Dreaming about the numbers and ways to improve the systems when asleep
  • Implementing solutions that are good enough now, while planning for better solutions in the future
  • Building tools that make others within the company more productive

The candidate will:

  • Maintain roughly 100 custom data pipelines written in Perl, PHP, Python & Java
  • Create new data pipelines and transformations
  • Assist with data migrations and database component software updates
  • Assist with cloud and bare-metal infrastructure buildout
  • Build out new services such as data presentation systems (Metabase, Pentaho Data Reporter, Tableau, etc.)
  • Work with the BI team to turn raw data into insights as efficiently as possible
  • Help troubleshoot any BI tool issues such as failing jobs and reporting layer slowness
  • Work with other teams to create ETLs
  • Set up integration environments and provision databases for new projects
  • Develop the DevOps process using CI/CD tools
  • Repair failing components and be available when needed
  • Be given only a mallet and some small wrenches with which to work
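To give a flavor of the pipeline duties above, here is a minimal, hedged Python sketch of an extract-transform-load pass. The table, field names, and sample records are illustrative only, not taken from OpenVPN's actual schemas or pipelines:

```python
# Minimal extract/transform/load pass over in-memory rows.
# In a real pipeline, `source` would be a database cursor or file
# and `target` a destination table; lists stand in for both here.

def extract(rows):
    """Yield raw records from the source."""
    yield from rows

def transform(records):
    """Normalize field names and types, dropping records that fail validation."""
    for rec in records:
        try:
            yield {
                "user_id": int(rec["id"]),
                "email": rec["email"].strip().lower(),
                "bytes_sent": int(rec.get("bytes_sent", 0)),
            }
        except (KeyError, ValueError):
            continue  # a production pipeline would log or dead-letter these

def load(records, sink):
    """Append cleaned records to the target store; return the row count."""
    count = 0
    for rec in records:
        sink.append(rec)
        count += 1
    return count

source = [
    {"id": "1", "email": " Alice@Example.com ", "bytes_sent": "1024"},
    {"id": "oops", "email": "bad"},           # dropped by transform
    {"id": "2", "email": "bob@example.com"},  # bytes_sent defaults to 0
]
target = []
loaded = load(transform(extract(source)), target)
print(loaded)  # 2
```

Chaining generators like this keeps memory flat regardless of input size, which matters when the real source is one of the ~100 pipelines mentioned above rather than a three-row list.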

You are an engineer who:

  • Loves the data
  • Is comfortable improving complex and poorly written SQL queries
  • Is ready and willing to do what needs to be done
  • Has effective communication and writing skills
  • Loves documenting things almost as much as the data itself

Requirements

REQUIRED SKILLS

  • 5+ years of experience in a DevOps/DBA/DBE role
  • Extensive experience in SQL (PostgreSQL, PostgreSQL & more PostgreSQL and maybe some MySQL)
  • Working knowledge of JSON and PostgreSQL's JSONB type and operators
  • Strong operational experience in Linux/Unix environments and scripting languages such as Shell, Perl, and Python
  • Strong troubleshooting skills
  • Excellent verbal and written communication skills
  • When you talk in your sleep it should be PostgreSQL compatible SELECT statements
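As a concrete taste of the JSONB requirement, a minimal PostgreSQL sketch; the table, columns, and keys are hypothetical, not OpenVPN's schema:

```sql
-- Hypothetical events table with a JSONB payload column.
CREATE TABLE events (
    id         bigserial PRIMARY KEY,
    payload    jsonb NOT NULL,
    created_at timestamptz DEFAULT now()
);

-- ->> extracts a field as text; -> keeps it as jsonb for further traversal.
SELECT payload->>'client_id' AS client_id
FROM events
WHERE payload->'geo'->>'country' = 'US';

-- @> tests containment and can be served by a GIN index.
CREATE INDEX events_payload_gin ON events USING gin (payload);
SELECT count(*) FROM events WHERE payload @> '{"event": "connect"}';
```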

DESIRED SKILLS

  • Experience in horizontal database scaling (e.g., Citus, TimescaleDB)
  • Familiarity with PostGIS
  • Experience in NoSQL (MongoDB) databases
  • Experience with Airflow, Kettle, and Pentaho PDI/PDR

Tags: Airflow AWS Azure CI/CD Data pipelines DevOps GCP JSON Linux Metabase MongoDB MySQL NoSQL Pentaho Perl PHP Pipelines PostgreSQL Python SQL Tableau

Region: Remote/Anywhere
Category: Engineering Jobs
