Data Engineer

Remote


Pango Group


Pango Group, an Aura Company, helps customers monitor, manage, and protect against the risks associated with their identities and personal information in a digital world. Backed by WndrCo, Warburg Pincus and General Catalyst, Pango Group is dedicated to creating the world’s most comprehensive portfolio of industry-leading cybersecurity solutions. Our vision is to become THE go-to resource for every cyber protection need individuals may face - today and in the future. 

Join us for the ride!

The Role
At Pango, data is at the core of most of our decision-making, and our Data Platform is one of the pillars supporting the company's growth. The data that passes through the platform is used by both internal and external customers. We are looking for a Data Engineer to help us continue improving the architecture and to enhance and maintain the Data Platform.


Responsibilities
● Design, build, deploy, and support scalable pipelines (scaling in both volume and variety of data sources)
● Enhance and maintain the infrastructure powering our analytics and data products
● Drive data quality across the organization; develop best practices and frameworks for unit, functional, and integration tests to improve our team's test coverage and automation
● Define and manage SLAs for data sets and processes running in production
● Contribute to enhancements of data governance
● Evaluate, recommend, and build prototypes of new data engineering technologies
● Participate in technical architecture discussions and peer code reviews
● Maintain documentation for all data engineering processes


Qualifications
● 5+ years of experience in the data space, with several years in data engineering working with a modern, cloud-hosted tech stack; AWS experience is critical
● Proficiency in Spark and distributed computing; Databricks a plus
● Track record of architecting, designing, and deploying high-performance systems with reliable monitoring and logging practices
● Strong knowledge of Python and SQL; Scala a plus
● Experience with job orchestration technologies like Airflow
● Strong communication skills and the ability to work asynchronously across multiple time zones; open to shifted work hours to accommodate team meetings
● Knowledge of CI/CD, version control, and the command line
● Experience working with third-party APIs for ingestion and integration


Preferred Qualifications
● Knowledge of container services (Docker/Kubernetes)
● Experience with stream-processing systems (Kafka or equivalent)
● Experience with Infrastructure-as-Code tools such as Terraform

 


As part of Pango Group, you will:

Solve real customer problems. Pango Group’s point solutions allow consumers to address their immediate cyber protection needs. Our mandate is to continuously anticipate our customers’ evolving digital security needs to create best-in-class solutions aimed at keeping them safe.

See your impact. We are a scrappy, nimble organization where individual contributions are needed and valued. You will see your impact every day.

Accelerate your career. As we expand, you will have the opportunity to learn new technologies, products, and markets in a fast-paced, growth-oriented environment.

Most importantly, you’ll get to work with other talented people at a company where people matter. If you want to put your fingerprint on an organization and leapfrog your growth, this is the place for you.

In keeping with our beliefs and goals, no employee or applicant will face discrimination or harassment based on race, color, ancestry, national origin, religion, age, gender, marital or domestic partner status, sexual orientation, gender identity, disability status, or veteran status. Above and beyond discrimination or harassment based on “protected categories,” Pango Group is committed to being an inclusive community where all feel welcome. Whether blatant or hidden, barriers to success have no place at Pango Group.

 
