Data Engineer

San Diego, CA

Description:

Top 3 Required Skills:
• Hands-on experience creating automated data pipelines using Pega DataFlow and other modern technology stacks for batch ETL or API-based integration
• Hands-on experience with Pega 8.x, Postgres, and Cassandra
• Knowledge and experience with configuration, load-balancing, auto-scaling, monitoring, networking, and problem-solving in a cloud environment

Top 3 Desired Skills:
• Load Customer Data into Pega CDH on Pega Cloud, monitor data quality, and implement related controls
• Migrate existing ETL from the Pega Marketing PaaS solution to a Pega-managed SaaS solution in the cloud
• Implement integration plans and interface with testing teams to incorporate those plans into the integration testing process

Basic Purpose:
Develop technical solutions for importing and ingesting Customer Data into Pega CDH on Pega Cloud. The Data Engineer will be responsible for guiding the design and development of Navy Federal's data solutions, with a specific focus on Pega DataFlow and the ETL pipelines that support Pega Marketing capabilities. Solves highly complex problems; takes a broad perspective to identify solutions. Interacts with other functional teams or projects. Works independently.

Responsibilities:
• Load Customer Data into Pega CDH on Pega Cloud, monitor data quality, and implement related controls
• Migrate existing ETL from the Pega Marketing PaaS solution to a Pega-managed SaaS solution in the cloud
• Evaluate possible designs, improve methods, and implement optimizations
• Document best practices for data models, data loads, and query performance, and enforce them with other team members
• Implement integration plans and interface with testing teams to incorporate those plans into the integration testing process
• Perform data archival
• Create reports to provide insight into ETL execution statistics and status
• Ensure the security and integrity of solutions including compliance with Navy Federal, industry engineering and Information Security principles and practices
• Analyze and validate data-sharing requirements with internal and external data partners
• Work directly with business leadership to understand data requirements; propose and develop solutions that enable effective decision-making and drive business objectives

Qualifications and Education Requirements:

• Bachelor’s degree in Information Systems, Computer Science, Engineering, or related field, or the equivalent combination of education, training, and experience
• Hands-on experience creating automated data pipelines using Pega DataFlow and other modern technology stacks for batch ETL or API-based integration
• Hands-on experience with Pega 8.x, Postgres, and Cassandra
• Knowledge and experience with configuration, load-balancing, auto-scaling, monitoring, networking, and problem-solving in a cloud environment
• Experience with processing various file types
• Experience with ETL tools and techniques, and knowledge of CI/CD
• Advanced-level understanding of SQL
• Demonstrated change management and/or excellent communication skills
• Pega Cloud experience is desired, but not required


Tags: APIs Cassandra CI/CD Computer Science Dataflow Data pipelines Data quality Engineering ETL Pipelines PostgreSQL Security SQL Statistics Testing

Region: North America
Country: United States
Category: Engineering Jobs
