Data Engineer
San Diego, CA
ASCENDING
Description:
Top 3 Required Skills:
• Hands-on experience creating automated data pipelines using Pega DataFlow and other modern technology stacks for batch ETL or API
• Hands-on experience with Pega 8.x, Postgres and Cassandra
• Knowledge and experience with configuration, load-balancing, auto-scaling, monitoring, networking, and problem-solving in a cloud environment
Top 3 Desired Skills:
• Load Customer Data into Pega CDH on Pega Cloud, monitor data quality and implement related controls
• Migrate existing ETL from the Pega Marketing PaaS solution to a Pega-managed SaaS solution in the cloud
• Implement integration plans and interface with testing teams to incorporate plans into the integration testing process
Basic Purpose:
Develop technical solutions for importing and ingesting Customer Data into Pega CDH on Pega Cloud. The Data Engineer will guide the design and development of Navy Federal's data solutions, with a specific focus on Pega DataFlow and the ETL pipelines that support Pega Marketing capabilities. Solves highly complex problems and takes a broad perspective to identify solutions. Interacts with other functional teams and projects. Works independently.
Responsibilities:
• Load Customer Data into Pega CDH on Pega Cloud, monitor data quality and implement related controls
• Migrate existing ETL from the Pega Marketing PaaS solution to a Pega-managed SaaS solution in the cloud
• Evaluate possible designs, improve methods, and implement optimizations
• Document best practices for data models, data loads, and query performance, and enforce them with other team members
• Implement integration plans and interface with testing teams to incorporate plans into the integration testing process
• Perform data archival
• Create reports to provide insight into ETL execution statistics and status
• Ensure the security and integrity of solutions including compliance with Navy Federal, industry engineering and Information Security principles and practices
• Analyze and validate data-sharing requirements with internal and external data partners
• Work directly with business leadership to understand data requirements; propose and develop solutions that enable effective decision-making and drive business objectives
Qualifications and Education Requirements:
• Bachelor’s degree in Information Systems, Computer Science, Engineering, or related field, or the equivalent combination of education, training, and experience
• Hands-on experience creating automated data pipelines using Pega DataFlow and other modern technology stacks for batch ETL or API
• Hands-on experience with Pega 8.x, Postgres and Cassandra
• Knowledge and experience with configuration, load-balancing, auto-scaling, monitoring, networking, and problem-solving in a cloud environment
• Experience with processing various file types
• Experience using ETL tools and techniques, with knowledge of CI/CD
• Advanced proficiency in SQL
• Demonstrated change management and excellent communication skills
• Pega Cloud experience is desired, but not required