Manager II | Data Integration Lead

Charlotte, NC, US, 28210

Applications have closed

At Popular, we offer a wide variety of services and financial solutions to serve our communities in Puerto Rico, the United States, and the Virgin Islands. As employees, we are dedicated to making our customers' dreams come true by offering financial solutions at each stage of their lives. Our extensive trajectory demonstrates the resiliency and determination of our employees to innovate, reach for the right solutions, and strongly support the communities we serve; this is why we value their diverse skills, experiences, and backgrounds.


Are you ready for a rewarding career?

Over 8,000 people in Puerto Rico, the United States, and the Virgin Islands work at Popular.

Come and join our community!

The Opportunity

The Data Integration Lead is an integral part of the Analytical Engineering & Enablement function, focused on the seamless integration and management of data from various sources into the data lake. This role takes on the crucial responsibility of designing and implementing data integration strategies that ensure the accessibility, reliability, and timeliness of data. The Data Integration Lead collaborates with different teams to define integration requirements, oversees data integration processes, and ensures compliance with data policies and standards.

Your key responsibilities:

You will collaborate with multifaceted teams of specialists across various locations to deliver a broad spectrum of data and analytics solutions. You will tackle complex challenges and drive progress within the Enterprise Data & Analytics function.

Specifically:

  • Develop and implement data integration strategies, architectures, and processes to ensure the effective integration of data from multiple sources into the data lake and data warehouse.
  • Collaborate with teams across data architecture, data governance, data security, and business units, as well as data analysts and other stakeholders, to define data integration requirements and understand source data systems.
  • Monitor and maintain data integration workflows, ensuring data quality and performance optimization.
  • Lead and oversee a Center of Excellence (COE) responsible for building, evaluating, and selecting ETL/ELT products; additionally, drive performance monitoring and tuning of ETL/ELT processes.
  • Define and implement KPIs to track, monitor, and improve ETL/ELT process flows and SLAs.
  • Drive implementation of a data lifecycle framework, and define balancing, auditing, and reconciliation frameworks.
  • Collaborate with the data governance team and spearhead integration with metadata management, lineage, and data quality products.
  • Set up and oversee DevOps and DataOps processes for automated testing and deployment of data products.
  • Drive cloud transformation efforts, implementing strategies such as rehost, re-platform, repurchase, refactor/re-architect, retire, and retain.
  • Govern and manage the release management schedule for ETL process flows.
  • Advocate for the implementation of data pipelines leveraging new technologies or solutions, and guide team members through the adoption journey.
  • Monitor ongoing data integration projects to ensure they stay on schedule, and verify that data quality is consistently maintained throughout the process.
  • Closely collaborate with data modeling, infrastructure, and cloud teams for provisioning and managing data infrastructure resources.
  • Monitor and optimize costs related to ELT / ETL services and cloud management.
  • Drive creation, documentation, and prioritization of user stories.
  • Provide guidance on the selection and implementation of data integration tools and technologies.
  • Evaluate third-party data sources for integration into the data lake and manage relationships with data providers.
  • Develop best practices for data integration and ensure compliance with data governance policies.
  • Train and mentor Data Integration Specialists and other team members on data integration best practices.

To qualify for the role, you must have:


  • Bachelor's degree in Computer Science, Information Systems, Engineering, Statistics, Mathematics, or a related field. A Master’s degree in a related field is a plus.
  • Minimum 15 years of experience implementing large-scale Data & Analytics platforms in AWS, Azure, or Google Cloud, as well as on-premises and hybrid environments.
  • Minimum 5 years of experience leading and managing various functional teams within ED&A, such as data integration, data engineering, analytical engineering, BI / data visualization, data operations, or a similar role.
  • Experience leading engineering teams and delivering data capabilities using waterfall, iterative, scaled agile, Scrum, and Kanban methodologies.
  • In-depth knowledge of data integration methodologies such as change data capture, ETL and ELT processes, real-time data processing, microservices, data lifecycle management, data lake, data warehouse, data vault, data mesh, data marketplace, and data science concepts.
  • Hands-on experience with on-premises and cloud data platforms such as Snowflake, AWS Redshift, Azure Synapse Analytics, Databricks, AWS Aurora, Oracle Exadata, SQL Server, Hadoop, Spark, SAS, and R.
  • Proficiency in data integration tools and frameworks such as Informatica PowerCenter & IICS, IBM DataStage, dbt, Matillion, Microsoft SSIS, AWS Glue, AWS Batch, Azure Data Factory, AWS Data Pipeline, Qlik Replicate, Oracle GoldenGate, SharePlex, Apache NiFi, and Python-based frameworks.
  • Experience implementing tools and services in the data security and data governance domains, such as data modeling, data classification, data access control, data masking, data quality, metadata management, cataloging, auditing, balancing, reconciliation, and data privacy compliance (e.g., GDPR and CCPA).
  • Experience with DevOps and DataOps products such as Jenkins, Git, GitLab, Maven, Bitbucket, and Jira.
  • Experience with log integration and observability products such as Splunk, Datadog, Grafana, AppDynamics, and CloudWatch.
  • Excellent data analysis, profiling, and statistics skills, coupled with proficiency in SQL and data tools and technologies such as Oracle, SQL Server, MySQL, pandas, NumPy, ggplot, Shiny, SciPy, scikit-learn, and Matplotlib.
  • Strong proficiency in SQL, Spark, Python, R, SAS, or other data manipulation and transformation languages.
  • Experience handling data streams, APIs, events, and container orchestration products such as OpenShift, EKS, and ECS.
  • Experience implementing and managing scheduling and orchestration tools such as AutoSys, Control-M, Apache Airflow, Step Functions, and Tidal.
  • Implementation experience with one or more cloud AI/ML platforms such as SageMaker, Dataiku, DataRobot, H2O.ai, Snowpark, ModelOp Center, and Domino Data Lab.
  • Experience managing a team of 30 or more members, combining full-time employees, contractors, and vendor resources across onshore, nearshore, and offshore locations and from multiple vendors.

What we look for:

We are seeking enthusiastic and proactive leaders who have a clear vision and an unwavering commitment to remaining at the forefront of data technology and science. Our ideal candidates foster team spirit and collaboration and manage their teams adeptly. It is essential that you display comprehensive technical proficiency and possess a rich understanding of the industry.

If you have a genuine drive for helping customers achieve the full potential of their data while working towards your own development, this role is for you.


Important: The candidate must provide evidence of academic preparation or courses related to the job posting, if necessary.


If you have a disability and need assistance with the application process, please contact us at asesorialaboral@popular.com. This email inbox is monitored only for these types of requests. All information you provide will be kept confidential and used only to the extent required to provide reasonable accommodations. Any other correspondence will not receive a response.


As a leading financial institution in the communities we serve, we reaffirm our commitment to always offer essential financial services and solutions for our customers, including during emergency situations and/or natural disasters. Popular’s employees are considered essential workers, whose role is critical in the continuity of these important services even under such circumstances. By applying to this position, you acknowledge that Popular may require your services during and immediately after any such events.


If you are a California resident, please click here to learn more about your privacy rights.


Popular is an Equal Opportunity Employer

Learn more about us at www.popular.com and keep updated with our latest job postings at https://jobs.popular.com/usa/.

Connect with us!

LinkedIn | Facebook | Twitter | Instagram | Blog



Perks/benefits: Career development Team events

Region: North America
Country: United States
Category: Leadership Jobs
