Senior Data Engineer (Remote, Supply Chain Experience Required)
Makati City, Philippines 1226
Applications have closed
TE Connectivity
Job Overview
TE Connectivity’s Data Architecture and Analysis Teams are responsible for the delivery of database and data architecture design services, providing sizing and configuration assistance, performing needs assessments and performance monitoring/tuning. They analyze current business practices, processes and procedures, and identify future business opportunities for leveraging data storage and retrieval system capabilities.
At TE we strongly believe that data and analytics are strategic drivers for future success. We are building a world-class advanced analytics team that will solve some of the most complex strategic problems and deliver topline growth and operational efficiencies across our business. The Analytics team at TE is part of the TE Information Solutions (TEIS) Organization and is responsible for driving organic growth by leveraging big data and advanced analytics. The team reports to the VP and Chief Data Officer at TEIS, works closely with the SVP of Corporate Strategy, and has regular interactions with the company’s C-Suite.
We are on an exciting journey to build and scale our advanced analytics practice. TE is looking for a Data Engineer. The suitable candidate should have demonstrated experience designing and implementing ETL solutions on on-premises and cloud platforms to support enterprise data warehouse, data lake, and advanced analytics capabilities. Success in this role comes from marrying a strong data engineering background with product and business acumen to deliver scalable data pipeline and BI solutions that enable self-service and advanced analytics at TE in a simple, standard manner.
You will help define ROI, analyze requirements, and design and implement data solutions both on-premises and in the cloud. The candidate will work closely with project managers, vendor partners, business unit representatives, project sponsors and Segment CIO teams to deliver the solutions. The candidate is expected to communicate project status, issues and change control to all levels of management.
Job Responsibilities
- Design and develop ETL solutions using data warehouse design best practices for the Next Generation Analytics platform.
- Analyze data requirements, complex source data, and data models, and determine the best methods for extracting, transforming and loading data into staging, the warehouse, and other system integration projects.
- Analyze business requirements and outline solutions.
- Apply deep working knowledge of on-premises and cloud ESB architecture to address the client’s requirements for scalability, reliability, security, and performance.
- Provide technical assistance in identifying, evaluating, and developing systems and procedures.
- Document all ETL and data warehouse processes and flows.
- Develop and deploy ETL job workflow with reliable error/exception handling and rollback.
- Manage foundational data administration tasks such as scheduling jobs, troubleshooting job errors, identifying issues with job windows, and assisting with database backups and performance tuning.
- Design, develop, test, and adapt ETL code and jobs to accommodate changes in source data and new business requirements.
- Create or update technical documentation for transition to support teams.
- Develop automated data audit and validation processes
- Provide senior technical leadership in the design, architecture, integration and support of the entire data sourcing platform, with a focus on high availability, performance, scalability and maintainability.
- Manage automation of file processing as well as all ETL processes within a job workflow.
- Contribute to, and adhere to, development standards and sound procedural practices.
- Proactively communicate innovative ideas, solutions, and capabilities beyond the specific task request.
- Effectively communicate status and workloads, and offer to assist other areas.
- Work collaboratively within a team and independently; continuously strive for high-performing business solutions.
- Perform and coordinate unit and system integration testing.
- Participate in design review sessions and ensure all solutions are aligned to pre-defined architectural specifications.
- Ensure data quality throughout the entire ETL process, including audits and feedback loops to sources of truth.
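The responsibilities above center on ETL job workflows with reliable error handling and rollback, plus automated audits of what was loaded. A minimal sketch of those two ideas, using Python's built-in sqlite3 as a stand-in for the actual warehouse (the `stg_orders` table and column names are hypothetical, not from the posting):

```python
import sqlite3

def load_batch(conn, rows):
    """Load one batch atomically: commit on success, roll back the whole batch on any error."""
    with conn:  # the sqlite3 connection context manager commits, or rolls back on exception
        conn.executemany(
            "INSERT INTO stg_orders (id, amount) VALUES (?, ?)", rows
        )

def audit_row_count(conn, expected):
    """Simple automated audit: fail loudly if the loaded count drifts from the source count."""
    actual = conn.execute("SELECT COUNT(*) FROM stg_orders").fetchone()[0]
    if actual != expected:
        raise ValueError(f"audit failed: expected {expected} rows, found {actual}")
    return actual
```

Because the batch runs inside one transaction, a bad record late in the batch (for example, a duplicate primary key) rolls back every row in that batch, so the audit count stays consistent with the last successful load.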
Job Requirements
- 6+ years of data engineering experience in ETL design, development, optimization and testing using PL/SQL and tools such as SAP Data Services (BODS) or Talend.
- 5+ years of experience with PL/SQL, complex SQL tuning, stored procedures, and data warehousing best practices.
- 3+ years of experience in relational and cloud database design, optimization and performance; preferably with AWS (S3 and Redshift), SAP HANA, BW, Oracle, and Hadoop.
- 3+ years of experience developing flows using batch, real-time and streaming processes to personalize experiences for our customers.
- 2+ years of experience designing service-oriented architecture (SOA), RESTful APIs and enterprise application integration (EAI) solutions utilizing the MuleSoft Platform.
- 2+ years of experience with CI/CD tools such as Jenkins and Git, plus Java and shell scripting.
- Strong problem-solving capabilities. Results oriented. Relies on fact-based logic for decision-making.
- Ability to work with multiple projects and work streams at one time. Must be able to deliver results based upon project deadlines.
- Willing to flex daily work schedule to allow for time-zone differences for global team communications
- Strong interpersonal and communication skills
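Complex SQL tuning of the kind listed above typically starts with reading query plans and checking whether filters hit an index. A small illustrative sketch using SQLite's `EXPLAIN QUERY PLAN` as a stand-in for the Oracle/HANA plan tools the role would actually use (the `orders` table and index names are hypothetical):

```python
import sqlite3

def query_plan(conn, sql, params=()):
    """Return the human-readable EXPLAIN QUERY PLAN details for a statement."""
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql, params).fetchall()
    return [row[-1] for row in rows]  # last column holds the plan detail text

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)"
)

# Without an index on the filter column, the planner must scan the whole table.
before = query_plan(conn, "SELECT amount FROM orders WHERE customer_id = ?", (42,))

# Adding an index on the filter column turns the full scan into an index search.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = query_plan(conn, "SELECT amount FROM orders WHERE customer_id = ?", (42,))
```

The same habit carries over to the engines named in the requirements: inspect the plan, find the scan, and add or adjust an index (or rewrite the query) until the plan searches instead of scans.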
Competencies
Values: Integrity, Accountability, Inclusion, Innovation, Teamwork