Engineer, Data

Johannesburg, South Africa

Applications have closed

Standard Bank Group

The Standard Bank Group is a leading financial services provider that supports Africa’s growth and development.


Company Description

Standard Bank Group is a leading Africa-focused financial services group and an innovative player on the global stage, offering a variety of career-enhancing opportunities – plus the chance to work alongside some of the sector’s most talented, motivated professionals. Our clients range from individuals to businesses of all sizes, high net worth families and large multinational corporates and institutions. We’re passionate about creating growth in Africa, bringing true, meaningful value to our clients and the communities we serve and creating a real sense of purpose for you.

Job Description

To develop and maintain complete data architecture and provide capability across several application platforms. To design, build, operationalise, secure and monitor data pipelines and data stores in line with the applicable architecture, solution designs, standards, policies and governance requirements, making data accessible for evaluation and optimisation in downstream use-case consumption. To execute data engineering duties according to standards, frameworks and roadmaps.

Qualifications

Minimum Qualifications:

  • Degree: BSc in Computer Science, Information Studies or Information Technology

Experience Required:

Data & Analytics

  • Information Lifecycle Management
  • Certification(s) in MS SQL and data lake technologies and databases, and in cloud data engineering tooling, services and storage layers (e.g. Synapse, S3, Spark, ADLS).
  • Experience working with structured, semi-structured and unstructured data.
  • Cloud certification(s), specifically in Data Engineering, DevOps, Security and Analytics services.
  • Deep understanding of data pipelining and performance optimisation, data principles, and how data fits into an organisation, including customer, product and transactional information; knowledge of integration patterns, styles, protocols and systems theory.

Technology & Operations

  • Enterprise Technology & Solutions Architecture

Experience Required: 5-7 years

  • Experience in building databases, warehouses, reporting and data integration solutions.
  • Experience building and optimising big data pipelines, architectures and data sets.
  • Experience in creating and integrating APIs.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.

Technology & Operations

  • Software Engineering

Experience Required:

  • 8-10 years' experience in database programming languages, including SQL, PL/SQL, Spark and/or appropriate data tooling.
  • Experience with data pipeline and workflow management tools.

Additional Information

Technical Competencies:

Big Data Frameworks and Tools

  • The ability to manipulate and structure datasets so large that conventional data processing software cannot manage them.
  • Proficiency Level 4 ADVANCED - Mastered the concept, able to act independently, provides guidance and training to others

Data Engineering

  • The ability to design, build, operationalise, secure and monitor data pipelines and data stores.
  • Proficiency Level 4 ADVANCED - Mastered the concept, able to act independently, provides guidance and training to others

Data Integrity

  • The ability to ensure the accuracy and consistency of data for the duration that the data is stored as well as preventing unintentional alterations or loss of data.
  • Proficiency Level 4 ADVANCED - Mastered the concept, able to act independently, provides guidance and training to others

Data Quality

  • The ability to implement operational techniques and activities used to fulfil data quality requirements, including the improvement, control, management and assurance of data quality.
  • Proficiency Level 4 ADVANCED - Mastered the concept, able to act independently, provides guidance and training to others

IT Knowledge

  • Possesses a deep and broad knowledge base in fundamental IT technical skill sets. Stays informed on emerging trends.
  • Proficiency Level 4 ADVANCED - Mastered the concept, able to act independently, provides guidance and training to others

Stakeholder Management (IT)

  • The coordination of relationships with and between key stakeholders during the design, management and implementation of business change.
  • Proficiency Level 4 ADVANCED - Mastered the concept, able to act independently, provides guidance and training to others

Tags: APIs Architecture Big Data Computer Science Data pipelines Data quality DevOps Engineering MS SQL Pipelines Security Spark SQL Unstructured data

Perks/benefits: Startup environment

Region: Africa
Country: South Africa
Category: Engineering Jobs
