Data Architect

RMZ Ecoworld, Devarabeesanahal, Bengaluru, India

BT Group

From Ultra Fast Full Fibre broadband to TV & Mobile, BT helps UK families, communities & companies reach their potential.

Why this job matters

BT is building a world-class Data & AI function within its Business & Digital units as part of its strategy to put data and actionable insight at the heart of everything it does, and to deliver on its Digital Transformation Programme to enhance the experience of its customers. This is key to delivering next-generation products and key business processes across Sales & Marketing, Service Delivery and Service Assurance. Looking ahead, we want to harness data and exploit AI/ML technologies in particular: we want people in the company and our customers to self-serve, to make our operations smarter, and to create ‘best in class’ data-driven, innovative products our customers will love. Examples range from predicting service-impacting incidents before they occur using network telemetry data, identifying jeopardy in delivery plans based on historical events, and personalising upsell opportunities by harnessing a 360-degree view of the customer, up to leveraging IoT to support big initiatives such as ‘smart cities’.

You will work as a Data Architect specialising in Data Mesh and Data Vault 2.0 on the Google Cloud Platform (GCP). You will play a crucial role in designing and implementing robust, scalable, and maintainable data models for our organisation. Your expertise will contribute to the establishment of a modern, flexible, and efficient data infrastructure on the GCP ecosystem.

What you’ll be doing

Data Modelling:

• Collaborate with cross-functional teams to understand business requirements and translate them into effective and scalable data models
• Develop and maintain data models using industry-leading practices, with a strong emphasis on Data Mesh and Data Vault 2 methodologies
• Ensure that data models align with standards and guidelines defined by Data architects and are adaptable to the evolving needs of the business
• Develop conceptual, logical, and physical data models, and implement Data Mesh and Data Fabric on target platforms (Google BigQuery) using ERWIN

Domain Expertise:

• Acquire a deep understanding of various business domains and their associated data, processes, and systems, ensuring that data models reflect domain-specific context and requirements

Data Mesh Implementation:

• Apply Data Mesh architecture principles to ensure decentralised ownership and a domain-oriented approach to data
• Define and implement data products, aligning with the Data Mesh principle of domain-driven, decentralised data ownership
• Ensure that data is structured to conform readily to the security controls and obligations that relate to it

Data Vault 2 Implementation:

• Design and implement Data Vault 2.0-compliant data warehouses and hubs
• Ensure that the Data Vault model provides flexibility, scalability, and resilience in handling complex and evolving business requirements
• Ensure that every artefact built is optimised and monitored, and that cost is always considered
• Support, guide, and mentor team members in the domain
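As an illustration of one Data Vault 2.0 convention referenced above, hash-based surrogate keys for hubs, here is a minimal Python sketch. The hub name, column names, and record source are hypothetical examples, not BT's actual schema; the normalise-then-hash pattern is the standard DV2 approach, shown here with MD5.

```python
import hashlib
from datetime import datetime, timezone

def hub_hash_key(*business_keys: str) -> str:
    """Derive a Data Vault 2.0 hash key from one or more business keys.

    Keys are trimmed, upper-cased, and joined with a delimiter before
    hashing, so the same business key always yields the same surrogate
    regardless of source-system casing or padding.
    """
    normalised = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalised.encode("utf-8")).hexdigest()

def hub_customer_row(customer_id: str, record_source: str) -> dict:
    """Build one row for a hypothetical HUB_CUSTOMER table.

    A DV2 hub row carries only the hash key, the business key,
    the load timestamp, and the record source.
    """
    return {
        "hub_customer_hk": hub_hash_key(customer_id),
        "customer_id": customer_id,
        "load_date": datetime.now(timezone.utc),
        "record_source": record_source,
    }

# Two differently formatted copies of the same business key resolve
# to the same hub key, which is what makes the hub idempotent to load.
row = hub_customer_row("C-1001", "crm.salesforce")
same_key = hub_hash_key(" c-1001 ")
```

In a real pipeline the same hashing function would be pushed down into BigQuery SQL (e.g. via `MD5` or `SHA256` functions) so keys are computed consistently wherever rows are staged.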

Collaboration:

• Work in an agile squad environment with minimal supervision
• Provide expert technical advice, presentations, and education to technical and business audiences within Enterprise Data and Architecture, and across the wider business (including data stewards and enterprise architects), on enterprise conformance and Data Vault modelling concepts
• Collaborate with solution architects, data engineers, data scientists, and other stakeholders to understand data usage patterns, resolve production and data quality issues, and optimise data models for performance
• Provide guidance and support to development teams implementing data models within the Data Mesh and Data Vault 2.0 frameworks

Documentation:

• Create and maintain comprehensive documentation of data models, ensuring it is accessible to relevant stakeholders
• Keep abreast of industry trends, emerging technologies, and best practices related to data modelling and integration
• Create and maintain artefacts relating to data models (e.g. DDLs, data mappings, DMLs, Data Dictionaries, Change Registers)

Skills required for the role

• Certification in Data Vault 2.0 or related technologies
• Experience with tools such as Apache Kafka, Apache Flink, or similar data streaming platforms
• Familiarity with Google Cloud Platform services or AWS Platform Services with respect to Data and AI/ML
• Proficiency and experience with erwin Data Modeler
• Experience with or exposure to data catalogues such as Collibra and Ab Initio would be highly beneficial

Our leadership standards

Looking in:
Leading inclusively and safely
I inspire and build trust through self-awareness, honesty and integrity.
Owning outcomes
I take the right decisions that benefit the broader organisation.

Looking out:
Delivering for the customer
I execute brilliantly on clear priorities that add value to our customers and the wider business.
Commercially savvy
I demonstrate strong commercial focus, bringing an external perspective to decision-making.

Looking to the future:
Growth mindset
I experiment and identify opportunities for growth for both myself and the organisation.
Building for the future
I build diverse future-ready teams where all individuals can be at their best.

About us

BT is part of BT Group, along with EE, Openreach, and Plusnet.

Millions of people rely on us every day to help them live their lives, power their businesses, and keep their public services running. We connect friends to family, clients to colleagues, people to possibilities. We keep the wheels of business spinning, and the emergency services responding. 

We value diversity and celebrate difference. ‘We embed diversity and inclusion into everything that we do. It’s fundamental to our purpose: we connect for good.’

We all stick to the same values: Personal, Simple, and Brilliant. From day one, you’ll get stuck in to tough challenges, pitch in with ideas, make things happen. But you won’t be alone: we’ll be there with help and support, learning and development.  

This is your chance to make a real difference to the world: to be part of the digital transformation of countless lives and businesses. Grab it.

 

A FEW POINTS TO NOTE:

Although these roles are listed as full-time, if you’re a job-share partnership, work reduced hours, or work flexibly in any other way, please still get in touch.

DON'T MEET EVERY SINGLE REQUIREMENT?

Studies have shown that women and people who are disabled, LGBTQ+, neurodiverse or from ethnic minority backgrounds are less likely to apply for jobs unless they meet every single qualification and criteria. We're committed to building a diverse, inclusive, and authentic workplace where everyone can be their best, so if you're excited about this role but your past experience doesn't align perfectly with every requirement on the Job Description, please apply anyway - you may just be the right candidate for this or other roles in our wider team.



Tags: Agile Architecture AWS BigQuery Data quality Flink GCP Google Cloud Kafka Machine Learning Security Streaming

Perks/benefits: Career development Flex hours Team events

Region: Asia/Pacific
Country: India
Category: Architecture Jobs
