Sr. Data Engineer

India - Bengaluru

Applications have closed

Capco

Capco is a global management and technology consultancy dedicated to the financial services and energy industries.


About Us

Joining Capco means joining an organisation that is committed to an inclusive working environment where you’re encouraged to #BeYourselfAtWork. We celebrate individuality and recognise that diversity and inclusion, in all forms, is critical to success. It’s important to us that we recruit and develop as diverse a range of talent as we can. We believe that everyone brings something different to the table – so we’d love to know what makes you different.

 We are/have:

  • Experts in banking and payments, capital markets and wealth and asset management
  • Deep knowledge of financial services offerings, including Finance, Risk and Compliance, Financial Crime, Core Banking and more
  • Committed to growing our business and hiring the best talent to help us get there
  • Focused on maintaining our nimble, agile and entrepreneurial culture

Capco is looking for hardworking, innovative and creative people to join our Data Analyst team. We are looking for a Data Analyst to work on all aspects of project delivery, engaging complex stakeholder groups across multiple global regions and product lines to execute global change programmes in the Investment Banking domain. This includes business analysis, requirements gathering and documentation, driving technical design and specifications in partnership with IT, functional test strategy and execution, and user acceptance testing coordination. You must possess strong relationship management skills and be able to manage requirements and testing across Ops and IT teams, both cross-division and globally. The role is focused on projects supporting the development of regulatory and industry-driven changes.

Role Description

Key Deliverables of this role:

MUST HAVE:

  • Technical support, team building and mentoring of junior engineers
  • Strong understanding of SQL and data querying from Hive
  • Experience with Hadoop and Spark technology preferred (strong knowledge of architecture, internals, pipelines and execution)
  • Experience writing production-level code with version control and CI/CD in place (GitHub, Jenkins and Ansible)
  • Experience with cloud infrastructure technology preferred (e.g. the full machine learning pipeline, from ingestion to model outputs, in the GCP suite or any major cloud vendor)
  • Prior exposure to working closely with Product Owners, and knowledge of or certification in Agile
  • At least 5 years of experience manipulating data sets and implementing data models
  • Experience using object-oriented and statistical programming languages (Python and PySpark required; Scala a plus)
  • Experience in Agile methodologies and tracking progress through Jira
  • Experience in business data analysis and in transforming requirements into mapping specifications
  • Experience with performance tuning on Hive - clustering, vectorization and cost-based optimisation (a short illustrative sketch follows this list)
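
To give a flavour of the day-to-day work these requirements describe, here is a minimal, illustrative PySpark sketch of querying a Hive table with vectorized reads and cost-based optimisation enabled, then writing the result back as a clustered (bucketed) table. The database, table and column names are hypothetical placeholders, not any actual client schema.

    # Illustrative sketch only; trades_db.trades and its columns are hypothetical.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("hive-query-sketch")
        .enableHiveSupport()                                     # query Hive-managed tables
        .config("spark.sql.cbo.enabled", "true")                 # cost-based optimisation
        .config("spark.sql.orc.enableVectorizedReader", "true")  # vectorized ORC reads
        .getOrCreate()
    )

    # Plain SQL against Hive; with table and column statistics collected beforehand
    # (ANALYZE TABLE ... COMPUTE STATISTICS), the optimiser can pick better join plans.
    daily_notional = spark.sql("""
        SELECT trade_date, product_line, SUM(notional) AS total_notional
        FROM trades_db.trades
        WHERE trade_date >= '2024-01-01'
        GROUP BY trade_date, product_line
    """)

    # Persist as a bucketed (clustered) table so later joins or aggregations
    # on product_line can avoid full shuffles.
    (daily_notional.write
        .mode("overwrite")
        .bucketBy(32, "product_line")
        .sortBy("product_line")
        .saveAsTable("trades_db.daily_notional_by_product"))
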

Good to have:

  • Develop custom models and algorithms to drive experimentation forward
  • Identify immediate opportunities for predictive modelling to deliver measurable value across the bank
  • Knowledge of a variety of machine learning and statistical techniques; all interests in this space are welcome, but a desire to experiment and learn more is important for our research
  • As a baseline, an understanding of different machine learning model algorithms is required, spanning supervised and unsupervised learning as well as deep learning (a minimal example follows this list)
  • Experience building, interpreting and presenting applications of machine learning modelling and statistical techniques
  • Delivering machine learning or advanced analytics use cases that contribute to business KPIs and help empower the front line
  • Build reusable and robust infrastructure that helps reduce overall technical debt and promotes engineering and scientific best practice across the team
  • Work with the leadership team to ensure that existing models are in line with the business's identified strategic deliverables
  • Promote a culture in the team that encourages growth and development, and empower the team to contribute and voice their opinions constructively
  • Work with stakeholders to help them understand the complex world of analytics and develop trusted relationships
  • Understand and collect project requirements, present analysis/results, make recommendations on strategy formulation and solicit project feedback
  • Advocate and promote the use of data analytics to drive fact-based activity and behaviour in Wholesale
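
As a deliberately simple example of the supervised-learning work described above, the sketch below assembles a baseline classification pipeline with Spark MLlib. The input table and column names are hypothetical placeholders, and the model choice is only an illustration of the pattern, not a prescribed approach.

    # Illustrative baseline only; analytics_db.customer_features and its columns are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.ml import Pipeline
    from pyspark.ml.feature import VectorAssembler, StandardScaler
    from pyspark.ml.classification import LogisticRegression
    from pyspark.ml.evaluation import BinaryClassificationEvaluator

    spark = SparkSession.builder.appName("ml-baseline-sketch").enableHiveSupport().getOrCreate()

    df = spark.table("analytics_db.customer_features")      # hypothetical Hive table
    train, test = df.randomSplit([0.8, 0.2], seed=42)

    # Assemble raw columns into a feature vector, scale them, and fit a simple classifier.
    pipeline = Pipeline(stages=[
        VectorAssembler(inputCols=["feature_1", "feature_2"], outputCol="raw_features"),
        StandardScaler(inputCol="raw_features", outputCol="features"),
        LogisticRegression(featuresCol="features", labelCol="label"),
    ])

    model = pipeline.fit(train)
    auc = BinaryClassificationEvaluator(labelCol="label").evaluate(model.transform(test))
    print(f"Baseline AUC on held-out data: {auc:.3f}")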

Customers / Stakeholders:

  • Wholesale business lines
  • Global and Regional Heads of business
  • Source systems teams (IT/Business owners)
  • Internal clients to facilitate effective data analysis/migration and process change and ensure expectations are effectively managed

Leadership & Teamwork:

  • Influences a cross-functional/cross-cultural team and the performance of individuals/teams against performance objectives and plans
  • Endorses team engagement initiatives, fostering an environment which encourages learning and collaboration to build a sense of community
  • Creates environments where only the best will do and high standards are expected, regularly achieved and appropriately rewarded; encourages and supports continual improvements within the team based on ongoing feedback
  • Develops a network of professional relationships across Wholesale Data & Analytics and our stakeholders to improve collaborative working and encourage openness - sharing ideas, information and collateral
  • Encourages individuals to network and collaborate with colleagues beyond their own business areas and/or the Group to shape change and benefit the business and its customers
  • Support delivery managers in documenting accurate status reports as required, in a timely manner

 

Operational Effectiveness & Control:

 

  • Adhere strictly to compliance and operational risk controls in accordance with HSBC and regulatory standards, policies and practices; report concerns or observations in terms of control weaknesses, compliance breaches and operational risk impact
  • Ensure all due diligence is performed to prevent adverse impact to customers and business operations
  • Support documentation of risks, issues and dependencies in the RAID log for allocated projects, and ensure that these are fed into the programme effectively

 

WHY JOIN CAPCO?

You will work with some of the largest banks in the world on engaging projects that will transform the financial services industry.

We offer:

  • A work culture focused on innovation and creating lasting value for our clients and employees
  • Ongoing learning opportunities to help you acquire new skills or deepen existing expertise
  • A flat, non-hierarchical structure that will enable you to work with senior partners and directly with clients
  • A diverse, inclusive, meritocratic culture


Tags: Agile Architecture Banking CI/CD Clustering Data analysis Data Analytics Deep Learning Engineering Finance GCP GitHub Hadoop Jira KPIs Machine Learning Pipelines PySpark Python Research Scala Spark SQL Statistics Testing

Perks/benefits: Career development Flat hierarchy

Region: Asia/Pacific
Country: India
Category: Engineering Jobs
