Principal Data Engineer
UK - London
Capco
Capco is a global management and technology consultancy dedicated to the financial services and energy industries.
Principal Data Engineer - Digital Engineering
Why Join Capco?
Capco is a global technology and business consultancy, focused on the financial services sector. We are passionate about helping our clients succeed in an ever-changing industry.
You will work on engaging projects with some of the world's largest banks, helping to transform the financial services industry.
We offer:
- A work culture focused on innovation and building lasting value for our clients and employees
- Ongoing learning opportunities to help you acquire new skills or deepen existing expertise
- A flat, non-hierarchical structure that will enable you to work with senior partners and directly with clients
- A diverse, inclusive, meritocratic culture
- Enhanced and competitive family friendly benefits, including maternity / adoption / shared parental leave and paid leave for sickness, pregnancy loss, fertility treatment, menopause, and bereavement
Data Engineering at Capco
You’ll be part of a digital engineering team that develops new financial and data solutions and enhances existing ones, with the opportunity to work on exciting greenfield projects as well as established Tier 1 bank applications used by millions of users.
You’ll be involved in digital and data transformation processes through a continuous delivery model.
You will work on automating and optimising data engineering processes and develop robust, fault-tolerant data solutions for both cloud and on-premises deployments.
You’ll be able to work across different data, cloud and messaging technology stacks.
You’ll have the opportunity to learn and work with specialised data and cloud technologies to widen your skill set.
A day in the life of a Data Engineer at Capco
Working alongside clients to interpret requirements and define industry-leading solutions
Designing and developing robust, well tested data pipelines
Demonstrating and helping clients adhere to best practices in engineering and SDLC
Building event-driven, loosely coupled distributed applications
Developing both on-premises and cloud-based solutions
Supporting internal Capco capabilities by sharing insight, experience and credentials
About you
Capco is looking for hardworking, innovative, and creative people to join our Digital Engineering team.
We'd also like to see:
- Practical experience of engineering best practices, while being obsessed with continuous improvement.
- Deep technical knowledge of two or more technologies and a curiosity for learning other parts of the stack.
- Experience delivering software/technology projects leveraging Agile methodologies
- Valuable personal contributions to products, solutions and teams, and the ability to articulate that value to customers.
- A role in the delivery of business-critical and, ideally, customer-facing applications.
- The ability to communicate complex ideas to non-experts with eloquence and confidence.
- An awareness and understanding of new technologies being used in finance and other industries, and a love of experimentation.
- A passion for being part of the engineering team that is shaping the future of finance.
Skills & Expertise:
You will have experience working with some of the following methodologies and technologies:
- Strong experience with at least one major cloud provider: AWS, Azure or GCP
- Hands-on experience with Scala/Java
- Experience with most of the following data and cloud technologies: Hadoop, Hive, Spark, Pig, Sqoop, Flume, PySpark, Databricks, Cloudera, Airflow, Oozie, S3, Glue, Athena, Terraform, etc.
- Experience with schema design using semi-structured and structured data structures
- Experience with messaging technologies – Kafka, Spark Streaming, Amazon Kinesis
- Good understanding of key security technologies and protocols, e.g. TLS, OAuth and encryption
- Strong experience in SQL
- Good understanding of the differences and trade-offs between SQL and NoSQL, and between ETL and ELT
- Understanding of containerisation, graph databases and ML algorithms
- Experience with data lake formation and data warehousing principles and technologies – BigQuery, Redshift, Snowflake
- Experience using version control tool such as Git
- Experience building CI/CD pipelines with Jenkins or CircleCI
- Enthusiasm and ability to pick up new technologies as needed to solve your problems
- Experience migrating on-premises solutions to a cloud ecosystem.
Joining Capco means joining an organisation that is committed to an inclusive working environment where you’re encouraged to #BeYourselfAtWork. We celebrate individuality and recognize that diversity and inclusion, in all forms, is critical to success. It’s important to us that we recruit and develop as diverse a range of talent as we can and we believe that everyone brings something different to the table – so we’d love to know what makes you different. Such differences may mean we need to make changes to our process to allow you the best possible platform to succeed, and we are happy to cater to any reasonable adjustments you may require. You will find the section to let us know of these at the bottom of your application form or you can mention it directly to your recruiter at any stage and they will be happy to help.