Cloud Data Engineer (Infrastructure) - Data Engineering Team

United Kingdom

Applications have closed

Yapily

Connect to thousands of banks across 19 countries in minutes through Yapily's open banking API. Access data, process payments, and much more. What will you build?


🌍 Our Vision

To be the central nervous system of the global economy

🚀 Our Mission

Become the universal key that enables the true potential of Open Finance

⚖️ Why We Exist

To enable fairer & better financial services for everyone

What we’re looking for

We are currently looking for a Cloud Data Engineer to join our Data Engineering team. Focusing on how we store, process, and secure data for the rest of the business, the successful candidate will lead our efforts to ensure the quality and integrity of all the company's data.

Reports to

Data Tech Lead

Who are Yapily

We’re a young company and we’re growing fast. We have offices in London and Vilnius, and over 165 talented individuals joining us from their homes and coworking spaces around the world. We’re wonderfully diverse, with 27 nationalities - and counting! We continue to expand our global reach as we build the best infrastructure for our customers.

We attract people who are excited to work in a nascent and disruptive industry that will change the way we interact with financial services, forever. We’re building the infrastructure to power the global economy.

We are looking for people who don’t always know all the answers but can use their skill and judgement to work beautifully within the team to create better solutions for our customers. So whichever discipline you choose - Engineering, Product, Commercial, or Operations - we hire people from many different backgrounds who want to make a difference and can roll up their sleeves to get the job done.

Responsibilities include, but are not limited to:

  • Designing data processing solutions on the cloud.
  • Choosing appropriate storage technologies based on business requirements such as retention, availability, and consistency.
  • Keeping data separated based on data residency requirements.
  • Developing solutions across multiple cloud vendors.
  • Designing solutions based on security and compliance needs.
  • Choosing database solutions based on the problem and its characteristics.
  • Designing solutions that utilise IAM, data encryption, and key management, and that respect compliance and privacy requirements.
  • Creating and maintaining data pipelines.
  • Implementing ETL processes that combine data from a variety of data sources.
  • Monitoring and alerting on existing datastores and databases.
  • Managing and upgrading databases and related technologies as and when necessary.

Requirements

We would love you to have:

  • PostgreSQL / MS SQL, InfluxDB, Redis
  • ETL
  • Google Pub/Sub, Apache Kafka, AWS Kinesis
  • Knowledge of cloud infrastructure, especially GCP and AWS
  • Advanced knowledge of monitoring tools such as GCP Monitoring, AWS CloudWatch, and Datadog
  • Proficiency in database deployments and data migrations
  • Data warehousing solutions such as Redshift, BigQuery, and Snowflake
  • Data processing tools such as Apache Flink, Kinesis Firehose, Google Dataflow, and Dataproc
  • Understanding of machine learning platforms such as Kubeflow and SageMaker
  • Data workflow orchestration services such as Composer
  • Candidates with certifications such as Google Data Engineer and AWS Certified Big Data will be strongly considered
  • Experience in a cloud-native fintech and working closely with development teams (development experience is a big plus but not necessary for the role)
Interview Process

The application process consists of a 30-minute phone interview with one of our Talent Acquisition Partners, followed by video interviews with the hiring manager and various team members. There will also be a Strengths & Values conversation, which will give you an opportunity to learn more about our culture and for us to understand how you like to work. You might also meet with a member of the management team and/or complete a short technical test, depending on the role you are applying for. All interviews are currently held via Zoom or Google Meet. We continue to work in a hybrid approach, with a mix of working from home and the office.

About Yapily

Yapily is here to power a new era of financial services so that everyone can receive faster, more affordable, and personalised financial products. We connect to thousands of banks using a secure open API, enabling innovative companies to embed the power of open banking into their products. We provide infrastructure for industry leaders including American Express, Intuit Quickbooks, Vivid, Moneyfarm and BUX.

Benefits


  • Competitive salary in a rapidly growing scale up
  • 25 days holiday (UK)
  • Generous level of stock options
  • Company learning budget for seminars, courses, books, etc.
  • Pension with employer contribution
  • BUPA Healthcare
  • Season Ticket Loan
  • Flexible working options (open to discussions on work location, flexible and part time hours)
  • Contribution to remote desk setup
  • Mac products
  • Cycle to work scheme
  • Diverse high performing team environment

