AWS Data Engineer

London, United Kingdom


PA Consulting


Company Description

We believe in the power of ingenuity to build a positive human future in a technology-driven world.

As strategies, technologies and innovation collide, we create opportunity from complexity.

Our diverse teams of experts combine innovative thinking and breakthrough use of technologies to progress further, faster. Our clients adapt and transform, and together we achieve enduring results.

An innovation and transformation consultancy, we are over 3,200 specialists in consumer, defence and security, energy and utilities, financial services, government, health and life sciences, manufacturing, and transport. Our people are strategists, innovators, designers, consultants, digital experts, scientists, engineers and technologists. We operate globally from offices across the UK, US, Europe, and the Nordics.

PA. Bringing Ingenuity to Life.

 

Job Description

As a team, we transform our clients’ businesses using a combination of strategic thinking, customer-centric service design, and agile engineering practices. We do this at internet scale – driving innovation and enriching people’s lives. Our skills and talent enable PA’s purpose of creating a positive human future in a technology-driven world.

As part of our Digital team, you’ll work alongside colleagues from across PA – delivering transformative digital solutions to today’s most complex business challenges. Our teams are trusted to deliver and given the space to be awesome. We’re an inclusive community for the curious, generous, pragmatic and committed digital practitioner. 

Qualifications

AWS is a significant growth area for us, with a diverse and growing capability, and we are looking for a data engineer experienced in AWS cloud technologies for designing and building ETL pipelines, data warehouses and data lakes, and for data movement. There are a variety of tools, cloud technologies and approaches; while we have a preference for AWS tooling experience, open-source equivalents will also be suitable.

As an AWS Data Engineer, you’ll have experience working in teams to design, build and maintain large-scale data solutions and applications using AWS data and analytics services (or open-source equivalents) such as EMR, Glue, Redshift, Kinesis, Lambda and DynamoDB. Your team members will look to you as a trusted expert and will expect you to define the end-to-end software development lifecycle in line with modern AWS best practices. Your AWS experience will span data engineering, data science and product development projects, and you will have an understanding of both stream and batch processing.

You’ll have experience in the design and deployment of production data pipelines, from ingestion to consumption, within a big data architecture, using Java, Python, Scala, Spark and SQL. You’ll perform tasks such as writing scripts, extracting data via APIs and writing SQL queries, and you’ll work closely with other engineering teams to integrate data engineering components into production systems. You’ll bring knowledge of data cleaning, wrangling, visualisation and reporting, with an understanding of the most efficient use of the associated tools and applications. Ability to travel to client sites, where required, will be a consideration.

You’ll have experience processing large amounts of structured and unstructured data, including integrating data from multiple sources through ingestion and curation functions on the AWS cloud using AWS-native or custom programming. Knowledge of data mining, machine learning and natural language processing is an advantage. You enjoy working within cross-functional Agile teams and you are familiar with Scrum ceremonies.

You’ll be comfortable designing and building for the AWS cloud, and will have designed and worked on architectures that include Platform-as-a-Service components, and perhaps even serverless and container technologies.

Additional Information

• Private medical insurance
• Interest free season ticket loan
• 25 days annual leave with the opportunity to buy 5 additional days
• Company pension scheme
• Annual performance-based bonus
• Life and Income protection insurance
• Tax efficient benefits (cycle to work, give as you earn)
• Voluntary benefits (Dental, critical illness, spouse/partner life assurance)

 

PA is committed to building an inclusive and supportive culture where diversity thrives, and all of our people can excel. We believe that greater diversity stimulates innovation, enabling us to fulfil our purpose of ‘Bringing Ingenuity to Life’, supporting the growth of our people, and delivering more enduring results for our clients.

We only recruit, promote and reward our people based on their contribution, without regard to gender, race, disability, religion, nationality, ethnicity, sexual orientation, age or marital status.

We welcome international applications, but we are unable to offer sponsorship for work permits, so you will need to have the full right to live and work in the UK. Unfortunately, your application will be automatically rejected if you do not have these rights.
