Data Engineer

Remote - London, England, United Kingdom

Applications have closed

Are we looking for you?

As a Data professional, you have ambitions to work on transformational projects, building big data solutions and designing large-scale data processing systems, growing as a true consultant. You'll enjoy partnering with enterprise-level customers on how to ingest, store, process, analyse, explore and visualise data on Public Cloud platforms.

Complex business problems excite you, and you want to use Big Data, AutoML, ML and visualisation technologies (among others) to solve them.


Are you looking for us?

Appsbroker is Google’s largest Premier Partner in Europe and an MSP. We deliver leading-edge cloud services to some of the world’s most exciting brands, in sectors ranging from retail through automotive to finance.

We expose you to some of the most exciting Cloud transformation and optimisation projects utilising Google Cloud (GCP). Full training and certification on GCP are provided as part of an extensive onboarding programme, even if your background is more AWS/Azure.

Day to Day

Appsbroker is dynamic and the role is varied (there is no 'typical day'), so you'll get involved in a wide range of activities (more production than POC). Here is what to expect:

  • Identify, scope and participate in the design and delivery of cloud data platform solutions, data storage, movement and orchestration solutions utilising GCP
  • Design and execute a platform modernisation approach for customers’ data environments
  • Design, coordinate and execute pilots, prototypes or proof of concepts, provide validation on specific scenarios and provide deployment guidance
  • Collaborate and share best practices/insights with engineering colleagues and the architect community
  • Travel to client sites as appropriate (when restrictions allow)

Requirements

What skills we look for…


  • Spark, Hadoop, GCP (BigQuery/Dataproc), Python and SQL
  • Articulate communication skills: the ability to explain complex solutions to external or internal customers clearly and concisely, and to adjust your style to a varied audience within an organisation.
  • Exposure to data science modelling, analytics & BI reporting (Tableau, Looker, Periscope, DataStudio).
  • History of working with data warehouse solutions (on-premises & Public Cloud).
  • Analytical and design skills around the full end-to-end ETL lifecycle of data pipelines in large enterprise settings
  • Practical experience of data curation, cleansing, creation and maintenance of data sets for business purposes

Benefits

  • 25 days holiday
  • Google Certification Path
  • 4 x salary life assurance
  • Contributory pension
  • Opportunity to join our Vitality Health Insurance scheme
  • Flexible working and work from home
  • Electric car lease scheme
  • Private Healthcare Plan
  • Employee Assistance Programme
