Data Scientist, Pricing - Remote Working (UK)
Remote - London, England, United Kingdom
Applications have closed
We are Zego - a commercial motor insurance provider that powers opportunities for businesses, from entire fleets of vehicles to self-employed drivers and riders. We combine best-in-class technology with sophisticated data sources to offer insurance products that save businesses time and money.
Since our inception, we have believed that the problem with traditional insurance is that it holds businesses back. It’s too expensive and time consuming, and it no longer suits businesses who use vehicles to earn money. Our products represent a solution to this problem for businesses based across the UK, Europe and beyond.
So far, we have raised over $200 million in funding and we were the first UK insurtech to be valued at over $1 billion. We were also the first to be awarded our own insurance license and recently won Tech Company of the Year 2020.
At Zego, we are proud to say we have a diverse and inclusive team, unified by our shared values and mission. Our people are the most important part of our story and everybody at Zego, no matter their role, has an integral part to play.
Overview of the Data Science team
The mission of the Data Science team at Zego is to leverage novel data sources and modern computational tools to deliver value through data-driven insights, personalised pricing and insurance product innovation. It is a growing team, responsible for all aspects of working with data, including data sourcing, cleaning, extraction, problem formulation, implementing and testing models, productionising exciting new tools and algorithms, and designing scalable tools, practices and systems to support data science analysis and deployment. The team partners closely with Product, Pricing, Engineering and wider Commercial and Business functions to find the best ways to incorporate statistical models and evidence-based decision making into Zego’s products and processes.
Purpose of the role
The Data Science team (Pricing R&D squad) is looking for an ambitious Data Scientist to work on the development of new insurance pricing models, leveraging novel sources of behavioural data and state-of-the-art machine learning techniques. This is a dynamic role where you will work closely with Data Scientists, Machine Learning Engineers and Actuaries to improve Zego’s risk identification and selection capabilities. You will have an integral role in researching, testing and deploying new pricing models and powering Zego’s pricing strategies through data-driven insights and evidence-based decision-making.
Sitting at the heart of Zego’s industry-leading data team, you will have the full support of Zego’s exceptional engineering, data and pricing teams to develop processes and tooling for feature engineering, model training, deployment, impact assessment, price simulation, and performance monitoring. You will work on projects involving data acquisition, signal processing, behavioural modelling, actuarial risk assessment, productionisation and post-deployment analytics, giving you a wide range of exposure and plenty of chances to develop as a data professional.
The role is suited to a candidate with existing data science experience, ideally in the areas of signal processing, insurance modelling, predictive analytics, pricing and risk management. You will have proven skills using a programming language such as Python (or R), skill with SQL, and an understanding of good development practices (git, code review, documentation). You will be given the support and backing you need to develop your skills, including on the job training, L&D budget, and a personal career development plan.
About the role
- You will dive into the heart of Zego pricing R&D efforts and help to build market leading pricing analytics and risk prediction capabilities.
- You will work closely with the Data Science team to design, develop, deploy and maintain industry leading pricing models and technology.
- You will use modern tools, methods and techniques to develop advanced algorithms that make smart, instantaneous decisions and promote automation across different areas.
- You will work with diverse datasets of many types, including claims, driver behaviour, shift and trip information, traffic, weather, telemetry, geospatial data, and web and smartphone sensors.
- You will conduct pre- and post-deployment impact analysis to quantify the success of new strategies and pricing implementations.
- You will contribute to improving Zego’s understanding of how driver behaviour relates to actuarial risk, and how Zego can leverage those insights to disrupt the way the industry currently operates.
- Your models will have an outsized impact in optimising the balance between profitability, risk and growth.
- You will define and improve rigorous reproducible research practices to maintain excellence in all you do.
- You will champion a data-driven approach as you partner with stakeholders at different levels of seniority, demonstrating your strengths as an outstanding individual contributor and building towards more senior roles.
- You will constantly generate new research ideas, building a case and presenting your findings to propel capability forward.
- You will work with a diverse team of collaborative data professionals who enjoy technical discussions, learning new methodologies, and helping each other improve through feedback and shared research tools.
About you
- You have a BSc/BA in Mathematics, Statistics, Engineering, Computer Science or a similarly quantitative discipline.
- You have 2+ years of experience as a data scientist, preferably in pricing, risk or insurance related roles.
- You are highly proficient at working with and querying structured data using SQL.
- You are highly experienced in Python (or R) applied to data analysis and predictive modelling.
- You are well-versed in modern machine learning tools and libraries such as Scikit-learn, Tensorflow, Caffe2, PyTorch and Theano.
- You are familiar with common statistical tools and methods employed in insurance analysis, such as generalised linear models (GLMs), classification and regression trees (CARTs), ensembling methods, and clustering approaches as well as demonstrable knowledge in Bayesian statistics.
- You have experience with, and enthusiasm for, reproducible research techniques including version control (e.g. Git and GitHub).
- You have an outstanding eye for detail and take care to make sure that your work is accurate and effective.
- You have the ability to present complex information in an easily digestible format, appropriate for the audience.
- You are a lifelong learner; as technologies and techniques are constantly evolving, you are proactively pushing yourself to develop your skills and knowledge.
- You are resilient to change and use a holistic view of the business to ensure your goals are aligned and your work has impact.
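To give a flavour of the modelling this role involves, here is a minimal, illustrative sketch of a claims-frequency GLM of the kind the requirements above mention. Everything here is hypothetical: the data is synthetic and the feature names (weekly miles, harsh-braking rate) are invented for illustration, not taken from Zego's actual pricing models.

```python
# Hedged sketch: a Poisson GLM for claims frequency, using only
# scikit-learn and NumPy. Data and feature names are synthetic.
import numpy as np
from sklearn.linear_model import PoissonRegressor

rng = np.random.default_rng(42)
n = 1000

# Two hypothetical behavioural features per policy
weekly_miles = rng.gamma(shape=2.0, scale=5.0, size=n)   # in hundreds of miles
harsh_braking = rng.beta(2, 8, size=n)                   # harsh-braking rate
X = np.column_stack([weekly_miles, harsh_braking])

# Synthetic claim counts whose expected rate rises with both features
lam = np.exp(-2.0 + 0.05 * weekly_miles + 1.5 * harsh_braking)
y = rng.poisson(lam)

# Fit the frequency model; alpha is a small L2 penalty
model = PoissonRegressor(alpha=1e-4)
model.fit(X, y)

# Predicted expected claims per policy would feed into a pure premium
expected_claims = model.predict(X)
print(round(float(expected_claims.mean()), 3))
```

In practice a frequency model like this would be paired with a severity model, richer feature engineering, and careful out-of-time validation before it informed any price.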
Bonus points:
- You have a graduate degree in a highly quantitative discipline such as Economics, Applied Mathematics, Physics or Computer Science.
- You have previous experience in a fast-growing scale-up.
- You have previous experience in the insurance industry.
- You have experience with large-scale analytics on the AWS platform.
Perks/benefits: Career development