Data Infrastructure Engineer | Data Platform Core (f/m/d) - GER, UK, NL, PL or Remote
Remote job
DeepL
DeepL is Germany's best-known AI company. We develop neural networks to help people work with language. With DeepL Translator, we have created the world's best machine translation system and made it available free of charge to everyone online. Over the next few years, we aim to make DeepL the world's leading language technology company.
Our goal is to overcome language barriers and bring cultures closer together.
What distinguishes us from other companies?
DeepL (formerly Linguee) was founded by developers and researchers. We focus on the development of new, exciting products, which is why we spend a lot of time actively researching the latest topics. We understand the challenges of developing new products and try to meet them with an agile and dynamic way of working. Our work culture is very open because we want our employees to feel comfortable. In our daily work we use modern technologies - not only to translate texts, but also to create the world's best dictionaries, and solve other language problems.
When we tell people about DeepL as an employer, reactions are overwhelmingly positive. Maybe it's because they have enjoyed our services, or maybe they just want to get on board with our quest to break down language barriers and facilitate communication.
Your choice
We are constantly looking for outstanding employees! Currently we offer remote work in Germany, the Netherlands, the UK and Poland. Whether you would like to work from home in one of these countries or from one of our offices in Cologne or Paderborn: the choice is yours. No matter where you choose to work from, our way of working is designed to make you an essential part of the team.
What will you be doing at DeepL?
We are looking for an Infrastructure Engineer to join a team of versatile engineers who craft the libraries, tooling and infrastructure supporting data processing at DeepL. The Data Platform combines different data sources, both internal and external, and makes them available to our stakeholders company-wide: developers, product development, data science and management. You will work in a cross-functional team with Product Managers, Data Scientists, Data Engineers and Developers to solve complex technical challenges.
Requirements
Your responsibilities
- Maintain and scale the infrastructure for storing, processing and streaming data
- Administer the Kafka clusters that play a central role in the company's data delivery
- Administer ClickHouse, our company-wide data warehouse
- Consult other engineering teams directly on designing and deploying their services to bare-metal Kubernetes
- Build and set up monitoring with Prometheus and Grafana to make sure our systems are operating smoothly
- Play your part in setting up backups and load testing for a reliable infrastructure
What we offer
- Meaningful work: We break down language barriers worldwide and bring different cultures closer together
- Data at scale from products used by more than 1 billion people worldwide
- Our own data, analytics, machine learning and experimentation platform - far beyond the limitations of standard web analytics platforms
- Interesting challenges: design and programming at the highest level
- We are a distributed workforce, enabling our employees to work from the comfort of their home office in Germany, Poland, the Netherlands or the UK, or from one of our comfortable offices
- State-of-the-art equipment for your workplace
- A friendly, international and highly committed team with a lot of trust and very short decision-making processes
About you
- Ideally experience with Kafka (or other event brokers) and/or Kubernetes
- Ideally experience with ClickHouse or other data warehouse solutions
- Experience with distributed, data-centric systems that ingest and store massive amounts of data
- Experience in deploying and maintaining infrastructure predominantly used by developers
- You are curious and aren't afraid to interact with low-level systems
- You are fluent in English
Please make sure to send your application in English; we look forward to it!