Data Modeler - Technical Platform
Poznań, Poland
Job Description
The Technical Platform Teams provide the foundations for the structure and operations of the entire Allegro platform. This is where we build tools, development and infrastructure stacks and automation systems used by more than a hundred Allegro product teams and by the customer support department. We develop the full stack, starting from our own server room and private cloud, through containerization, mobile apps and application software to tools which offer AI support to our customers. At the same time, we are responsible for the performance, availability and security of the entire platform.
Why should you work with us?
- ~800 databases in production
- ~2500 data nodes
- Total database traffic: ~1 million ops
- ~300 TB of data
- We take an innovative approach to running databases on Kubernetes
- Two datacenters, Kubernetes, bare metal and cloud
We are looking for people who:
- Have experience in designing models for relational or non-relational databases
- Know query tuning methods, can profile them, and choose optimal indexes
- Are not afraid of working with large data sets, know the difference between sharding and partitioning
- Can demonstrate experience with at least one SQL or NoSQL database
- Are willing to learn new things and spread this knowledge to other engineers
- Know English at B2 level
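The sharding-vs-partitioning distinction mentioned above can be summed up in a minimal sketch (hypothetical names and routing rules, not Allegro code): partitioning splits one database's data into segments managed together on the same system, while sharding distributes data across independent nodes.

```python
from datetime import date
from zlib import crc32

# Partitioning: route a row to a partition *within the same database*,
# here by year (range partitioning on an order date).
def partition_for(order_date: date) -> str:
    return f"orders_p{order_date.year}"

# Sharding: route a row to one of several *independent database nodes*,
# here by hashing the customer id (hash sharding).
SHARDS = ["db-node-0", "db-node-1", "db-node-2"]

def shard_for(customer_id: str) -> str:
    return SHARDS[crc32(customer_id.encode()) % len(SHARDS)]

print(partition_for(date(2023, 5, 1)))  # orders_p2023
print(shard_for("customer-42"))         # one of the three shard nodes
```

In practice the two are often combined: each shard may itself be partitioned, and the hard design questions are choosing the shard key and handling queries that span shards.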
In your daily work you will:
- Support development teams in solving problems at the interface between applications and databases, and in selecting appropriate solutions throughout the Allegro product development cycle
- Model data, helping to design effective databases
- Analyze long response times, select optimal indexes, and tune queries
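The index-selection and query-tuning work described above usually starts with reading a query plan. A minimal sketch of the idea (using SQLite for portability; any real engine has its own `EXPLAIN` output):

```python
import sqlite3

# Hypothetical example table: how adding an index changes a query plan.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id TEXT, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(f"c{i % 100}", i * 1.5) for i in range(1000)],
)

query = "SELECT total FROM orders WHERE customer_id = 'c7'"

def plan(sql: str) -> list[str]:
    # The last column of EXPLAIN QUERY PLAN output describes each step.
    return [row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql)]

print(plan(query))  # without an index: a full table scan

conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
print(plan(query))  # with the index: an index search instead of a scan
```

The same workflow applies at scale: profile the slow query, inspect the plan, and add or adjust indexes until the engine stops scanning.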
What we offer:
- A hybrid work model that you will agree on with your leader and the team. We have well-located offices (with fully equipped kitchens and bicycle parking facilities) and excellent working tools (height-adjustable desks, interactive conference rooms)
- An annual bonus of up to 10% of your gross annual salary (depending on your annual assessment and the company's results)
- A wide selection of fringe benefits in a cafeteria plan – you choose what you like (e.g. medical, sports or lunch packages, insurance, purchase vouchers)
- English classes, paid for by us and tailored to the specific nature of your job
- A laptop with an M1 processor, 32 GB RAM and an SSD: a 16” or 14” MacBook Pro, or a corresponding Dell with Windows (if you don’t like Macs), plus two monitors and any other gadgets you may need
- Working in a team you can always count on — we have on board top-class specialists and experts in their areas of expertise
- A high degree of autonomy in terms of organizing your team’s work; we encourage you to develop continuously and try out new things
- Hackathons, team tourism, training budget and an internal educational platform, MindUp (including training courses on work organization, means of communications, motivation to work and various technologies and subject-matter issues)
- If you want to learn more, check it out
Why is it worth working with us?
- The IT team is made up of over 1700 members who have shared their knowledge at multiple conferences, such as Devoxx, Geecon and Confitura, and who co-create the allegro.tech blog
- Microservices – a few thousand microservices and 1.8m+ rps on our business data bus
- Big Data – several petabytes of data and Machine Learning used in production
- We practice Code Review, Continuous Integration, Scrum/Kanban, Domain Driven Design, Test Driven Development, Pair Programming, depending on the team
- Our internal ecosystem is based on self-service and widely used tools, such as Kubernetes, Docker, Consul, GitHub or GitHub Actions. This will allow you, from day one, to develop software using any language, architecture and scale, restricted only by your creativity and imagination.
- To match the scale, we also focus on building entire Platforms of tools and technologies that accelerate and facilitate day-to-day development, and we ensure the best Developer Experience to our teams
- Technological autonomy: you get to choose which technology solves the problem at hand (no need for management’s consent). You are responsible for what you create
- Our deployment environment combines private Data Centers (tens of thousands of servers) and Public Clouds (Google Cloud and Microsoft Azure)
- Over 100 original open source projects and a few thousand stars on GitHub
- We organize Allegro Tech Live event, a 100% remote version of our offline Allegro Tech Talks meetups, and we make guest appearances at the invitation of such communities as Warsaw AI, JUG (Poznań, Łódź, Lublin, Wrocław), WG .Net, Dare IT, Women in Tech Summit
- We focus on development as well. We organize hackathons and internal conferences (e.g. the annual Allegro Tech Meeting), our employees regularly participate in events both in Poland and abroad (Europe and USA), and each team has its own budget for training and study aids. If you want to keep growing and share your knowledge, we will always support you
This may also be of interest to you:
Allegro Tech Podcast → https://podcast.allegro.tech/
Booklet → https://allegro.tech/booklet.pdf
Send in your CV and see why it is #dobrzetubyć (#goodtobehere)
Tags: Architecture Azure Big Data Docker GCP GitHub Google Cloud Kanban Kubernetes Machine Learning Microservices NoSQL Open Source RDBMS Scrum Security SQL
Perks/benefits: Career development Conferences Gear Health care Lunch / meals Salary bonus Team events