Software Engineer (Scala) - Data&AI
Poznań, Warszawa, Kraków, Poland
Company Description
The salary range for this position is (contract of employment):
mid: 12 300 - 17 600 PLN in gross terms
senior: 16 100 - 23 200 PLN in gross terms
A hybrid work model, with arrangements agreed on by the leader and the team.
As part of the Data & AI area, we deliver practical data science and artificial intelligence projects on a scale unprecedented in Poland. Data & AI is a group of over 100 experienced BigData engineers organized into over a dozen teams with various specializations. Some of them build dedicated tools for creating and running BigData processes or deploying ML models for the entire organization. Others work closer to the customer and are responsible for the search engine, recommendations, buyer profiles and an experimentation platform. The area also includes research teams whose aim is to solve non-trivial problems requiring machine learning.
We are looking for BigData engineers who want to build a data platform and solutions for millions of Allegro customers, e.g.:
- A system that materializes events received from microservices into a form from which tables can be built with Big Data tools. Our daily challenges include the scale of traffic, structuring information, and stream-based synchronization of datasets between on-premises systems and public cloud environments
- Metadata aggregation processes integrated with Allegro's data platform, supporting tools and processes in the areas of Data Quality, Data Lineage and Data Governance
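The first project above can be sketched in plain Scala. This is a minimal, hypothetical illustration of "materializing events into table rows": all names (Event, Materializer) are invented for this sketch, and the real system runs on streaming and Big Data infrastructure rather than in-memory collections.

```scala
// Hypothetical sketch: flatten microservice events into column -> value
// rows from which per-event-type tables could be built.
final case class Event(id: String, eventType: String, payload: Map[String, String])

object Materializer {
  // Flatten one event into a row; payload keys are prefixed so they
  // cannot collide with the fixed columns.
  def materialize(e: Event): Map[String, String] =
    Map("id" -> e.id, "event_type" -> e.eventType) ++
      e.payload.map { case (k, v) => s"payload_$k" -> v }

  // Group a batch of events by type, mirroring a table-per-event-type layout.
  def partitionByType(events: Seq[Event]): Map[String, Seq[Map[String, String]]] =
    events.groupBy(_.eventType).view.mapValues(_.map(materialize)).toMap
}
```

In production such a step would run continuously over an event stream and write to table storage; the core transformation, event to row, stays the same shape.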
Job Description
We are looking for people who:
- Have several years of experience programming in JVM-based technologies
- Have experience in Scala or want to develop in this direction
- Have experience in the Big Data ecosystem (Hadoop, Spark, Airflow)
- Have knowledge of GCP (Dataproc, Dataflow, BigQuery, Pub/Sub), which will be an added advantage
- Have ideas and the willingness to get involved in designing solution architecture
- Are looking for effective solutions to the requirements set by users
- Are interested in a variety of tools and in matching the right tool to the problem
- Maintain good programming practices
- Like to share their knowledge and experience with others
- Want to constantly develop and update their knowledge
- Know English at B2 level
What we offer:
- A hybrid work model that you will agree on with your leader and the team. We have well-located offices (with fully equipped kitchens and bicycle parking facilities) and excellent working tools (height-adjustable desks, interactive conference rooms)
- Annual bonus of up to 10% of the gross annual salary (depending on your annual assessment and the company's results)
- A wide selection of fringe benefits in a cafeteria plan – you choose what you like (e.g. medical, sports or lunch packages, insurance, purchase vouchers)
- English classes, paid for by us, related to the specific nature of your job
- Laptop with an M1 processor, 32GB RAM and SSD - a 16" or 14" MacBook Pro or a corresponding Dell with Windows (if you don't like Macs), two monitors and any other gadgets you might need
- Working in a team you can always count on — we have on board top-class specialists and experts in their areas of expertise
- A high degree of autonomy in terms of organizing your team’s work; we encourage you to develop continuously and try out new things
- Hackathons, team tourism, training budget and an internal educational platform, MindUp (including training courses on work organization, means of communications, motivation to work and various technologies and subject-matter issues)
- If you want to learn more, check it out
Why is it worth working with us?
- The IT team is made up of over 1700 members who have shared their knowledge at multiple conferences, such as Devoxx, Geecon and Confitura, and who co-create the allegro.tech blog
- Microservices – a few thousand microservices and 1.8m+ rps on our business data bus
- Big Data – several petabytes of data and Machine Learning used in production
- We practice Code Review, Continuous Integration, Scrum/Kanban, Domain Driven Design, Test Driven Development, Pair Programming, depending on the team
- Our internal ecosystem is based on self-service and widely used tools, such as Kubernetes, Docker, Consul, GitHub or GitHub Actions. This will allow you, from day one, to develop software using any language, architecture and scale, restricted only by your creativity and imagination.
- To match the scale, we also focus on building entire Platforms of tools and technologies that accelerate and facilitate day-to-day development, and we ensure the best Developer Experience for our teams
- Technological autonomy: you get to choose which technology solves the problem at hand (no need for management’s consent). You are responsible for what you create
- Our deployment environment combines private Data Centers (tens of thousands of servers) and Public Clouds (Google Cloud and Microsoft Azure)
- Over 100 original open source projects and a few thousand stars on GitHub
- We organize Allegro Tech Live event, a 100% remote version of our offline Allegro Tech Talks meetups, and we make guest appearances at the invitation of such communities as Warsaw AI, JUG (Poznań, Łódź, Lublin, Wrocław), WG .Net, Dare IT, Women in Tech Summit
- We focus on development as well. We organize hackathons and internal conferences (e.g. the annual Allegro Tech Meeting), our employees regularly participate in events both in Poland and abroad (Europe and USA), and each team has its own budget for training and study aids. If you want to keep growing and share your knowledge, we will always support you
This may also be of interest to you:
Allegro Tech Podcast → https://podcast.allegro.tech/
Booklet → https://allegro.tech/booklet.pdf
Send in your CV and see why it is #dobrzetubyć