Middle/Senior Data Engineer
Sofia, Bulgaria
Devexperts
Devexperts provides financial software and services for brokerages, exchanges and wealth management firms in equity, options, futures and Forex markets.
Company Description
Devexperts has spent nearly two decades consulting and developing software for the financial industry. We solve complex technological challenges for some of the most well-respected financial institutions worldwide.
By becoming a part of Devexperts, you’ll become a part of a company that fosters self-improvement and actively seeks out-of-the-box ideas. Our teams work together to create the next generation of financial software solutions. We welcome all candidates who believe, as we do, that innovation is grounded in education.
Job Description
We are looking for a Middle/Senior Data Engineer with a Java / Scala / Python background to join a project for a top-5 US retail broker (by number of users). The project covers trading experience, financial reporting, and risk management.
You will join a cross-functional team that excels at taking features from zero to production.
Key responsibilities:
1. Data Pipeline Development:
- Design, develop, and maintain robust data pipelines using Java within AWS infrastructure.
- Implement scalable solutions for data analysis and transformation using Apache Spark and PySpark.
- Utilise Airflow for efficient workflow orchestration in complex data processing tasks.
- Ensure fast and interactive querying capabilities through the use of Presto.
2. Infrastructure Management:
- Containerise applications using Docker for streamlined deployment and scaling.
- Orchestrate and manage containers effectively with Kubernetes in production environments.
- Implement infrastructure as code using Terraform for provisioning and managing AWS resources.
3. Collaboration and Communication:
- Collaborate with cross-functional teams to understand data requirements and architect scalable solutions aligned with business goals.
- Ensure data quality and reliability through robust testing methodologies and monitoring solutions.
- Stay updated with emerging technologies and industry trends to continuously enhance the data engineering ecosystem.
Qualifications
1. Education and Experience:
- Bachelor's degree in Computer Science, Engineering, or related field.
- Minimum 4 years of hands-on experience in Java / Scala / Python development, emphasising object-oriented principles.
2. Technical Proficiency:
- Proficient in Apache Spark or PySpark for large-scale data processing.
- Experience with Airflow for workflow orchestration in production environments.
- Familiarity with Docker for containerisation and Kubernetes for container orchestration.
- Knowledge of Terraform for infrastructure as code implementation in AWS environments.
- Experience managing AWS services such as S3, EMR, Glue, Athena, and Redshift.
- Strong background in SQL and relational databases, with proficiency in technologies like Postgres.
- Experience with streaming platforms such as Kafka for real-time data processing is a plus.
3. Communication Skills:
- Excellent English language communication skills, both verbal and written.
- Ability to collaborate effectively with technical and non-technical stakeholders.
Additional Information
- Paid vacation: 20 + 5 days
- Free MultiSport card
- Medical insurance – premium package
- Modern office space
- Panoramic view of Vitosha mountain
- Gym & billiard in the office
- Parking spot or public transport card
- Mentorship program
- Training, courses, workshops
- Paid pro certifications
- Subscriptions to pro sources
- Participation in conferences
- English courses
- Trading contest within the company
- Tech meetup dxTechTalk
- Speaker's club
- Opportunity to develop your personal brand as a speaker
- Internal referral program
- Remote work / Hybrid mode
- Flexible schedule
- Work & Travel program
- Relocation opportunities