Senior Data Engineer
Poland
Exadel
Advance your business through technology and pave the way to becoming a digital leader with Exadel, an enterprise software development and consulting company. We are looking for an experienced Senior Data Engineer to join our team of professionals!
Data engineers work in various settings to build systems that collect, manage, and convert raw data into usable information for data scientists and business analysts to interpret. Their ultimate goal is to make data accessible so that organizations can use it to evaluate and optimize their performance. They also develop technical tools and services that enable large-scale machine learning solutions. As a Data Engineer, you will focus on value creation and growth by making data accessible to the organization, enabling excellent customer experience and business results.
Work at Exadel - Who We Are:
Since 1998, Exadel has been engineering its own software products and custom software for clients of all sizes. Headquartered in Walnut Creek, California, Exadel currently has 2,700+ employees in development centers across America, Europe, and Asia. Our people drive Exadel's success and are at the core of our values, making Exadel a people-first company.
About the Customer:
The customer is one of the largest international clothing-retail companies known for its fast-fashion clothing. The business concept is to offer fashion and quality at the best price in a sustainable way.
Project Team:
When you join our team, you'll be immersed in a culture where teammates always help each other achieve better results. We believe that together we are greater and can find brilliant solutions by sharing ideas.
Requirements:
- Knowledge of Data Manipulation, ETL
- Advanced knowledge of and coding skills in Python and/or SQL
- Proficiency with cloud computing and experience developing cloud solutions with Spark and Hadoop
- Hands-on experience working with development and deployment tools (CI/CD): Git, Jenkins, PowerShell/Bash
- Familiarity with Azure Data Factory
- General understanding of blockchain, IoT, API architecture, and end-user computing
- Basic understanding of Machine Learning
- Experience with Microservices architecture
- Knowledge of Agile principles and methods, with practical experience applying them
- Strong customer and consumer focus
- Ability to establish clear goals and responsibilities to achieve a high level of performance
- Ability to proactively evaluate options and solve problems in innovative ways
- Ability to develop new solutions or combine existing methods into new approaches
- Comfort working with external product teams to establish optimal data integration patterns and solutions
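As a rough illustration of the data manipulation and ETL skills listed above (hypothetical record and table names, with SQLite standing in for a real warehouse target), a minimal extract-transform-load flow in Python might look like:

```python
import sqlite3

# Hypothetical raw records, standing in for an extract from an API or file feed
raw_orders = [
    {"order_id": "1001", "amount": "249.99", "country": "PL"},
    {"order_id": "1002", "amount": "  19.50", "country": "pl"},
    {"order_id": "1003", "amount": "", "country": "DE"},  # missing amount
]

def transform(records):
    """Clean and type-convert records, dropping rows with no amount."""
    cleaned = []
    for r in records:
        amount = r["amount"].strip()
        if not amount:
            continue  # a real pipeline would quarantine incomplete rows, not silently drop them
        cleaned.append({
            "order_id": int(r["order_id"]),
            "amount": float(amount),
            "country": r["country"].strip().upper(),
        })
    return cleaned

def load(records, conn):
    """Load cleaned records into the target table (idempotent via INSERT OR REPLACE)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id INTEGER PRIMARY KEY, amount REAL, country TEXT)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO orders VALUES (:order_id, :amount, :country)",
        records,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(raw_orders), conn)
rows = conn.execute(
    "SELECT order_id, amount, country FROM orders ORDER BY order_id"
).fetchall()
```

In a production setting the same extract/transform/load shape would typically run on Spark DataFrames and be orchestrated by a tool such as Azure Data Factory, as mentioned in the requirements.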
Nice to have:
- Knowledge of Kubernetes and Docker
- Knowledge of GCP
- Knowledge of Architecture and Design principles
- Domain knowledge in network management and hosting, user experience, or business resilience
- Familiarity with SAP
- Data Science and Analytics experience
English level:
Intermediate+
Responsibilities:
- Design, develop, and build real-time data pipelines from various sources (streaming data, APIs, data warehouses, data mesh, messages, etc.)
- Manage project delivery, planning, steering, and risk to ensure that the delivery is effective and anchored to value
- Handle product development and management, as well as product risk and issue management
- Assist with application deployment and provide support when needed
- Leverage an understanding of software architecture, the software development lifecycle, and software design patterns to write scalable, maintainable, well-designed, and future-proof software
- Manage existing pipelines and create new pipelines from various sources (relational, XML, etc.)
- Propose and implement solutions for data pipeline stabilization and data quality checks
- Coordinate with other teams to design optimal patterns for data ingest and egress, and lead data quality initiatives and troubleshooting
- Design and build solutions to track data quality, stabilize data pipelines, and otherwise ensure reliable operations
- Ensure best practices are followed across architecture, codebase, and configuration
- Manage on-premise hardware
- Ensure assurance, audit, compliance, and testing related to cyber security issues
- Implement and manage secure systems
- Handle security incident management, investigation, and digital forensics
- Handle security governance and management, as well as threat assessment and information risk management
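The data quality checks mentioned in the responsibilities above could be sketched roughly as follows (hypothetical metrics, field names, and thresholds, not the customer's actual checks):

```python
# A minimal pre-publish data-quality gate: flag excessive nulls and duplicate keys
def check_quality(records, required_fields, max_null_rate=0.1):
    """Return a dict with a pass/fail verdict and a list of detected issues."""
    issues = []
    total = len(records)
    if total == 0:
        return {"passed": False, "issues": ["empty batch"]}
    for field in required_fields:
        nulls = sum(1 for r in records if r.get(field) in (None, ""))
        null_rate = nulls / total
        if null_rate > max_null_rate:
            issues.append(
                f"{field}: null rate {null_rate:.0%} exceeds {max_null_rate:.0%}"
            )
    ids = [r.get("order_id") for r in records]
    if len(ids) != len(set(ids)):
        issues.append("duplicate order_id values")
    return {"passed": not issues, "issues": issues}

# Example batch with one missing amount and one duplicated key
batch = [
    {"order_id": 1, "amount": 10.0},
    {"order_id": 2, "amount": None},
    {"order_id": 2, "amount": 5.0},
]
report = check_quality(batch, required_fields=["order_id", "amount"])
```

A check like this would typically run as a pipeline step, failing the run (or routing the batch to quarantine) when `report["passed"]` is false.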
Advantages of Working with Exadel:
- You can build your expertise with our Client Engagement team, who provide assistance with existing and potential projects
- You can join any Exadel Community or create your own to communicate with like-minded colleagues
- You can participate in continuing education as a mentor or speaker; mentoring is rewarded both emotionally and financially
- You can take part in internal and external meetups as a speaker or listener. We support you in broadening your horizons and encourage knowledge sharing for all of our employees
- You can learn English with the support of native speakers
- You can take part in cultural, sporting, charity, and entertainment events
- Working at Exadel means always upgrading your skills and proficiency, so we provide plenty of opportunities for professional development. If you’re looking for a challenge that will lead you to the next level of your career, you’ve found the right place
- We work hard to ensure honest and open relations between employees and leadership, so our offices are friendly environments