Data Engineer, ETL/Hadoop Developer
Budapest, Budapest, Hungary
MP Solutions Ltd.
Primary responsibilities
- Design and development of ETL and Hadoop/Snowflake applications.
- Undertaking end-to-end project delivery (from inception to post-implementation support), including review and finalization of business requirements, creation of functional specifications and/or system designs, and ensuring that the end solution meets business needs and expectations.
- Providing deployment support, including late-hour and weekend coverage.
- Development of new transformation processes to load data from source to target, and performance tuning of existing ETL code (mappings, sessions) and the Hadoop/Snowflake platform.
- Analysis of existing designs and interfaces and applying design modifications or enhancements.
- Coding and documenting data processing scripts and stored procedures.
- Providing business insights and analysis findings for ad-hoc data requests.
- Testing software components and complete solutions (including debugging and troubleshooting) and preparing migration documentation. Providing reporting-line transparency through periodic updates on project or task status.
Requirements
- Bachelor's or Master's degree in engineering, preferably Computer Science/Engineering.
- 3+ years of experience with the technical analysis, design, development, and implementation of data warehousing/data lake solutions.
- Strong SQL programming and stored procedure development skills.
- 2+ years of experience developing with Informatica or another ETL tool.
- 2+ years of relational database experience.
- Strong UNIX Shell scripting experience to support data warehousing solutions.
- Process-oriented, with a focus on standardization, streamlining, and a best-practices delivery approach.
- Excellent problem solving and analytical skills.
- Excellent verbal and written communication skills.
- Experience in optimizing large data loads.
Advantages
- Understanding/experience in Hive/Impala/Spark/Snowflake.
- Experience with Teradata is a big plus.
- Ability to architect an ETL solution and data conversion strategy.
- Exposure to an Agile Development environment.
- Knowledge about TWS Scheduler.
- Strong understanding of the data warehousing domain.
- Good understanding of dimensional modelling.
- Good team player.
Benefits
- You will have the opportunity to gain experience on exciting, long-term, innovative projects.
- Flexible working arrangements (core hours and the opportunity to work from home).
- Work in a multinational team and environment.
- A team of great engineers.
- Cafeteria.