Data Engineer, ETL/Hadoop Developer

Budapest, Hungary

Primary responsibilities 

  • Designing and developing ETL and Hadoop/Snowflake applications.
  • Undertaking end-to-end project delivery (from inception to post-implementation support), including reviewing and finalizing business requirements, creating functional specifications and/or system designs, and ensuring that the end solution meets business needs and expectations.
  • Providing deployment support, including late-hour and weekend deployments.
  • Developing new transformation processes to load data from source to target, and performance tuning of existing ETL code (mappings, sessions) and the Hadoop/Snowflake platform.
  • Analyzing existing designs and interfaces and applying design modifications or enhancements.
  • Coding and documenting data processing scripts and stored procedures.
  • Providing business insights and analysis findings for ad-hoc data requests.
  • Testing software components and complete solutions (including debugging and troubleshooting) and preparing migration documentation. Providing reporting-line transparency through periodic updates on project and task status.

Requirements

  • Bachelor's or Master's degree in Engineering, preferably Computer Science/Engineering.
  • 3+ years of experience with the technical analysis, design, development, and implementation of data warehousing/data lake solutions.
  • Strong SQL programming and stored procedure development skills.
  • 2+ years of experience developing in Informatica or another ETL tool.
  • 2+ years of relational database experience.
  • Strong UNIX shell scripting experience to support data warehousing solutions.
  • Process-oriented, focused on standardization, streamlining, and a best-practices delivery approach.
  • Excellent problem-solving and analytical skills.
  • Excellent verbal and written communication skills.
  • Experience in optimizing large data loads.

Advantages

  • Understanding of or experience with Hive, Impala, Spark, or Snowflake.
  • Experience with Teradata is a big plus.
  • Ability to architect an ETL solution and data conversion strategy.
  • Exposure to an Agile development environment.
  • Knowledge of TWS (Tivoli Workload Scheduler).
  • Strong understanding of the data warehousing domain.
  • Good understanding of dimensional modelling.
  • A good team player.

Benefits

  • The opportunity to gain experience on exciting, long-term, innovative projects
  • Flexible working arrangements (core hours and the opportunity to work from home)
  • A multinational team and work environment
  • A team of great engineers
  • Cafeteria
