Data Engineer

Remote, Romania

Nagarro

A digital product engineering leader, Nagarro drives technology-led business breakthroughs for industry leaders and challengers through agility and innovation.


Company Description

We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale, across all devices and digital mediums, and our people are everywhere in the world (15,000+ experts across 26 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That is where you come in!

Job Description

  • Create data products within a Data Mesh architecture, based on the Data Vault methodology in Snowflake (a minimal illustrative sketch follows this list).
  • Create and analyze data products in SAP Analytics Cloud (S/4HANA, BW) and find integration solutions with the current implementations.
  • Collaborate with cross-functional teams to gather requirements, define data schemas, and design scalable data pipelines and workflows.
  • Mentor and coach junior members of the data engineering team, providing guidance on best practices, code reviews, and technical solutions.
  • Drive innovation and continuous improvement in data engineering processes and methodologies, staying abreast of emerging trends and technologies in the field.
  • Partner with data architects, data scientists, and business analysts to understand data requirements and deliver solutions that meet business needs.
  • Ensure data quality, integrity, and security standards are maintained across all data products and pipelines.
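
For the Data Vault point above, here is a minimal sketch of an idempotent hub load in Snowflake driven from Python. It assumes the snowflake-connector-python package; the connection details, table name (HUB_CUSTOMER), and columns are illustrative placeholders, not the project's actual model.

```python
# Minimal sketch of a Data Vault-style hub load in Snowflake (illustrative only).
# Assumes the snowflake-connector-python package; credentials, table and column
# names are placeholders, not the actual project model.
import hashlib

import snowflake.connector


def hub_hash_key(business_key: str) -> str:
    """Derive a deterministic hash key from a business key, a common Data Vault convention."""
    return hashlib.md5(business_key.strip().upper().encode("utf-8")).hexdigest()


conn = snowflake.connector.connect(
    account="my_account",          # placeholder connection details
    user="my_user",
    password="my_password",
    warehouse="COMPUTE_WH",
    database="ANALYTICS",
    schema="RAW_VAULT",
)
cur = conn.cursor()

# A hub holds one row per unique business key plus load metadata.
cur.execute("""
    CREATE TABLE IF NOT EXISTS HUB_CUSTOMER (
        HUB_CUSTOMER_HK VARCHAR(32)   NOT NULL,
        CUSTOMER_BK     VARCHAR       NOT NULL,
        LOAD_DTS        TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP(),
        RECORD_SOURCE   VARCHAR       NOT NULL
    )
""")

# Insert only business keys that are not yet in the hub, so reruns stay idempotent.
for bk in ["CUST-1001", "CUST-1002"]:
    cur.execute(
        """
        MERGE INTO HUB_CUSTOMER t
        USING (SELECT %s AS hk, %s AS bk, %s AS src) s
        ON t.HUB_CUSTOMER_HK = s.hk
        WHEN NOT MATCHED THEN
            INSERT (HUB_CUSTOMER_HK, CUSTOMER_BK, RECORD_SOURCE)
            VALUES (s.hk, s.bk, s.src)
        """,
        (hub_hash_key(bk), bk, "SAP_S4"),
    )

conn.commit()
cur.close()
conn.close()
```

In a full Data Vault model, descriptive attributes would land in satellite tables and relationships between hubs in link tables; the hub here only anchors the business key and its load metadata.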

Qualifications

  • Over 3 years of experience in data engineering, with a proven track record of designing and implementing data products in a production environment.
  • Proficiency in data engineering concepts and technologies, including but not limited to data modeling, ETL concepts, SQL, Snowflake with the Data Vault methodology, and data pipelines; relevant experience with similar technologies is also considered, provided you are open to expanding your skills with the tech stack used in the project (a minimal pipeline sketch appears after this list).
  • Experience working with SAP Analytics Cloud or similar technologies, or willingness to learn and work with this tool.
  • Experience working within a Data Mesh Architecture framework is highly desirable.
  • Excellent communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at all levels of the organization.
  • Strong problem-solving skills and attention to detail, with a commitment to delivering high-quality solutions on time and within budget.
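
To illustrate the pipeline and data quality expectations above, here is a minimal sketch of an ETL-style step with a simple quality gate, using only the Python standard library; the record fields, checks, and sample data are illustrative.

```python
# Minimal sketch of an ETL-style pipeline step with a data quality gate.
# Field names, checks, and sample data are illustrative, not project-specific.
from dataclasses import dataclass
from typing import Iterable


@dataclass
class CustomerRecord:
    customer_id: str
    country: str
    revenue: float


def extract(rows: Iterable[dict]) -> list[CustomerRecord]:
    """Map raw dictionaries (e.g. parsed from a file or API) onto typed records."""
    return [
        CustomerRecord(r["customer_id"], r.get("country", ""), float(r.get("revenue", 0)))
        for r in rows
    ]


def quality_gate(records: list[CustomerRecord]) -> list[CustomerRecord]:
    """Reject the batch if it is empty or mandatory keys are missing."""
    if not records:
        raise ValueError("Empty batch: upstream extract produced no rows")
    missing_keys = [r for r in records if not r.customer_id]
    if missing_keys:
        raise ValueError(f"{len(missing_keys)} records are missing customer_id")
    return records


def transform(records: list[CustomerRecord]) -> dict[str, float]:
    """Aggregate revenue per country, standing in for the real business logic."""
    totals: dict[str, float] = {}
    for r in records:
        totals[r.country] = totals.get(r.country, 0.0) + r.revenue
    return totals


if __name__ == "__main__":
    raw = [
        {"customer_id": "C-1", "country": "RO", "revenue": 120.0},
        {"customer_id": "C-2", "country": "DE", "revenue": 80.5},
    ]
    print(transform(quality_gate(extract(raw))))
```

In practice a gate like this sits between extraction and loading so that a bad batch fails fast instead of polluting downstream data products.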

Nice-to-have skills:

  • Talend
  • SAP Analytics Cloud
  • CI/CD knowledge
  • Data Mesh Principles
  • Tableau
  • SQL Server, PostgreSQL, SAP
  • Python
  • Release Management
  • Automated Testing / Regression Testing

Category: Engineering Jobs

Tags: Architecture CI/CD Data pipelines Data quality Engineering ETL Pipelines PostgreSQL Python Security Snowflake SQL Tableau Talend Testing

Regions: Remote/Anywhere Europe
Country: Romania
