Data Engineer

Frankfurt, Hessen, Germany

payabl. empowers businesses to grow through payments innovation and banking services. Our ambition is to expand the strong portfolio of global financial services we provide to businesses and make them all available in one place on our platform, payabl.one. As a licensed financial company with principal membership in the card schemes, we specialize in global payments and providing businesses with multi-currency accounts.

About the role

You will join the Data Team, where you play a vital role in collecting, analyzing, and interpreting data to support decision-making across the organization. Your tasks include data collection, ensuring data quality, and developing databases. You're skilled in SQL and Python for data manipulation, as well as in data visualization tools. Collaborative and communicative, you work with various teams to understand their needs and provide actionable insights. Passionate about staying up to date with new technologies, you drive innovation and contribute to a culture of continuous improvement.

Responsibilities

  • Data Architecture Design: Design and implement scalable, reliable, and efficient data pipelines and infrastructure to ingest, process, and store large volumes of structured and unstructured data.
  • Data Modeling: Develop and maintain data models to support analytics and reporting needs, ensuring data integrity, consistency, and optimization for performance. Design and implement OLAP (Online Analytical Processing) cubes and structures for multidimensional analysis.
  • Data Integration: Integrate data from various sources, both internal and external, ensuring compatibility, consistency, and reliability across datasets.
  • Data Quality and Governance: Implement processes and tools to ensure data quality, consistency, and compliance with regulatory requirements. Establish data governance policies and procedures.
  • Performance Optimization: Monitor and optimize data pipelines and systems for performance, scalability, and cost-effectiveness. Troubleshoot and resolve issues as they arise.
  • Collaboration: Collaborate with cross-functional teams, including analysts, software engineers, and business stakeholders, to understand data requirements and deliver solutions that meet business objectives.
  • Documentation: Document data infrastructure, processes, and workflows to ensure knowledge sharing and maintainability.

Requirements

  • Proven experience in data engineering or related roles, with a strong understanding of data structures, algorithms, data management principles, and best practices
  • Experience with real-time or near-real-time data ingestion
  • Proficiency in programming languages such as Python, Java, or Scala
  • Working knowledge of Unix OS / Shell scripting
  • Experience with ETL / ELT pipeline design, implementation, and maintenance (Airflow, Luigi)
  • Experience with SQL, relational, non-relational, and column-oriented databases (e.g., PostgreSQL, MySQL, MongoDB, Apache Druid, ClickHouse)
  • Experience with big data technologies and frameworks such as Hadoop, Spark, Kafka
  • Experience with data warehouse platforms such as Snowflake and Databricks, including data modeling, performance optimization, and administration
  • Familiarity with cloud platforms (e.g., AWS, Azure, GCP)
  • Experience with containerization and orchestration technologies such as Docker and Kubernetes
  • Experience with data visualization tools (e.g., Tableau, Power BI, Superset)
  • Experience with finance data (transactions, clearing reports)

Benefits

  • Competitive Compensation: Receive a market-aligned salary with potential bonus payments reflecting the company's annual performance.
  • Vacation Policy: Enjoy 28 vacation days, plus extra holidays on December 24th and 31st, providing ample time for personal activities and relaxation.
  • Career Growth in a Flat Hierarchy: Experience significant career opportunities in an environment that encourages swift professional development and open communication.
  • International Working Environment: Work in a diverse, multicultural setting that broadens your professional experience and perspectives.
  • Personalized Technology Choice: Select the technology that suits you best, with options including Mac or Windows laptops, to enhance your work efficiency and comfort.
  • Team Events and Collaborative Culture: Participate in regular team events that strengthen community bonds and promote a collaborative workplace.
  • Hybrid Working and Relaxation Amenities: Benefit from the flexibility of hybrid working and unwind in our relaxation room featuring a massage chair.
  • Daily Lunch Benefit with Sodexo Checks: Enhance your workday with the convenience of daily lunches, covered by Sodexo checks.

Location:

  • Our office is based in Frankfurt, and hybrid working is available.

Join the payabl.Team Now!

If this role seems like a good match, please submit your resume. All applications are treated with the strictest confidentiality.

Please note that we may keep your CV for a period of one (1) year for future relevant job opportunities. For more information about how we process your data, please see our privacy policy.



