Senior Data Integration Engineer

Czech Republic

3Pillar Global


We are 3PILLAR GLOBAL 
We build breakthrough software products that power digital businesses. We are an innovative product development partner whose solutions drive rapid revenue, market share, and customer growth for industry leaders in Software and SaaS, Media and Publishing, Information Services, and Retail. Our key differentiator is our Product Mindset. Our development teams focus on building for outcomes and all of our team members around the globe are trained on the Product Mindset’s core values – Minimize Time to Value, Solve For Need, and Excel at Change. Our teams apply this mindset to build digital products that are customer-facing and revenue-generating. Our business-minded approach to agile development ensures that we align to client goals from the earliest conceptual stages through market launch and beyond.

RESPONSIBILITIES

  • Design, develop, and maintain new data pipelines using AWS services (Lambda, Glue, SNS, etc.) and other relevant technologies to ensure data availability and quality (a minimal sketch follows this list).
  • Build and maintain RESTful APIs to facilitate data access and integration with other systems. 
  • Develop and optimize SQL queries to extract, transform, and load data efficiently. 
  • Manage multiple projects concurrently, prioritize tasks effectively, and meet project deadlines.
  • Work with large datasets, ensuring data quality, accuracy, and performance.
  • Collaborate with technical teams and business stakeholders to understand data requirements and translate them into technical solutions.
  • Implement data transformation, integration, and validation processes to support analytics and reporting needs.
  • Optimize and fine-tune data pipelines for improved speed, reliability, and efficiency.
  • Implement best practices for data storage, retrieval, and archival to ensure data accessibility and security.
  • Troubleshoot and resolve data-related issues, collaborating with the team to identify root causes.
  • Document data processes, data lineage, and technical specifications for future reference.
  • Participate in code reviews, ensuring adherence to coding standards and best practices.
  • Collaborate with DevOps teams to automate deployment and monitoring of data pipelines.
  • Leverage cloud services, with a preference for AWS, to design and implement scalable and cost-effective solutions. 
  • Adapt quickly to new technologies, tools, and methodologies to stay current in the ever-evolving data engineering landscape.
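
A minimal sketch of the Lambda-to-SNS glue referred to in the first responsibility, written in Python with boto3. The bucket, topic ARN, and event shape are hypothetical examples chosen for illustration; they are not details of this role.

    import json
    import os

    import boto3  # AWS SDK for Python; available in the Lambda runtime

    # Hypothetical topic ARN; in a real deployment it would be injected via configuration.
    SNS_TOPIC_ARN = os.environ.get(
        "SNS_TOPIC_ARN", "arn:aws:sns:eu-central-1:123456789012:example-topic"
    )

    sns = boto3.client("sns")

    def handler(event, context):
        """Triggered by S3 "object created" events; announces each new object on SNS."""
        records = event.get("Records", [])
        for record in records:
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]
            # Publish a small JSON message so downstream consumers
            # (Glue jobs, other Lambdas, queues) can pick up the new object.
            sns.publish(
                TopicArn=SNS_TOPIC_ARN,
                Subject="New object available",
                Message=json.dumps({"bucket": bucket, "key": key}),
            )
        return {"processed": len(records)}

In practice the topic ARN, retries, and dead-letter handling would be defined in infrastructure code (for example Terraform) rather than hard-coded as above.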

REQUIREMENTS

  • Proven experience in data engineering, with a strong track record of designing, developing, and maintaining data pipelines using AWS services (Lambda, Glue, SNS, etc.) and other relevant technologies to ensure data availability and quality.
  • Strong understanding of cloud concepts and services (AWS preferred) to design and implement scalable and cost-effective solutions.
  • Strong skills in REST APIs to facilitate data access and integration with other systems.
  • Strong SQL skills to develop and optimize queries that extract, transform, and load data efficiently (see the sketch after this list).
  • Good knowledge of data architecture, data modeling, and data warehousing concepts.
  • Experience with Python (data processing and analysis) and with AWS services (for data storage, computation, and analytics).
  • Experience with DevOps practices, including hands-on CI/CD automation (Terraform, Jenkins, etc.).
  • Fast learner, able to adapt quickly to new technologies, tools, and methodologies and stay current in the ever-evolving data engineering landscape.
  • Fluent in English to facilitate communication with a diverse and global team.
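
To give a concrete flavour of the SQL requirement above, here is a small, self-contained sketch using Python's standard sqlite3 module: it loads a few example rows and runs an aggregate query of the extract-transform-load kind described in the requirements. The orders table and the monthly revenue rollup are invented for the example and are not part of the role description.

    import sqlite3

    # An in-memory database stands in for a real warehouse connection.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE orders (
            order_id   INTEGER PRIMARY KEY,
            customer   TEXT NOT NULL,
            order_date TEXT NOT NULL,   -- ISO-8601 date
            amount     REAL NOT NULL
        );
    """)

    # Extract/load step: insert a few sample rows.
    conn.executemany(
        "INSERT INTO orders (customer, order_date, amount) VALUES (?, ?, ?)",
        [
            ("acme", "2024-01-15", 120.0),
            ("acme", "2024-02-03", 80.0),
            ("globex", "2024-02-20", 200.0),
        ],
    )

    # Transform step: aggregate raw orders into a monthly revenue rollup,
    # the kind of query that would feed a reporting layer.
    rows = conn.execute("""
        SELECT substr(order_date, 1, 7) AS month,
               customer,
               SUM(amount)              AS revenue
        FROM orders
        GROUP BY month, customer
        ORDER BY month, customer
    """).fetchall()

    for month, customer, revenue in rows:
        print(month, customer, revenue)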

BENEFITS

  • Flexible working hours,
  • Work from home allowance,
  • Work from home or from the office (Brno, Olomouc, Ostrava),
  • Pet-friendly office 🐶🐢, family-friendly office,
  • An extra week of holiday (25 days/year),
  • Extra time off to compensate for public holidays that fall on weekends,
  • Company devices,
  • Wellbeing program (psychotherapist, mental health support),
  • Free refreshments (water, coffee, tea, milk available in the office),
  • Meal allowance,
  • Employee tariff (discounts for employee’s family members and friends),
  • Teambuilding activities,
  • You can use a company 3D printer,
  • English courses in working hours,
  • Training and qualification opportunities,
  • Discounted Multisport Card,
  • Business trips



