Senior Data Engineer

Atlanta, GA, United States

KION Group

We are a leading provider of forklift trucks and warehouse technology, as well as automation technologies and software solutions for supply chain optimization.



We are seeking a dynamic and highly skilled Senior Data Engineer with extensive experience building self-service, enterprise-scale data platforms on a microservices architecture, ready to lead these foundational efforts. This role demands someone who not only possesses a profound understanding of the data engineering landscape but also has a very strong software engineering background, especially in building microservices frameworks and architectures. The ideal candidate will serve as both an individual contributor and the technical lead, contributing significantly to platform development and actively shaping our data ecosystem.

What We Offer:

  • Career Development
  • Competitive Compensation and Benefits
  • Pay Transparency
  • Global Opportunities



Dematic provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.


This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training.


The base pay range for this role is estimated to be $82,000.00 - $166,000.00 at the time of posting. Final compensation will be determined by various factors such as work location, education, experience, knowledge and skills.


Tasks and Qualifications:

What You Will Do in This Role:

  • As a senior engineer, you will be responsible for ideation, architecture, design and development of our enterprise data platform.
  • Architect and design core components with a microservices architecture, abstracting away platform and infrastructure intricacies.
  • Create and maintain essential data platform SDKs and libraries, adhering to industry best practices.
  • Design and develop connector frameworks and modern connectors to source data from disparate systems, both on-premises and in the cloud.
  • Design and optimize data storage, processing, and querying performance for large-scale datasets using industry best practices while keeping costs in check.
  • Design and develop data quality frameworks and processes to ensure the accuracy and reliability of data.
  • Design and develop a microservices-based semantic layer and metadata management components.
  • Collaborate with data scientists, analysts, and cross-functional teams to design data models, database schemas, and data storage solutions.
  • Design and develop advanced analytics and machine learning capabilities on the data platform.
  • Design and develop observability and data governance frameworks and practices.
  • Stay up to date with the latest data engineering trends, technologies, and best practices.
  • Drive the deployment and release cycles, ensuring a robust and scalable platform.

What We Are Looking For:

  • Bachelor's or Master's degree in Computer Science, Engineering, or related field.
  • 8-10 years of proven experience in modern cloud data engineering, broad exposure to the wider data landscape, and solid software engineering experience.
  • Prior experience architecting and building successful self-service, enterprise-scale data platforms in a greenfield environment with a microservices-based architecture.
  • Proficiency in building end-to-end data platforms and data services in GCP is a must.
  • Proficiency in tools and technologies: BigQuery, Cloud Functions, Cloud Run, Dataform, Dataflow, Dataproc, SQL, Python, Airflow, PubSub.
  • Experience with microservices architectures: Kubernetes, Docker. Our microservices are built on a TypeScript, NestJS, and Node.js stack; candidates with this experience are preferred.
  • Experience building semantic layers.
  • Experience architecting, designing, and developing batch and real-time streaming infrastructure and workloads.
  • Solid experience architecting and implementing metadata management, including data catalogs, data lineage, data quality, and data observability for big data workflows.
  • Hands-on experience with GCP ecosystem and data lakehouse architectures.
  • Strong understanding of data modeling, data architecture, and data governance principles.
  • Excellent experience with DataOps principles and test automation.
  • Excellent experience with observability tooling: Grafana, Datadog.

Nice to have: 

  • Experience with Data Mesh architecture
  • Experience building Semantic layers for data platforms.
  • Experience building scalable IoT architectures

