Data Engineer Java

India

3Pillar Global

3Pillar builds breakthrough software products that power digital businesses.

We are looking for an energetic and talented Data Engineer (Java) to be part of our Data Operations team. The candidate should have strong development experience of 3 to 12 years.

We are 3PILLAR GLOBAL. We build breakthrough software products that power digital businesses. We are an innovative product development partner whose solutions drive rapid revenue, market share, and customer growth for industry leaders in Software and SaaS, Media and Publishing, Information Services, and Retail.

Our key differentiator is our Product Mindset. Our development teams focus on building for outcomes, and all of our team members around the globe are trained on the Product Mindset's core values: Minimize Time to Value, Solve For Need, and Excel at Change. Our teams apply this mindset to build digital products that are customer-facing and revenue-generating. Our business-minded approach to agile development ensures that we align to client goals from the earliest conceptual stages through market launch and beyond.
In 2021, 3Pillar Global India was named a "Great Place to Work" for the fifth year in a row, based on how our employees feel about our company, collaborative culture, and work/life balance. Come join our growing team!

Roles & Responsibilities

  • Understanding the business requirements
  • Writing code and implementing the proposed solutions
  • Creating data pipelines with versioning and change management
  • Managing the complexity inherent in versioned data pipelines
  • Building Python-based ETL processes (a brief sketch follows this list)
  • Logging and instrumenting pipelines and services
  • Proposing improvements to existing pipelines and processes
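
As a rough illustration of the Python-based ETL and logging items above, here is a minimal sketch of an ETL step in Python. The file names, the "amount" column, and the transformation are illustrative assumptions and are not taken from the job description.

    import logging

    import pandas as pd

    logging.basicConfig(
        level=logging.INFO,
        format="%(asctime)s %(levelname)s %(name)s %(message)s",
    )
    logger = logging.getLogger("etl")


    def run_etl(source_path: str, target_path: str) -> None:
        """Extract a CSV file, apply a simple cleanup, and load the result as Parquet."""
        logger.info("Extracting %s", source_path)
        df = pd.read_csv(source_path)

        logger.info("Transforming %d rows", len(df))
        df = df.dropna()                            # drop incomplete records
        df["amount"] = df["amount"].astype(float)   # normalise an assumed numeric column

        logger.info("Loading %d rows to %s", len(df), target_path)
        df.to_parquet(target_path, index=False)     # requires pyarrow or fastparquet


    if __name__ == "__main__":
        run_etl("input.csv", "output.parquet")

In a production pipeline the same structure would typically be wrapped in an orchestration framework and the log output shipped to a central instrumentation service; this sketch only shows the extract-transform-load shape with basic logging.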

Technical Competencies

  • Technologies used:
  • AWS S3, IoT services, Glue, Athena (an illustrative Athena query appears after this section)
  • SQL and NoSQL databases
  • Python-based workflows and SDK
  • Postgres-based file catalog
  • Other: PySpark and Redshift are on the roadmap but not yet in use

  • Must have: 
  • Strong Core Python skills
  • Pandas
  • NumPy
  • Data engineering experience

  • Nice to have: 
  • PySpark
  • Redshift
  • Ability to write ETL / pipeline code in more than one language (Python / Java / Scala)
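
To give a flavour of the AWS stack listed above, here is a hedged sketch of running an Athena query from Python and loading the result with Pandas. The region, database, table, and S3 output location are assumptions for illustration only; reading the result directly from S3 also assumes the s3fs package is installed.

    import time

    import boto3
    import pandas as pd  # reading the result straight from S3 requires s3fs

    # All of the following names are assumptions for illustration only.
    REGION = "ap-south-1"
    DATABASE = "analytics"
    OUTPUT_LOCATION = "s3://example-athena-results/"

    athena = boto3.client("athena", region_name=REGION)


    def query_athena(sql: str) -> pd.DataFrame:
        """Run a SQL query on Athena and return the result as a Pandas DataFrame."""
        qid = athena.start_query_execution(
            QueryString=sql,
            QueryExecutionContext={"Database": DATABASE},
            ResultConfiguration={"OutputLocation": OUTPUT_LOCATION},
        )["QueryExecutionId"]

        # Poll until the query reaches a terminal state.
        while True:
            state = athena.get_query_execution(QueryExecutionId=qid)[
                "QueryExecution"]["Status"]["State"]
            if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
                break
            time.sleep(1)

        if state != "SUCCEEDED":
            raise RuntimeError(f"Athena query finished in state {state}")

        # Athena writes the result as a CSV named after the query execution id.
        return pd.read_csv(f"{OUTPUT_LOCATION}{qid}.csv")


    if __name__ == "__main__":
        df = query_athena(
            "SELECT device_id, COUNT(*) AS events FROM iot_events GROUP BY device_id"
        )
        print(df.head())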

Benefits

  • A competitive annual salary based on experience and market demands 
  • Flexi-timings
  • Work-from-anywhere policy / option to choose permanent work from home
  • Medical insurance with the option to purchase a premium plan or HSA option for your entire family
  • Meal Vouchers
  • Regular Health check-up camps arranged by the company
  • Business casual atmosphere


Tags: Agile Athena AWS Data pipelines Engineering ETL Excel NoSQL NumPy Pipelines PostgreSQL PySpark Python Redshift Scala SQL

Perks/benefits: Competitive pay Health care

Region: Asia/Pacific
Country: India
Category: Engineering Jobs
