Data Engineer

South Jakarta, South Jakarta City, Indonesia - Remote

Applications have closed

Amartha

Socially impactful online investing that funds working capital for MSMEs (UMKM). Invest from as little as IDR 100,000 to earn returns of up to 15%*.


Amartha is embarking on an exciting new journey and needs experienced engineers to work with senior management, existing engineers, and the product team in shaping the next wave of innovative product offerings, ensuring Amartha leapfrogs into the next phase of its journey!

Successful candidates will need strong communication skills, as they will be actively involved in discussions, instrumental in bridging the gap between product and technology, and responsible for coaching, mentoring, and guiding junior engineers on best practices and technical challenges.


Job Description

As a Data Engineer, you will implement and maintain data pipelines while demonstrating expertise in areas including cloud computing and database design and development.


Responsibilities

  • Build, test, and maintain optimal data pipeline architecture
  • Build the infrastructure necessary for optimal extraction, transformation, and loading of data from a variety of sources (a minimal sketch of such a pipeline follows this list)
  • Assemble large and complex data sets to meet business demands
  • Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs
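
To give a concrete flavor of the extract, transform, and load work described above, here is a minimal sketch in Python using pandas and SQLAlchemy. The file name, table name, and connection string are hypothetical placeholders for illustration only, not Amartha's actual setup.

import pandas as pd
from sqlalchemy import create_engine

# Hypothetical source file and connection string -- replace with real values.
SOURCE_CSV = "loans_raw.csv"
PG_URL = "postgresql://user:password@localhost:5432/analytics"

def extract(path: str) -> pd.DataFrame:
    # Extract: read raw loan records from a CSV export.
    return pd.read_csv(path)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Transform: normalize column names and drop incomplete rows.
    df = df.rename(columns=str.lower).dropna(subset=["loan_id", "amount"])
    df["amount"] = df["amount"].astype(float)
    return df

def load(df: pd.DataFrame, table: str, url: str) -> None:
    # Load: append the cleaned records into a PostgreSQL table.
    engine = create_engine(url)
    df.to_sql(table, engine, if_exists="append", index=False)

if __name__ == "__main__":
    load(transform(extract(SOURCE_CSV)), "loans_clean", PG_URL)

In practice a step like this would be parameterized and scheduled by a workflow manager rather than run as a standalone script.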

Requirements

  • Advanced working knowledge of SQL, including query authoring, experience with relational databases, and working familiarity with a variety of other databases.
  • Strong analytic skills related to working with structured and unstructured datasets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Experience building processes that support data transformation, data structures, metadata, dependency management, and workload management.
  • Experience building and optimizing data pipelines, architectures and data sets.
  • Experience using the following software/tools:
      • Programming languages: Python, Java, or Go
      • Relational SQL and NoSQL databases: PostgreSQL, MongoDB
      • Data pipeline and workflow management tools: Airflow preferred; Azkaban, Luigi, etc. nice to have (see the DAG sketch after this list)
      • Cloud services: GCP and AWS
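
Since Airflow is the preferred workflow manager, here is a minimal sketch of a two-task DAG wiring an extract step ahead of a load step. The DAG name, task IDs, and callables are hypothetical stubs, shown only to illustrate the style of orchestration involved.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Stub callables -- a real DAG would invoke the actual extract/load code.
def extract_loans():
    print("pull raw loan records from the source systems")

def load_warehouse():
    print("write cleaned records to the warehouse")

with DAG(
    dag_id="daily_loans_pipeline",    # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_loans)
    load = PythonOperator(task_id="load", python_callable=load_warehouse)
    extract >> load  # extract must finish before load starts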

Preferred:

  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with stream-processing systems: Storm, Spark Streaming, etc. (a minimal streaming sketch follows this list)
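
For the stream-processing preference, here is a minimal Spark Structured Streaming sketch that reads a Kafka topic and parses JSON events. The broker address, topic name, and schema are hypothetical, and running it requires the spark-sql-kafka connector package on the classpath.

from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("loan_events_stream").getOrCreate()

# Hypothetical schema for JSON messages on the topic.
schema = StructType([
    StructField("loan_id", StringType()),
    StructField("amount", DoubleType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")  # hypothetical broker
    .option("subscribe", "loan-events")                    # hypothetical topic
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Write parsed events to the console; a production job would target a real sink.
events.writeStream.format("console").outputMode("append").start().awaitTermination()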


Tags: Airflow Architecture AWS Azkaban Big Data Data pipelines GCP Hadoop Kafka MongoDB NoSQL Pipelines PostgreSQL Python RDBMS Spark SQL Streaming

Regions: Remote/Anywhere Asia/Pacific
Country: Indonesia
Category: Engineering Jobs
