ETL Developer

Bangkok

Applications have closed

Sertis

Sertis | Unlock the full potential of your data with cutting-edge AI and Big Data solutions, tailored to drive growth and innovation across multiple industries, brought to you by Thailand's leading AI solutions provider


We aim to improve the world by using data and AI. At Sertis, we consume massive amounts of data each day to produce actionable insights that answer our clients’ day-to-day operational needs. Moving data from the client’s environment to our system, and ensuring that our data analysts and analytics dashboards have accurate and complete data to work with and display, is therefore of paramount importance.

As an ETL Developer, you will architect and develop entire ETL pipelines from start to finish to meet the requirements and SLAs of each project. You will be exposed to a variety of tools on the market and will have to decide which ones best suit the needs of the project at hand. Your primary consideration will be how to move the largest amount of data in the shortest time using the fewest resources, while still doing so accurately, reliably, transparently, and securely.

As our ETL Developer, you will:

  • Work with project managers, business users, and clients to understand data requirements, identify appropriate data sources, and define and build the required data transformation logic.
  • Architect end-to-end ETL pipelines, from the source of the data in the client’s environment all the way to where our business analysts can utilize it (see the sketch after this list).
  • Design where and how the data should be stored, accommodating requirements such as data versioning, backup maintenance, and latency minimization.
  • Create data control structures to ensure the accuracy and quality of data through all upstream and downstream data channels.
  • Take responsibility for the SLA of your ETL pipeline, as well as other aspects of the system, including its scalability, reliability, and security.
  • Set up infrastructure and deploy your pipeline to the production environment.
  • Stay up to date with new tools, technologies, and practices, and apply them to projects as appropriate.
  • Assist in nurturing internal capabilities to build and manage scalable ETL pipelines that serve an ever-changing business and regulatory landscape.
  • Build automated tools to extend the team’s capabilities on various fronts, including service integration, data governance and management, and self-service interfaces for all things data-related.
  • Support data users by monitoring and tuning system performance, including minimizing query time, reducing data ingestion time, and increasing orchestration throughput.
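
To give a flavor of the work, here is a minimal sketch of such a daily pipeline using Apache Airflow (one of the workflow tools named under “It's a plus” below). It is an illustration only, not a Sertis artifact: the DAG name, task bodies, and settings are hypothetical placeholders, and it assumes Airflow 2.4+.

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract():
        # Pull the latest raw records from the client's source system (placeholder).
        print("extracting raw data")


    def transform():
        # Apply the project's transformation logic (placeholder).
        print("transforming data")


    def load():
        # Write curated output where analysts and dashboards can query it (placeholder).
        print("loading data")


    with DAG(
        dag_id="client_daily_etl",  # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
        # Retries and retry delay are one lever for the reliability/SLA targets above.
        default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)

        # Linear dependency: extract, then transform, then load.
        extract_task >> transform_task >> load_task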

What you will need:

  • Strong knowledge of programming languages and tools (e.g. Python, SQL, the Linux CLI)
  • Experience in performing root cause analysis on daily data pipelines and identifying opportunities for improvement
  • Experience in building, maintaining, and optimizing Big Data pipelines and architectures
  • A firm grasp of database architecture (schema design, master data management, slowly changing dimensions; illustrated with a short sketch after this list) and database programming techniques (e.g. stored procedures, UDFs, triggers)
  • Working knowledge of data transformation, data structures, metadata, data streaming, and workload management
  • A general understanding of business data
  • A general understanding of data and data flows
  • An interest in big data and related technologies (Hadoop and Hive, BigQuery, etc.)
  • An ability to translate business requirements for technical teams
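
As a small illustration of one of the concepts above, here is a sketch of Type 2 slowly-changing-dimension handling in plain Python. It is a conceptual example with hypothetical field names, not production code; in practice this logic usually lives in SQL (e.g. a MERGE statement or stored procedure).

    from datetime import date


    def apply_scd2(dimension_rows, incoming, today=None):
        """Type 2 SCD: expire changed rows and append new current versions.

        dimension_rows: list of dicts with hypothetical keys business_key,
            attributes, valid_from, valid_to (valid_to=None marks the current row).
        incoming: dict mapping business_key -> latest attribute values.
        """
        today = today or date.today()
        current = {r["business_key"]: r
                   for r in dimension_rows if r["valid_to"] is None}

        for key, attrs in incoming.items():
            row = current.get(key)
            if row is not None and row["attributes"] == attrs:
                continue  # unchanged: leave the current version open
            if row is not None:
                row["valid_to"] = today  # expire the superseded version
            dimension_rows.append({  # open a new current version
                "business_key": key, "attributes": attrs,
                "valid_from": today, "valid_to": None,
            })
        return dimension_rows


    # A customer moves city: the old row is closed and a new current row is
    # appended, preserving history for point-in-time queries.
    dim = [{"business_key": 42, "attributes": {"city": "Bangkok"},
            "valid_from": date(2023, 1, 1), "valid_to": None}]
    apply_scd2(dim, {42: {"city": "Chiang Mai"}})
    for row in dim:
        print(row)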

It's a plus if you have:

  • A background in software architecture and best practices in software development
  • A solid understanding of distributed programming concepts, including parallel programming, race conditions, leader election, state management, cache management, and inter-process communication
  • Experience working with the following tools/technologies:
    • Big data tools: Hadoop, Hive, Spark, Kafka, etc.
    • Cloud services: Google Cloud, AWS, or Azure
    • Data pipeline and workflow management tools: Airflow, NiFi, etc.
    • Relational SQL databases: PostgreSQL, MySQL, SQL Server
    • Deployment tools and practices: Docker, CI/CD, Kubernetes
  • An understanding of common BI development processes, software development processes, or project management

What you are like:

  • Very open-minded (eager to ask for comments/suggestions)
  • Able to share and suggest ideas
  • Have a can-do and will-do attitude!
  • Ready to tackle any challenge

Why Sertis?

  • We work at the frontier of innovation in the AI industry
  • A chance to work on meaningful products that solve real-life problems and challenges
  • Attractive WFH Policy
  • Up early or slow starter in the morning? We have flexible office hours
  • An OKR-driven workplace; we provide direction, not orders, and give you the autonomy to deliver your best work
  • Mentorship programs at every level, from executive-level coaching to fresh graduates
  • We run like a startup and embrace the adventure; we focus on getting things done while keeping a down-to-earth, informal culture
  • Get to work and learn from the best in the industry, and share your ideas with like-minded individuals

Because we cultivate intelligence and learning, and help people grow beyond their potential. This is your chance to build a career in a growing, data-driven industry. We seek passion and ambition to turn data into action. Apply now!

Tags: Airflow AWS Azure Big Data BigQuery CI/CD Data management Data pipelines Docker ETL Hadoop Kafka Kubernetes Linux MySQL Pipelines PostgreSQL Python R Security Spark SQL Streaming

Perks/benefits: Career development Flex hours Startup environment

Region: Asia/Pacific
Country: Thailand
Category: Engineering Jobs
