QIS Data Engineer

Sertis Bangkok

Full Time
Posted 1 month ago

We aim to improve the world by using data and AI.


QIS Capital is an institutional investment fund created by financiers and technologists. We use AI, big data, and cloud computing to
analyse financial market data and develop price prediction models. We focus on investing in digital assets and cryptocurrencies.


As a Data Engineer, you will be responsible for designing and building state-of-the-art data infrastructure for our internal platforms.
A key focus will be collaborating and coordinating with other departments to solve their use cases with data technology, while also
keeping up to date with new technologies to improve our current stack at QIS Capital.


What makes us different? You will learn about cutting-edge and fast-moving technologies at a leading institutional investment firm in the digital asset space.


Our Data Engineer (you) will:
● Manipulate complex data from different sources
● Select and integrate the Big Data tools and frameworks required to provide requested capabilities
● Implement ETL processes
● Monitor performance and advise on any necessary infrastructure changes
● Work within a team supporting software and data system development, training, and integration and test infrastructure
● Identify and solve business challenges using large volumes of structured and unstructured data from different sources
● Integrate our system with multiple trading platforms and third parties, as well as our internal systems

What you will need:
● Bachelor's degree or higher in Computer Science, Computer Engineering, or a related field
● At least 2 years of proven work experience in data engineering
● Passion for big data, software engineering, and systems
● Excellent analysis and reasoning about system behaviour
● Knowledge of SQL, streaming data processing, and Python, Scala, Java, or Elixir
● Good communication skills; ability to work independently or in a team environment
● Experience with one of the following: Hadoop, Apache Hive, Apache Flink, Amazon EMR, Apache Kafka, AWS Lambda,
Amazon DynamoDB, Amazon EKS, or related technologies
● Ideally, experience connecting to multiple trading platforms and exchanges, and the ability to build a data
pipeline system that covers the entire lifecycle
● Google Cloud or AWS experience
● A “CAN DO” attitude and friendly disposition

Job tags: AI AWS Big Data Engineering ETL Hadoop Java Kafka Python Scala SQL
Job region(s): Asia/Pacific