Big Data DevOps Engineer (mid-senior-lead)

Europe

Applications have closed

Binance

Binance cryptocurrency exchange - We operate the world's biggest bitcoin and altcoin crypto exchange by volume

Binance is the global blockchain company behind the world’s largest digital asset exchange by trading volume and users, serving a greater mission to accelerate cryptocurrency adoption and increase the freedom of money.
Are you looking to be a part of the most influential company in the blockchain industry and contribute to the cryptocurrency revolution that is changing the world?
As one of the biggest players in cryptocurrency exchange technology, we have an immense amount of data, and we want to leverage it to build strategies, optimize our stack and make informed decisions that help our business grow.
We are looking for a driven and hard-working Big Data Engineer who will take ownership of our Big Data infrastructure. Ideally, you have extensive knowledge of big data solutions and experience with large-scale, highly critical applications. Your tasks will include designing big data pipelines and infrastructure for our team, continuously improving our solutions, and providing analysis and reports for our stakeholders. You will work with both tech and business teams.
This is a full-time position that can be remote from any location.

Responsibilities

  • Ensure performance and availability of big data infrastructure
  • Deploy and operate big data components according to business scenarios
  • Respond to incidents promptly and identify potential issues
  • Keep learning big data technologies and continuously optimize performance, availability and scalability

Qualifications

  • 3-6+ years of Big Data DevOps experience
  • Familiar with data pipelines, including batch ETL, real-time streaming and job scheduling
  • Hands-on experience with big data components including Hadoop, Hive, Spark, Spark Streaming, Presto, HBase, Elasticsearch, Kafka, ZooKeeper, Redis and Airflow
  • Experienced in deploying, operating, monitoring, optimizing and troubleshooting large-scale big data infrastructure
  • Proficient in shell scripting, Python and SQL
  • Experience with AWS big data services is a plus
  • Team player with an analytical mind, an optimistic attitude, a willingness to take on challenges and the ability to respond quickly to incidents
#LI-Remote #LI-EF1

Conditions

  • Do something meaningful; be a part of the future of finance technology and the no. 1 company in the industry
  • Fast-moving, challenging and unique business problems
  • International work environment and flat organisation
  • Great career development opportunities in a growing company
  • Possibility for relocation and international transfers mid-career
  • Competitive salary
  • Flexible working hours, casual work attire

Tags: Airflow AWS Big Data Blockchain Crypto Data pipelines DevOps Elasticsearch ETL Finance Hadoop HBase Kafka Pipelines Python Spark SQL Streaming

Perks/benefits: Career development Competitive pay Flat hierarchy Flex hours Relocation support

Regions: Remote/Anywhere Europe
