Big Data Engineer, DevOps Architect

Remote - San José, San José Province, Costa Rica

Caylent

Caylent is an AWS cloud services company that helps organizations thrive in a technology-centric world.

Caylent, a USA-based DevOps-as-a-Service provider, seeks a Big Data Engineer with Hadoop and some DevOps experience.

We specialize in Kubernetes for cloud-native projects on AWS, GCP, and serverless platforms. Our engineers are leaders and mentors who wear the many technology hats of a small boutique firm without the corporate headaches.

Enjoy the benefits of a full-time, permanent position where you work remotely from home and communicate in English with interesting clients in the machine learning and blockchain industries.

NOTE: Work visa sponsorship is not available.

Requirements

  • Professional level English fluency and excellent written and verbal communication skills
  • 3+ years of recent SysAdmin experience with Hadoop, Cassandra, MySQL, and Hadoop ecosystem components such as Spark, HDFS, YARN, Hive, Kafka, and Sqoop
  • 3+ years of data analytics experience
  • Hands-on experience with cloud infrastructure environments (AWS, GCP)
  • Staff level or higher DevOps Engineer experience
  • Bachelor's degree in Engineering or Computer Science preferred
  • GCP, AWS or other Professional Certificates preferred
  • DevOps Certificates a plus

Duties

  • Build and manage Hadoop production clusters, Kafka, Hive, and HDFS
  • Provide full life-cycle support for GCP-hosted data systems
  • Work with the team on the strategic shift from on-prem applications, deploying workloads to GCP
  • Develop and maintain Database-as-a-Service and Kubernetes-as-a-Service
  • Configure and maintain Puppet, SaltStack, and Cobbler configurations for infrastructure automation, including patch management and bare-metal server deployment
  • Implement solutions for cloud infrastructure, containerized application workloads, platform scalability, or CI/CD pipelines and tooling
  • Develop and improve operational practices and procedures
  • Bring new features and services into production
  • Design Infrastructure as Code with Terraform
  • Perform release automation, system administration, system configuration, and system debugging

Benefits

  • Big Data / DevOps hybrid engineer role - an opportunity to grow your DevOps expertise
  • Compensation paid in USD
  • Generous paid time off
  • Work from home or shared office space
  • Paid internal training programs and memberships to online training platforms
  • Company-paid certification courses and exams, plus a bonus for each certification
  • Work with cool, new tech - NO legacy systems
  • Exciting and challenging work
  • Work with great people and supportive management

Tags: AWS Big Data Cassandra CI/CD Computer Science Data Analytics DevOps Engineering GCP Hadoop HDFS Kafka Kubernetes Machine Learning MySQL Pipelines Spark Terraform

Perks/benefits: Career development

Regions: Remote/Anywhere North America
Country: Costa Rica