Senior Cloud Engineer / Big Data Engineer - Remote, USA or LATAM

Remote - Buenos Aires, Buenos Aires, Argentina

Applications have closed

Caylent

Caylent is an AWS cloud services company that helps organizations thrive in a technology-centric world.


AWS Data Engineer

We are looking for a skilled AWS Data Engineer to join our growing data team and help build production-grade data solutions. You will enable our customers to manage, transform, and analyze their data more efficiently at scale, delivering faster and better insights that drive their growth and transformation.

Description

Day to day:

  • Serve as a key figure within our service delivery team.
  • Engage in a highly collaborative team environment where your expertise will help create and maintain scalable and cost-efficient data architectures.
  • Assemble large, complex data sets that meet our customers' business requirements.
  • Design, build and deploy large scale data solutions using AWS data and analytics services in combination with 3rd-party software.
  • Design, build and deploy production-ready data pipelines to extract, transform, and load data from various data sources using SQL and Big Data technologies.
  • Perform detailed assessments of existing data platforms and create suitable transition plans to help our customers fully leverage the advantages of the AWS cloud.
  • Re-architect or re-platform legacy data warehouses to data platforms based on the AWS cloud.
  • Collaborate with customer stakeholders to assist with data-related technical issues and support their data infrastructure needs.
  • Take on work aligned with existing projects and deliverables, and research new initiatives (R&D).

Requirements

Must-have skills:

  • Proficiency in English, both verbal and written.
  • Ability to communicate technical objectives and solutions clearly to clients while establishing and cultivating professional relationships.
  • Demonstrated ability to think critically and strategically about business, product, and technical challenges, and to communicate that thinking to stakeholders.
  • Strong project management and organizational skills.
  • Advanced SQL knowledge and experience working with relational databases.
  • Strong experience designing and implementing fully operational Data Lake or Data Warehousing solutions at scale.
  • Experience optimizing existing Big Data architectures and data pipelines.
  • Strong analytical skills for working with and extracting value from large, unstructured, or disconnected datasets.
  • Strong hands-on experience with Amazon Web Services (AWS).
  • Experience with Big Data tools and frameworks such as Hadoop/Spark.
  • Experience with programming languages like Python, Java or Scala.
  • Expert knowledge of relational SQL databases such as PostgreSQL.
  • Experience with data pipeline and workflow management tools such as Luigi and Airflow.
  • Experience with AWS analytics, data warehousing, and data movement services: Redshift, Redshift Spectrum, Glue, Glue DataBrew, S3, Athena, EMR.

Nice-to-have qualifications:

  • Graduate degree in Computer Science, Informatics, Statistics, Information Systems or related field.
  • AWS Big Data or Data Analytics Specialty certification.
  • AWS Developer Associate certification.
  • AWS Solutions Architect Associate or Professional certification.
  • Experience with Business Intelligence (BI) and analytics tools such as Metabase or Tableau.
  • Experience automating workflows and processes with Serverless technologies such as AWS Lambda and Step Functions.
  • Experience working with message queuing and stream processing technologies such as Kafka and Kinesis.
  • Experience with Snowflake.
  • Exposure to Linux.

Benefits

  • 100% remote; work from home or in a shared WeWork space
  • Competitive salary in USD
  • Generous holidays and unlimited paid time off
  • Bonus payout for each new certification
  • Paid for exams and certifications
  • Work with cutting-edge technology - NO legacy systems!
  • No after hour on-call work
  • Paid online continuing training
  • Internal training
  • Sophisticated internal library
  • Individual professional development plan
  • Company paid English classes and tutoring available upon request
  • Fun corporate events
  • DevOps & cloud community-centric corporate culture including Meetups, Conferences etc.

