AWS Data Engineer
Remote - Argentina
Applications have closed
Caylent
Caylent is an AWS cloud services company that helps organizations thrive in a technology-centric world.
We are looking for a skilled AWS Data Engineer to join our growing data team and help build production-grade data solutions. You will enable our customers to manage, transform, and analyze their data more efficiently at scale, delivering faster and better insights that drive their growth and transformation.
Description
Day to day:
- Serve as a key figure within our service delivery team.
- Engage in a highly collaborative team environment where your expertise will help create and maintain scalable and cost-efficient data architectures.
- Assemble large, complex data sets that meet our customers' business requirements.
- Design, build and deploy large scale data solutions using AWS data and analytics services in combination with 3rd-party software.
- Design, build and deploy production-ready data pipelines to extract, transform, and load data from various data sources using SQL and Big Data technologies.
- Perform detailed assessments of existing data platforms and create suitable transition plans to help our customers fully leverage the advantages of the AWS cloud.
- Re-architect or re-platform legacy data warehouses to data platforms based on the AWS cloud.
- Collaborate with customer stakeholders to assist with data-related technical issues and support their data infrastructure needs.
- Take on work aligned with current projects and deliverables, and research new projects (R&D).
Requirements
Must-have skills:
- Professional level of English communication.
- Ability to communicate technical objectives and solutions clearly to clients while establishing and cultivating professional relationships.
- Demonstrated ability to think critically and strategically about business, product, and technical challenges, and to communicate that thinking to stakeholders.
- Strong project management and organizational skills.
- Advanced SQL knowledge and experience working with relational databases.
- Strong experience designing and implementing fully operational Data Lake or Data Warehousing solutions at scale.
- Experience optimizing existing Big Data architectures and data pipelines.
- Strong analytical skills for working with and extracting value from largely unstructured or disconnected datasets.
- Strong hands-on experience with Amazon Web Services.
- Experience with Big Data tools and frameworks such as Hadoop/Spark.
- Experience with programming languages like Python, Java or Scala.
- Expert knowledge of relational SQL databases such as PostgreSQL.
- Experience with data pipeline and workflow management tools like Luigi and Airflow.
- Experience with analytics, data warehousing and data movement AWS services: Redshift, Redshift Spectrum, Glue, Glue Databrew, S3, Athena, EMR.
Nice-to-have qualifications:
- Graduate degree in Computer Science, Informatics, Statistics, Information Systems or related field.
- AWS Big Data or Data Analytics Specialty Certification.
- AWS Developer Associate Certification.
- AWS Solutions Architect Associate or Professional Certification.
- Experience with Business Intelligence (BI) and analytics tools such as Metabase or Tableau.
- Experience automating workflows and processes with Serverless technologies such as AWS Lambda and Step Functions.
- Experience working with message queuing and stream processing technologies such as Kafka and Kinesis.
- Experience with Snowflake.
- Exposure to Linux.
Benefits
- 100% remote; work from home or in a shared WeWork space
- Competitive salary in USD
- Generous holidays and unlimited paid time off
- Bonus payout for each new certification
- Paid exams and certifications
- Work with cutting-edge technology - NO legacy systems!
- No after hour on-call work
- Paid online continuing training
- Internal training
- Sophisticated internal library
- Individual professional development plan
- Free English Courses
- Fun corporate events
- DevOps & cloud community-centric corporate culture including Meetups, Conferences etc.
About Caylent:
Caylent is a specialty consultancy and DevOps/Cloud managed services provider (DevOps-as-a-Service) for software-enabled companies, mostly high-growth software startups, with an emphasis on Kubernetes. Our clients rely on Caylent engineers to architect, develop, and integrate highly complex DevOps pipelines, including build automation, CI/CD, Infrastructure as Code, security, monitoring, logging, and alerting.
Part of our value proposition is that we structure our relationships with clients as real partnerships, where we become part of their team and share in their challenges and successes. That means every Caylent engineer is client-facing. Our engineers love the challenge of working on new stuff every single day without the corporate headaches.
You will construct and plan joint backlogs, kickoff new client engagements, perform daily tasks, participate in standups, sprint planning/retrospectives, and more. You will be semi-autonomous while managing a busy schedule, meeting deadlines, and setting proper expectations with our clients.
Identify and propose technical improvements using your software engineering knowledge and principles. Foster an environment of shared learning, promote DevOps principles, and embrace the cultural aspects of the DevOps movement in a team environment.
We are only as good as each person’s contribution to the delivery. Enormous effort is put forth into nurturing each employee’s career development, professional curiosity, technical innovation, and client interaction. If these things matter to you, read on, and apply.
NOTE: Not able to provide sponsorship.