Data Engineer

La Ribera de Belen, Heredia, Costa Rica, 40702

Applications have closed

 

Want to build a stronger, more sustainable future and cultivate your career? Join Cargill's global team of 160,000 employees who use new technologies, dynamic insights and over 157 years of experience to connect farmers with markets, customers with ingredients, and people and animals with the food they need to thrive.

JOB PURPOSE AND IMPACT

 

The Data Engineer will design, build and operate high-performance, data-centric solutions utilizing the comprehensive big data capabilities of the company’s data platform environment. In this role, you will act as an authority on data access pathways and techniques, working with analysts within the functional data analytics team. You will design data structures and pipelines to collect data, and design and implement data transformations, combinations and aggregations.

KEY ACCOUNTABILITIES

 

  • Collaborate with businesses, application and process owners, and product team members to define requirements and design solutions for the company’s big data and analytics solutions.
  • Participate in the decision-making process related to architecting solutions.
  • Develop technical solutions utilizing big data and cloud-based technologies, ensuring they are designed and built to be sustainable and robust.
  • Perform data modeling and prepare data in databases for use in various analytics tools, and configure and develop data pipelines to move and optimize data assets.
  • Provide necessary technical support through all phases of the solution life cycle.
  • Build prototypes to test new concepts and be a key contributor of ideas and code that improve the core software infrastructure, patterns and standards.
  • Help drive the adoption of new technologies and methods within the functional data and analytics team and be a role model and mentor for data engineers.
  • Independently handle complex issues with minimal supervision, while escalating only the most complex issues to appropriate staff.
  • Other duties as assigned.

QUALIFICATIONS

 

MINIMUM QUALIFICATIONS

  • Bachelor’s degree in a related field or equivalent experience
  • Proficiency in at least one backend programming language, such as Java, Python or Go
  • At least 2 years of experience using SQL or NoSQL databases, including performance tuning
  • Understanding of cloud computing
  • Experience with Unix shell scripting
  • Experience in API development, ETL processes and building CI/CD pipelines
  • Experience developing software using agile methodologies such as Scrum and/or Kanban
  • Results orientation and the ability to work in ambiguous situations where requirements are unclear

 

PREFERRED QUALIFICATIONS

  • Experience with data modeling, data warehousing and/or analytical tools
  • Experience in Big Data, using platforms such as a Hadoop Distribution (preferably Cloudera), Snowflake, Databricks, or other Big Data Platform
  • Experience in building batch and streaming data pipelines using Sqoop, Kafka and/or Spark
  • Experience working with various types of data sources such as SAP, JDE and other ERP systems



