AWS Data Engineer


Redwood Materials

If you love sustainability, chemistry, materials science, advanced equipment design, automation, and seeing your hard work pay off, you will enjoy building your career here.

About Redwood Materials

Redwood Materials was founded in 2017 to create a circular supply chain for electric vehicles and clean energy products, making them more sustainable and driving down the cost of batteries. We’re doing this by developing and deploying new technologies to increase the scope and scale of recycled and sustainable materials in the global battery supply chain.

AWS Data Engineer 

Redwood Materials is looking for motivated and talented data engineers to help model and manage data assets in a data lake architecture, overseeing the full data lifecycle from ingestion to processing to consumption. The ideal candidate has experience both with data engineering and with creating, managing, and supporting AWS infrastructure. This is an opportunity to join during a critical growth phase and build greenfield software experiences and capabilities that will have a significant impact on the company’s day-to-day operations and ability to scale.

 Responsibilities will include: 

  • Build and manage a data lake in AWS, leveraging and augmenting the existing Lake Formation-based architecture. 
  • Build and maintain data pipelines from a variety of data sources, including streaming datasets, APIs, and various data stores, leveraging PySpark and AWS Glue (a minimal sketch follows this list). 
  • Create datasets from the data lake to support various use cases, such as business analytics, dashboards, reports, and machine learning. 
  • Drive technical decisions on the best ways to serve data consumers (data warehouse vs. search-engine databases, etc.).
  • Leverage existing AWS architectures and design new ones where needed, using the AWS CDK toolkit. 
  • Operationalize data workloads in AWS, automating pipelines and implementing appropriate monitoring. 
  • Work with cross-functional teams to discover business needs and design appropriate data flows. 
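
To give a concrete flavor of this pipeline work, below is a minimal sketch of an AWS Glue PySpark job that reads raw JSON events from a landing zone and writes partitioned Parquet to a curated zone. The bucket paths, column names, and schema are illustrative assumptions, not Redwood's actual data model:

    import sys

    from awsglue.context import GlueContext
    from awsglue.job import Job
    from awsglue.utils import getResolvedOptions
    from pyspark.context import SparkContext
    from pyspark.sql import functions as F

    # Resolve the job name supplied by the Glue runtime.
    args = getResolvedOptions(sys.argv, ["JOB_NAME"])

    sc = SparkContext()
    glue_context = GlueContext(sc)
    spark = glue_context.spark_session
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Read raw JSON events from the landing zone (placeholder path).
    raw = spark.read.json("s3://example-landing-zone/events/")

    # Normalize the event timestamp and derive a date column to partition on.
    cleaned = (
        raw.withColumn("event_ts", F.to_timestamp("event_ts"))
           .withColumn("event_date", F.to_date("event_ts"))
    )

    # Write partitioned Parquet to the curated zone for downstream consumers
    # (dashboards, warehouse loads, ML feature extraction).
    (
        cleaned.write
               .mode("append")
               .partitionBy("event_date")
               .parquet("s3://example-curated-zone/events/")
    )

    job.commit()

A job like this would typically be scheduled by a Glue trigger or workflow, with its metrics feeding the monitoring mentioned above.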

 Desired qualifications:  

  • Bachelor’s degree in computer science, a similar technical field of study, or equivalent practical experience.
  • Minimum 3 years of hands-on experience developing data solutions in a modern cloud environment. 
  • Fluency in Python.
  • Experience authoring and maintaining ETL jobs (PySpark experience a plus).
  • Experience designing and interacting with relational and non-relational data stores.
  • Experience with the AWS ecosystem and its resources, and with infrastructure-as-code methodologies (CDK a plus; see the sketch after this list).
  • Demonstrated ability to manage production data workloads (detecting and diagnosing issues, monitoring).
  • A passion for excellence and making the world a better place!  
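
Since the role calls for infrastructure-as-code, here is a minimal, hypothetical AWS CDK (Python) stack that provisions a curated-zone bucket, a Glue job, and a CloudWatch alarm on failed tasks. All resource names, the script location, and the alarm threshold are placeholder assumptions:

    from aws_cdk import Duration, Stack
    from aws_cdk import aws_cloudwatch as cloudwatch
    from aws_cdk import aws_glue as glue
    from aws_cdk import aws_iam as iam
    from aws_cdk import aws_s3 as s3
    from constructs import Construct


    class DataLakeStack(Stack):
        def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
            super().__init__(scope, construct_id, **kwargs)

            # Curated-zone bucket for the Parquet output (placeholder name).
            curated = s3.Bucket(self, "CuratedZone", versioned=True)

            # Execution role assumed by the Glue job.
            role = iam.Role(
                self, "GlueJobRole",
                assumed_by=iam.ServicePrincipal("glue.amazonaws.com"),
            )
            curated.grant_read_write(role)

            # Glue ETL job pointing at a PySpark script like the one sketched above.
            glue.CfnJob(
                self, "EventsEtl",
                name="events-etl",
                role=role.role_arn,
                command=glue.CfnJob.CommandProperty(
                    name="glueetl",
                    python_version="3",
                    script_location="s3://example-artifacts/jobs/events_etl.py",
                ),
                glue_version="4.0",
            )

            # Alarm on failed Spark tasks so broken pipelines surface quickly.
            cloudwatch.Alarm(
                self, "EtlFailureAlarm",
                metric=cloudwatch.Metric(
                    namespace="Glue",
                    metric_name="glue.driver.aggregate.numFailedTasks",
                    dimensions_map={
                        "JobName": "events-etl",
                        "JobRunId": "ALL",
                        "Type": "count",
                    },
                    statistic="Sum",
                    period=Duration.minutes(5),
                ),
                threshold=1,
                evaluation_periods=1,
            )

Deploying such a stack follows the standard cdk synth / cdk deploy workflow; in a real deployment, Lake Formation permissions would be layered on top.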


The position is full-time and remote (anywhere). Compensation will be commensurate with experience.


We collect personal information (PI) from you in connection with your application for employment with Redwood Materials, including the following categories of PI: identifiers, personal records, professional or employment information, and inferences drawn from your PI. We collect your PI for our purposes, including performing services and operations related to your potential employment. If you have additional privacy-related questions, please contact us at
