Data Engineer
Seattle, Washington, USA

Amazon.com
Are you excited about the digital media revolution and passionate about designing and delivering advanced analytics that directly influence the product decisions of Amazon's digital businesses? Do you see yourself as a champion of innovating on behalf of the customer by turning data insights into action?
The Amazon Digital Content and Commerce Services (DCCS) Analytics team is looking for an analytical and technically skilled individual to join our team. In this role, you will play a critical part in developing foundational data instrumentation components to seamlessly surface relevant digital content to Amazon customers.
The ideal candidate has deep data engineering skills in ETL, data modeling, database architecture, and big data solutions.
You should have strong business judgment and excellent written and verbal communication skills.
Basic Qualifications
This position requires a Bachelor's Degree in Computer Science or a related technical field, and 5+ years of experience.
· 5+ years of relevant work experience in big data engineering, ETL, data modeling, and data architecture.
· Expert-level skills in writing SQL and in query performance tuning.
· Experience with big data technologies such as Hive/Spark/EMR, Redshift, Teradata, etc.
· Proficiency in at least one scripting language, such as Python, Ruby, or Java.
· Demonstrated ability in data modeling, ETL development, and data warehousing.
· Experience operating very large data warehouses and data lakes, and building data pipelines.
Preferred Qualifications
· Master's or Bachelor's degree in Computer Science, Engineering, or a related field.
· Experience in ETL optimization and in designing, coding, and tuning big data processes.
· Experience with big data technologies (Hadoop, Hive, HBase, Pig, Spark, etc.).
· Experience in large-scale data warehousing projects using AWS technologies (Redshift, S3, EMR, Lambda), including designing and implementing batch data processing pipelines and optimizing the distribution, partitioning, and MPP layout of high-level data structures.
· Coding proficiency in at least one modern programming language (Python, Ruby, Java, etc.).
· Knowledge of Engineering and Operational Excellence using standard methodologies.
Job tags:
AWS
Big Data
Data Warehousing
Engineering
ETL
Hadoop
Java
MPP
Python
Redshift
Ruby
Spark
SQL
Job region(s):
North America