Data Engineer III - AMZ4539

Cupertino, California, USA

Full Time

MULTIPLE POSITIONS AVAILABLE
Entity: Amazon.com Services LLC, an Amazon.com Company
Title: Data Engineer III
Worksite: Cupertino, CA

Position Responsibilities:
Design, implement, and support ETL pipelines for enterprise-scale datasets using SQL and Big Data tools such as Hive, Pig, Spark, and Presto. Model data and metadata to support ad hoc and pre-built reporting. Interface with business customers to gather requirements and deliver complete BI solutions. Tune application and query performance using Unix profiling tools and SQL. Provide guidance and support to engineers on industry best practices and direction. Evaluate and select data processing tools, both internally developed and industry-standard.
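For illustration only (not part of the posting), an ETL pipeline of the kind described above might look like the following minimal PySpark sketch; the bucket paths, table layout, and column names are assumptions invented for the example.

# Minimal PySpark ETL sketch. All paths and columns are hypothetical;
# this only illustrates the extract-transform-load pattern described above.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_etl").getOrCreate()

# Extract: raw order events landed as Parquet (hypothetical location and schema).
raw = spark.read.parquet("s3://example-bucket/raw/orders/")

# Transform: basic filtering plus a daily revenue aggregate per marketplace.
daily = (
    raw.filter(F.col("order_status") == "COMPLETED")
       .withColumn("order_date", F.to_date("order_timestamp"))
       .groupBy("order_date", "marketplace_id")
       .agg(F.sum("order_amount").alias("revenue"),
            F.countDistinct("order_id").alias("order_count"))
)

# Load: write a partitioned table for downstream ad hoc and pre-built reporting.
(daily.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3://example-bucket/curated/daily_revenue/"))

spark.stop()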

Amazon.com is an Equal Opportunity-Affirmative Action Employer – Minority / Female / Disability / Veteran / Gender Identity / Sexual Orientation #0000

Basic Qualifications

Position Requirements:

Master's degree or foreign equivalent in Computer Science, Engineering, Information Systems, Mathematics, or a related field and one year of experience in the job offered, or as an Operations Research Analyst, Database Developer, or a related occupation. Employer will accept a Bachelor's degree or foreign equivalent in Computer Science, Engineering, Information Systems, Mathematics, or a related field and five years of experience in the job offered or a related occupation as equivalent to the Master's degree and one year of experience. Must have one year of experience in the following skill(s): developing and operating large-scale data structures for business intelligence analytics using each of the following: ETL/ELT processes; OLAP technologies; data modeling; SQL; and Oracle.

Preferred Qualifications

Applicants must meet all of the above-listed requirements for this position.
Job tags: Big Data, Business Intelligence, Engineering, ETL, Oracle, Research, Spark, SQL