Data Engineer

Pune, India

Applications have closed
Snowflake Inc.

Posted 1 month ago

About Snowflake

Snowflake is growing fast and we’re scaling our team to help enable and accelerate our growth. We’re passionate about our people, our customers, our values and our culture! We’re also looking for people with a growth mindset and the pragmatic insight to solve for today while building for the future. And as a Snowflake employee, you will be accountable for supporting and enabling diversity and belonging.

Snowflake started with a clear vision: make modern data warehousing effective, affordable, and accessible to all data users. Because traditional on-premises and cloud solutions struggle with this, Snowflake developed an innovative product with a new built-for-the-cloud architecture that combines the power of data warehousing, the flexibility of big data platforms, and the elasticity of the cloud at a fraction of the cost of traditional solutions.

We’re looking for a strong Data Engineer to build state-of-the-art data pipelines for Snowflake. In this role, you will work closely with many cross-functional teams to build pipelines that ingest data into our internal Snowflake environment. This is a strategic, high-impact role that will also help shape the future of Snowflake products and services.

Responsibilities

  • Architect and build data pipelines for Snowflake. 
  • Manage and improve the integrity and reliability of data services. 
  • Build data frameworks to onboard new data into our Snowflake data warehouse. 
  • Foster collaboration among engineering teams, IT, and other business groups to ensure data access is secure and auditable. 
  • Train distributed team members on data pipelines. 

What you will need

  • 5+ years of experience in data warehousing, data modeling, and SQL.
  • 3+ years of experience working on a public cloud (AWS, Azure, or GCP).
  • 3+ years of experience with MPP or cloud data warehouse solutions such as Snowflake, Redshift, BigQuery, or Teradata.
  • Experience building ELT-based data pipelines is useful.
  • Experience sourcing and modeling data from application APIs.
  • Strong communication and cross-functional collaboration skills.

Preferred Qualifications:

  • BS or MS in Computer Science, or equivalent practical experience.
  • 5+ years of experience building data pipelines using Python/Java and SQL.
  • Experience with ETL tools is nice to have. 

Snowflake is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, color, gender identity or expression, marital status, national origin, disability, protected veteran status, race, religion, pregnancy, sexual orientation, or any other characteristic protected by applicable laws, regulations, and ordinances. 

Job tags: AWS Big Data Data Warehousing Engineering ETL Java MPP Python Redshift SQL