Cloud Data Architect

Chennai, Tamil Nadu, India

Ford Motor Company

Since 1903, we have helped to build a better world for the people and communities that we serve. Welcome to Ford Motor Company.


  • Design and implement data-centric solutions on Google Cloud Platform (GCP) using tools such as Storage Transfer Service, Cloud Data Fusion, Pub/Sub, Dataflow, Cloud Composer, Cloud Scheduler, gsutil, FTP/SFTP, Dataproc, and Bigtable.
  • Build ETL pipelines to ingest data from heterogeneous sources into our systems.
  • Develop data processing pipelines in languages such as Java and Python to extract, transform, and load (ETL) data.
  • Create and maintain data models, ensuring efficient storage, retrieval, and analysis of large datasets.
  • Deploy and manage SQL and NoSQL databases, such as Cloud SQL, Bigtable, or Firestore, based on project requirements.
  • Collaborate with cross-functional teams to understand data requirements and design scalable solutions that meet business needs.
  • Implement security measures and data governance policies to ensure the integrity and confidentiality of data.
  • Optimize data workflows for performance, reliability, and cost-effectiveness on the GCP infrastructure.
  • Work with data scientists and analysts to integrate machine learning models into data pipelines for predictive analytics.
  • Troubleshoot and resolve issues related to data processing, storage, and retrieval.
  • Stay updated on industry best practices and emerging technologies within the GCP ecosystem for continuous improvement of data engineering processes.
  • Collaborate with stakeholders to gather and define data requirements, ensuring alignment with business objectives.
  • Design and implement scalable, fault-tolerant solutions for data ingestion, processing, and storage.
  • Develop and maintain documentation for data engineering processes, ensuring knowledge transfer and ease of system maintenance.
  • Utilize GCP monitoring and logging tools to proactively identify and address performance bottlenecks and system failures.
  • Implement version control and CI/CD practices for data engineering workflows to ensure reliable and efficient deployments.
  • Collaborate with DevOps teams to automate infrastructure provisioning and management tasks.
  • Optimize and tune database queries and indexing strategies for improved query performance.
  • Stay abreast of advancements in GCP services and incorporate relevant technologies into data engineering solutions.
  • Provide mentorship and guidance to junior team members, fostering a collaborative and knowledge-sharing environment.
  • Participate in on-call rotations to address critical issues and ensure the reliability of data engineering systems. 
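Purely as illustration of the ETL duties listed above, the extract-transform-load pattern can be sketched in a few lines of Python. The schema, field names, and helper functions here are hypothetical examples, not Ford's actual pipeline; a production version would use managed services such as Dataflow.

```python
import csv
import io
import json

def extract(raw_csv: str) -> list[dict]:
    # Extract: parse records from one heterogeneous source (CSV here).
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    # Transform: normalize types and drop incomplete records.
    cleaned = []
    for row in rows:
        if row.get("vin") and row.get("price"):
            cleaned.append({"vin": row["vin"].upper(),
                            "price": float(row["price"])})
    return cleaned

def load(rows: list[dict]) -> str:
    # Load: serialize to newline-delimited JSON, a format that
    # BigQuery and many GCP sinks accept for ingestion.
    return "\n".join(json.dumps(r) for r in rows)

raw = "vin,price\nabc123,19999.5\n,0\n"
ndjson = load(transform(extract(raw)))
```

Running this on the sample input keeps only the complete record and emits it as one NDJSON line, ready for a downstream load step.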

Bachelor's Degree


At least 12 years of progressive experience

