Data Architect

Buenos Aires, Argentina

Applications have closed

Company Description

Publicis Sapient is a digital transformation partner helping established organizations get to their future, digitally-enabled state, both in the way they work and the way they serve their customers. We help unlock value through a start-up mindset and modern methods, fusing strategy, consulting and customer experience with agile engineering and problem-solving creativity. United by our core values and our purpose of helping people thrive in the brave pursuit of next, our 20,000+ people in 53 offices around the world combine experience across technology, data sciences, consulting and customer obsession to accelerate our clients’ businesses through designing the products and services their customers truly value.

Job Description

Publicis Sapient is looking for a Data Architect to join our team of bright thinkers and doers. You will team with top-notch technologists to enable real business outcomes for our enterprise clients by translating their needs into transformative solutions that provide valuable insight. Working with the latest data technologies in the industry, you will be instrumental in helping the world's most established brands evolve for a more digital future.


Your Impact:

  • Play a key role in delivering data-driven interactive experiences to our clients
  • Work closely with our clients in understanding their needs and translating them to technology solutions
  • Provide expertise as a technical resource to solve complex business issues that translate into data integration and database systems designs
  • Problem solving to resolve issues and remove barriers throughout the lifecycle of client engagements
  • Ensuring all deliverables are high quality by setting development standards, adhering to the standards and participating in code reviews
  • Participate in integrated validation and analysis sessions of components and subsystems on production servers
  • Mentor, support and manage team members

Requirements

Your Skills & Experience:

  • Demonstrable experience in enterprise-level data platforms involving implementation of end-to-end data pipelines
  • Strong communication skills and willingness to work as part of a team
  • Hands-on experience with at least one of the leading public cloud data platforms (Amazon Web Services, Azure or Google Cloud)
  • Experience with column-oriented database technologies (e.g., BigQuery, Redshift, Vertica), NoSQL database technologies (e.g., DynamoDB, Bigtable, Cosmos DB) and traditional database systems (e.g., SQL Server, Oracle, MySQL)
  • Experience architecting data pipelines and solutions for streaming and batch integrations using tools/frameworks like Glue ETL, Lambda, Google Cloud DataFlow, Azure Data Factory, Spark, Spark Streaming, etc.
  • Ability to handle multiple responsibilities simultaneously, both leading and contributing to tasks hands-on
  • Understanding of data modeling, warehouse design, and fact/dimension concepts
  • High level of English

Set Yourself Apart With:

  • Certifications for any of the cloud services like AWS, GCP or Azure
  • Experience working with code repositories and continuous integration
  • Understanding of development and project methodologies
  • Willingness to travel
