Lead Big Data Administrator

Bengaluru, India

Publicis Groupe


Company Description

When you’re one of us, you get to run with the best. For decades, we’ve been helping marketers from the world’s top brands personalize experiences for millions of people with our cutting-edge technology, solutions and services. Epsilon’s best-in-class identity gives brands a clear, privacy-safe view of their customers, which they can use across our suite of digital media, messaging and loyalty solutions. We process 400+ billion consumer actions each day and hold many patents of proprietary technology, including real-time modeling languages and consumer privacy advancements. Thanks to the work of every employee, Epsilon India is now Great Place to Work-Certified™. Epsilon has also been consistently recognized as industry-leading by Forrester, Adweek and the MRC. Positioned at the core of Publicis Groupe, Epsilon is a global company with more than 8,000 employees around the world. For more information, visit epsilon.com/apac or our LinkedIn page.

    Job Description

    About BU

    At the heart of everything we do is data and this team. Our premium data assets empower the team to drive desirable outcomes for leading brands across industries. Armed with high volumes of transactional data, digital expertise and unmatched data quality, the team plays a key role in improving all our product offerings. Our data artisans are keen on embracing the latest in technology and trends, so there’s always room to grow and something new to learn here.

    Why are we looking for you?

    You are expected to be familiar with database architecture, logical and physical design, automation, documentation, installations, shell scripting, Azure Databricks administration, backup and recovery concepts, and database performance tuning.

    What will you enjoy in this role?

    We are seeking a highly motivated and well-rounded Azure Databricks DBA to join a fast-paced production support and full-lifecycle implementation team. Candidates need exposure to AWS/Azure cloud environments; Linux and shell scripting skills are preferred.

    What you’ll do

    • 3+ years' experience with the Databricks Lakehouse Platform
    • Experience as the Databricks account owner, managing workspaces, AWS accounts, audit logs, and high-level usage monitoring.
    • Experience as Databricks workspace admin, managing workspace users and groups including single sign-on, provisioning, access control, and workspace storage.
    • Demonstrated ability to troubleshoot end-user and platform-level issues.
    • Familiarity with configuring cluster and job settings in Databricks.
    • Monitor and manage platform costs, including optimizing resource utilization to control expenses.
    • Implement and enforce security policies, access controls, and encryption mechanisms.
    • Strong understanding of S3 access management across a diverse user base.
    • Monitor and manage auto-scaling to ensure resources are allocated efficiently (see the sketch after this list).
    • Stay up to date with security best practices and compliance requirements.
    • Maintain detailed documentation of configurations, procedures, and best practices.
    • Interact with vendor support to ensure open issues are resolved within defined SLAs.
    • Conduct training sessions to educate users on platform best practices and capabilities.
    • Collaborate with cross-functional teams, including data scientists, data engineers, and business analysts, to understand their requirements and provide technical solutions.
    • Collaborate with data engineers to integrate Databricks with other data sources, data warehouses, and data lakes.
    • Provide 24x7x365 support through an on-call rotation.

    Qualifications

    • B.S. in Computer Science or a related discipline, or equivalent work experience, is required. Databricks Platform Administration and/or AWS certification is a plus.
    • 8+ years of general technology experience (preferably database administration)
    • 3+ years' experience with the Databricks Lakehouse Platform (AWS preferred)
    • Strong knowledge of Python, Apache Spark and Databricks architecture.
    • Proficiency in cloud platforms like AWS, Azure, or Google Cloud.
    • Experience with Linux/Unix system administration.
    • Strong Unix/Linux and shell scripting skills
    • Competence to read and write SQL code.
    • Ability to diagnose problems and resolve issues across various tiers
    • Adherence to SDLC, Change Management, troubleshooting and development methodologies.
    • Ability to perform work with minimal supervisory direction
    • Excellent written and verbal communication skills
    • Superb analytical and problem-solving skills


