Senior Data Engineer II


Applications have closed


Company: DoubleVerify

Role: Senior Data Engineer II (P4) - Social Integrations

Type: Full Time

Reports To: Team Lead, Social Integration

Location: New York, NY

Who we are

DoubleVerify is the leading independent provider of marketing measurement software, data, and analytics that authenticates the quality and effectiveness of digital media for the world's largest brands and media platforms. DV provides media transparency and accountability to deliver the highest level of impression quality for maximum advertising performance. Since 2008, DV has helped hundreds of Fortune 500 companies gain the most from their media spending by delivering best-in-class solutions across the digital ecosystem, helping to build a better industry. Learn more at www.doubleverify.com.

The Role

As a Senior Data Engineer II in Social Integrations, you own new initiatives and design and build world-class data systems that ingest and analyze billions of records per day from the world's biggest social platforms, such as Facebook, YouTube, and Snap. You build high-volume, high-availability, low-latency APIs to support live traffic-metrics collection. You develop systems that analyze and categorize social content, powering tools that let advertisers understand and control where their ads run. You use state-of-the-art technologies, frameworks, and strategies to address complex big-data processing and analytics challenges in pursuit of these objectives.

What you’ll do

  • Write solid, high-performance code for services that demand high throughput and low latency
  • Architect, design, and build big-data processing platforms that handle tens of TB per day, serve thousands of clients, and support advanced analytics workloads
  • Provide meaningful and relevant feedback to junior developers and stay up-to-date with technology changes
  • Explore the technological landscape for new ways of producing, processing, and analyzing data, in order to gain insights on our users and product features
  • Design, develop, and test data-driven products, features, and APIs that scale
  • Continuously improve the quality of deliverables and SDLC processes
  • Operate production environments, investigate issues, assess their impact, and come up with feasible solutions
  • Understand business needs and work with product owners to establish priorities 
  • Translate between business / product requirements and technical details
  • Work in cross-functional agile teams with end-to-end responsibility for product development and delivery

Who you are

  • 5+ years of programming experience in object-oriented design and/or functional programming, including Python, Scala, or a related language
  • Lead by example: design, develop, and deliver quality solutions
  • Love what you do and are passionate about crafting clean code 
  • Deep understanding of distributed-systems technologies, standards, and protocols, with 2+ years of experience building data pipelines at scale using tools such as Airflow, BigQuery, Spark Streaming, the Kafka ecosystem (Kafka Connect, Kafka Streams), or Kinesis
  • Hands-on experience building low-latency, high-throughput APIs, and comfort consuming external platform APIs
  • Expertise in relational database concepts, data modeling, and crafting complex SQL queries
  • Care about agile software processes, data-driven development, software reliability, and responsible experimentation
  • Genuine desire to automate decision-making, processes, and workflows
  • Experience working with process orchestration tools such as Luigi/Airflow
  • Experience in the DevOps domain: build servers, Docker, and container clusters (Kubernetes)
  • Experience mentoring and growing a diverse team of talented data engineers
  • B.S./M.S. in Computer Science or a related field
  • Excellent communication skills; a team player
  • Experience with the following technologies is a plus:
    • Columnar data stores
    • Cloud environments, particularly Google Cloud Platform
    • Ad-serving technologies and standards

The successful candidate's starting salary will be determined based on a number of non-discriminatory factors, including qualifications for the role, level, skills, experience, location, and internal equity relative to peers at DV.

The estimated salary range for this role, based on the qualifications set forth in the job description, is $99,000 to $197,000. This role will also be eligible for a bonus/commission (as applicable), equity, and benefits.

The range above reflects the expectations laid out in the job description; however, we are often open to a wide variety of profiles and recognize that the person we hire may be more or less experienced than this posting describes.

Don’t meet every single requirement? Studies have shown that women and people of color are less likely to apply to jobs unless they meet every single qualification. At DoubleVerify we are dedicated to building a diverse, inclusive and authentic workplace, so if you’re excited about this role but your past experience doesn’t align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right candidate for this or other roles!



Tags: Agile Airflow APIs BigQuery Computer Science Data pipelines DevOps Distributed Systems Docker GCP Google Cloud Kafka Kinesis Kubernetes Pipelines Python Scala SDLC Spark SQL Streaming

Perks/benefits: Equity, salary bonus

Region: North America
Country: United States
Category: Engineering Jobs
