Big Data Expert/Architect (Remote)

Austin, TX, United States

Applications have closed

Company Description

We are Vericast. We create meaningful connections between businesses and the people they serve: how, when and where it matters. By pushing the boundaries of data and insights, we spark discovery and inspire action to create profitable results. Valassis, the digital division of Vericast, is a leader in the marketing technology space, processing over 12 trillion calculations and analyzing more than 40 billion digital marketing opportunities daily. We influence the purchasing and transaction behavior of more than 120 million households, impacting where they eat, shop, buy, save and borrow. Follow Vericast on LinkedIn and Twitter to learn how Vericast drives commerce.

Job Description

Vericast is seeking a Principal Software Engineer to develop services, tools, bots, and workflows for our big data processing infrastructure.

The Big Data Platform Team owns and operates the world-class big data processing infrastructure that over a dozen engineering teams use to power their 24/7 technical marketing products. We own and operate a large Hadoop + Spark processing cloud on which we store 6 PB of data and run 30,000 jobs every day. We write a fabric of Java microservices and bots to manage all those jobs, extend Hadoop's functionality, and help us operate and optimize our investment. We write custom data streaming services that ingest and curate 100 TB of data each day, driving the entire advertising data ecosystem. We also participate in all our users' groundbreaking workflow projects so that we can stay abreast of our developers' tooling needs.

What you're like:

This position is perfect for a far-sighted engineer who always wants to be the first to apply cutting-edge technologies to solve complex business and engineering problems. You want to have a leading voice on our team and across our organization. You will work throughout the software lifecycle, including customer interaction and product planning, requirements analysis, architecture, directing our team of developers, development, testing, and operations. If you are energized by the thought of developing new system stacks and tools for big data processing and analytics, we want to talk to you. If you have worked on big data engineering, cloud migration, or infrastructure tooling projects, we want to talk to you. If you have ever worked on a collection of data workflows or a service mesh and thought "I could make this better," we want to talk to you!

What you'll do:

  • Work with our users, architects, and product leaders to architect and plan our data platforms
  • Design, develop, and maintain the software and systems that make up the data platform that runs our entire business
  • Partner with the Data Engineering and Data Science teams who use our platform to diagnose, predict and address scaling problems
  • Work on new product initiatives to provide design support and establish best practices
  • Contribute to our team’s growing set of development platforms, tools, processes, and products


Your skills:

  • Experience working on big data systems and technologies with emphasis on the Hadoop platform
  • General knowledge of design patterns and UML, with several years of experience leading architectural design and development
  • Proficiency in Java, Scala, or Python programming; exposure to microservices and Spark dataframe programming.
  • Proficiency in networking, Thrift, and the Spring Framework and/or Spring Boot for microservices is a plus
  • Understanding of RDBMS concepts and proficiency in DML, SQL, and PL/SQL a plus
  • Hands on experience with Spark; exposure to Kafka and YARN or similar technologies
  • Experience with migration of infrastructure from on-prem to cloud or vice versa is a big advantage
  • Curiosity to learn and apply new technologies and a background full of diverse design challenges
  • Excellent problem-solving abilities
  • Excellent verbal, graphical, and written communication skills
  • Experience with agile development methodologies

Qualifications

Your qualifications:

  • BS/MS in Computer Science or other technical discipline (with significant computer coursework)
  • 10+ recent years of professional software development experience using Java, Scala, or Python
  • 3+ recent years working with the Hadoop + Spark big data platform or similar

Additional Information

Salary: $175,000-185,000

Position is eligible for an annual bonus incentive program

The ultimate compensation offered for the position will depend upon several factors such as skill level, cost of living, experience, and responsibilities.

Vericast is an equal opportunity employer.   All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law.

Applicants who have disabilities may request that accommodations be made in order to complete the selection process by contacting our Recruiting Administrator.

#LI-LC1

#LI-Remote

 

Tags: Agile Architecture Big Data Computer Science Engineering Hadoop Kafka Microservices Python Scala Spark SQL Streaming Testing

Perks/benefits: Career development Salary bonus

Regions: Remote/Anywhere North America
Country: United States
