Senior DevOps Engineer - Big Data - Federal

Chicago, Illinois, United States

Applications have closed

ServiceNow

ServiceNow allows employees to work the way they want to, not how software dictates they have to. And customers can get what they need, when they need it.


Company Description

At ServiceNow, our technology makes the world work for everyone, and our people make it possible. We move fast because the world can’t wait, and we innovate in ways no one else can for our customers and communities. By joining ServiceNow, you are part of an ambitious team of change makers who have a restless curiosity and a drive for ingenuity. We know that your best work happens when you live your best life and share your unique talents, so we do everything we can to make that possible. We dream big together, supporting each other to make our individual and collective dreams come true. The future is ours, and it starts with you. 

With more than 7,400 customers, we serve approximately 80% of the Fortune 500, and we're proud to be one of FORTUNE's 100 Best Companies to Work For® and World's Most Admired Companies® 2022.

Learn more on the Life at Now blog and hear from our employees about their experiences working at ServiceNow.

Unsure if you meet all the qualifications of a job description but are deeply excited about the role? We still encourage you to apply! At ServiceNow, we are committed to creating an inclusive environment where all voices are heard, valued, and respected. We welcome all candidates, including individuals from non-traditional, varied backgrounds who might not come from a typical path connected to this role. We believe skills and experience are transferable, and the desire to dream big makes for great candidates.

Job Description

Please Note: This position will include supporting our US Federal customers.
This position requires passing a ServiceNow background screening, USFedPASS (US Federal Personnel Authorization Screening Standards). This includes a credit check, a criminal/misdemeanor check, and a drug test. Any employment is contingent upon passing the screening. Due to Federal requirements, only US citizens, US naturalized citizens, or US Permanent Residents holding a green card will be considered.

As a Senior DevOps Engineer on our Big Data Federal team, you will help deliver 24x7 support for our Government Cloud infrastructure. The Federal Big Data team runs three shifts to provide 24x7 production support for our Big Data Government Cloud infrastructure.

Below are some highlights.

  • 4 Day work week (either Sunday to Wednesday or Wednesday to Saturday)
  • No on-call rotation
  • Shift Bonuses for 2nd and 3rd shifts
  • This is a 2nd Shift position with work hours from 3 pm - 1 am Pacific Time

The Big Data team plays a critical and strategic role in ensuring that ServiceNow can exceed the availability and performance SLAs of customer instances powered by the ServiceNow Platform, deployed across the ServiceNow cloud and the Azure cloud. Our mission is to:

Deliver state-of-the-art monitoring, analytics, and actionable business insights by employing new tools, Big Data systems, an Enterprise Data Lake, AI, and machine learning methodologies that improve efficiency across a variety of functions in the company (Cloud Operations, Customer Support, product usage analytics, and product upsell opportunities), enabling a significant impact on both top-line and bottom-line growth. The Big Data team is responsible for:

  • Collecting, storing, and providing real-time access to large amounts of data
  • Providing real-time analytic tools and reporting capabilities for various functions, including:
    • Monitoring, alerting, and troubleshooting
    • Machine learning, anomaly detection, and prediction of P1 incidents
    • Capacity planning
    • Data analytics and deriving actionable business insights

What you get to do in this role

  • Deploy, monitor, maintain, and support Big Data infrastructure and applications in ServiceNow Cloud and Azure environments.
  • Architect and drive end-to-end Big Data deployment automation, from vision through delivery, covering foundational modules (Cloudera CDP), prerequisite components, and applications, leveraging Ansible, Puppet, Terraform, Jenkins, Docker, and Kubernetes across all ServiceNow environments.
  • Automate Continuous Integration / Continuous Deployment (CI/CD) data pipelines for applications leveraging tools such as Jenkins, Ansible, and Docker.
  • Performance-tune and troubleshoot the various Hadoop components and other data analytics tools in the environment: HDFS, YARN, Hive, HBase, Spark, Kafka, RabbitMQ, Impala, Kudu, Redis, Hue, Kerberos, Tableau, Grafana, MariaDB, and Prometheus.
  • Provide production support to resolve critical Big Data pipeline and application issues, mitigating or minimizing any impact on Big Data applications. Collaborate closely with Site Reliability Engineering (SRE), Customer Support (CS), Development, QA, and Systems Engineering teams to replicate complex issues, leveraging broad experience with UI, SQL, full-stack, and Big Data technologies.
  • Enforce data governance policies in Commercial and Regulated Big Data environments.

Qualifications

To be successful in this role you have:

  • 4+ years of overall experience, with at least 2+ years as a Big Data DevOps / Deployment Engineer
  • Demonstrated expert-level experience delivering end-to-end deployment automation leveraging Puppet, Ansible, Terraform, Jenkins, Docker, Kubernetes, or similar technologies
  • Deep understanding of the Hadoop/Big Data ecosystem; good knowledge of querying and analyzing large amounts of data on Hadoop HDFS using Hive and Spark Streaming, and of working with systems such as HDFS, YARN, Hive, HBase, Spark, Kafka, RabbitMQ, Impala, Kudu, Redis, Hue, Tableau, Grafana, MariaDB, and Prometheus
  • Experience securing the Hadoop stack with Sentry, Ranger, LDAP, and Kerberos KDC
  • Experience supporting CI/CD pipelines on Cloudera in native cloud and Azure/AWS environments
  • Good knowledge of Perl, Python, Bash, Groovy, and Java
  • In-depth knowledge of Linux internals (CentOS 7.x) and shell scripting
  • Ability to learn quickly in a fast-paced, dynamic team environment

Additional Information

ServiceNow is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, creed, religion, sex, sexual orientation, national origin or nationality, ancestry, age, disability, gender identity or expression, marital status, veteran status or any other category protected by law.

At ServiceNow, we lead with flexibility and trust in our distributed world of work. Click here to learn about our work personas: flexible, remote and required-in-office.

If you require a reasonable accommodation to complete any part of the application process, or are limited in the ability or unable to access or use this online application process and need an alternative method for applying, you may contact us at talent.acquisition@servicenow.com for assistance.

For positions requiring access to technical data subject to export control regulations, including Export Administration Regulations (EAR), ServiceNow may have to obtain export licensing approval from the U.S. Government for certain individuals. All employment is contingent upon ServiceNow obtaining any export license or other approval that may be required by the U.S. Government.

Please Note: Fraudulent job postings/job scams are increasingly common. Click here to learn what to watch out for and how to protect yourself. All genuine ServiceNow job postings can be found through the ServiceNow Careers site.


From Fortune. © 2022 Fortune Media IP Limited All rights reserved. Used under license.

Fortune and Fortune Media IP Limited are not affiliated with, and do not endorse products or services of, ServiceNow.


Tags: Ansible AWS Azure Big Data CI/CD Data Analytics Data governance Data pipelines DevOps Docker Engineering Grafana Hadoop HBase HDFS Java Kafka Kubernetes Linux Machine Learning MariaDB Perl Pipelines Python RabbitMQ Shell scripting Spark SQL Streaming Tableau Terraform

Perks/benefits: Career development Flex hours

Region: North America
Country: United States
