Staff Engineer - DevOps & Release - Big Data - Federal

Santa Clara, California, United States

Applications have closed

ServiceNow

ServiceNow allows employees to work the way they want to, not how software dictates they have to. And customers can get what they need, when they need it.


Company Description

At ServiceNow, our technology makes the world work for everyone, and our people make it possible. We move fast because the world can’t wait, and we innovate in ways no one else can for our customers and communities. By joining ServiceNow, you are part of an ambitious team of change makers who have a restless curiosity and a drive for ingenuity. We know that your best work happens when you live your best life and share your unique talents, so we do everything we can to make that possible. We dream big together, supporting each other to make our individual and collective dreams come true. The future is ours, and it starts with you.

With more than 7,400 customers, we serve approximately 80% of the Fortune 500, and we're proud to be one of FORTUNE's 100 Best Companies to Work For® and World's Most Admired Companies® 2022.

Learn more on the Life at Now blog and hear from our employees about their experiences working at ServiceNow.

Unsure if you meet all the qualifications of a job description but are deeply excited about the role? We still encourage you to apply! At ServiceNow, we are committed to creating an inclusive environment where all voices are heard, valued, and respected. We welcome all candidates, including individuals from non-traditional, varied backgrounds, who might not come from a typical path connected to this role. We believe skills and experience are transferable, and the desire to dream big makes for great candidates.

Job Description

Please Note: This position will include supporting our US Federal customers.

This position requires passing a ServiceNow background screening, USFedPASS (US Federal Personnel Authorization Screening Standards). This includes a credit check, a criminal/misdemeanor check, and a drug test. Any employment is contingent upon passing the screening. Due to Federal requirements, only US citizens, US naturalized citizens, or US Permanent Residents holding a green card will be considered.
 

The Big Data Team plays a critical and strategic role in ensuring that ServiceNow can exceed the availability and performance SLAs of the ServiceNow Platform-powered customer instances deployed across the ServiceNow cloud and Azure cloud. Our mission is to:

Deliver state-of-the-art Monitoring, Analytics, and Actionable Business Insights by employing new tools, Big Data systems, an Enterprise Data Lake, AI, and Machine Learning methodologies that improve efficiencies across a variety of functions in the company (Cloud Operations, Customer Support, Product Usage Analytics, and Product Upsell Opportunities), enabling a significant impact on both top-line and bottom-line growth. The Big Data team is responsible for:

  • Collecting, storing, and providing real-time access to large amounts of data
  • Providing real-time analytics tools and reporting capabilities for various functions, including:
    • Monitoring, alerting, and troubleshooting
    • Machine Learning, Anomaly detection and Prediction of P1s
    • Capacity planning
    • Data analytics and deriving Actionable Business Insights

 

What you get to do in this role:

  • Building, deploying, maintaining, and managing Big Data applications based on established best practices, and ensuring availability, performance, scalability, and security of the Big Data systems. This is achieved using Software Configuration Management (SCM) and build tools such as Git, GitLab, Nexus, Maven, Grunt, and Node.js.
  • Establishing Continuous Integration and Continuous Deployment (CI/CD) pipelines for applications using tools such as Jenkins, Docker, and Ansible.
  • Providing production support to resolve critical build and release issues and to mitigate or minimize any impact on Big Data applications. Supporting Development, QA, and Systems Engineering teams in replicating complex issues, leveraging experience with build tools, Jenkins, CI/CD, Ansible, etc.
  • Enhancing build and release tools with new requirements from different stakeholders, ensuring that the technical specifications meet business needs. This involves automating the build, release, deployment, and configuration management of non-prod and production environments using RPMs and scripting languages like Perl, Python, Bash, and Groovy (a minimal, illustrative Python sketch follows this list).
  • Supporting the Cloudera-based production Big Data analytics platform, which includes resolving incident tickets created by Site Reliability Engineers (SREs).
  • Performance tuning and troubleshooting of various Hadoop components and other data analytics tools in the environment: HDFS, YARN, Hive, HBase, Spark, Kafka, RabbitMQ, Impala, Kudu, Redis, Kerberos, Tableau, Grafana, and Prometheus.
  • Enforcing data governance policies in Commercial and Regulated Big Data environments.
  • Performing production monitoring and support for Big Data infrastructure and Big Data applications in the ServiceNow cloud and Azure cloud.
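
A minimal sketch of the kind of build-and-release automation described above, assuming a Maven-built application packaged as a Docker image and rolled out with Ansible. The registry name, inventory path, playbook name, and image name are hypothetical placeholders, not part of the posting:

#!/usr/bin/env python3
"""Illustrative release step: build an artifact, package it into a Docker
image, and trigger an Ansible deployment. Names and paths are hypothetical."""

import subprocess
import sys


def run(cmd: list[str]) -> None:
    """Run a shell command and fail the release step on a non-zero exit code."""
    print(f"+ {' '.join(cmd)}")
    result = subprocess.run(cmd)
    if result.returncode != 0:
        sys.exit(f"step failed: {' '.join(cmd)}")


def main() -> None:
    version = sys.argv[1] if len(sys.argv) > 1 else "0.1.0-SNAPSHOT"

    # Build and unit-test the application with Maven (batch mode for CI).
    run(["mvn", "-B", "clean", "package"])

    # Package the build output into a versioned Docker image (hypothetical registry/repo).
    image = f"registry.example.com/bigdata/analytics-app:{version}"
    run(["docker", "build", "-t", image, "."])
    run(["docker", "push", image])

    # Roll the new image out to a non-prod environment with an Ansible playbook
    # (hypothetical inventory and playbook names).
    run(["ansible-playbook", "-i", "inventories/nonprod", "deploy.yml",
         "-e", f"app_image={image}"])


if __name__ == "__main__":
    main()

In practice a step like this would typically run inside a Jenkins or GitLab CI job rather than by hand, with the version number supplied by the pipeline.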

 

Qualifications

To be successful in this role you have:

  • 6+ years of overall experience, with at least 3+ years as a Big Data DevOps / Release Engineer
  • Expert-level experience with Software Configuration Management (SCM) and build tools such as Git, GitLab, Nexus, Maven, Grunt, and Node.js
  • Demonstrated experience with many of the following: Jenkins, Ansible, Terraform, Puppet, Docker, Kubernetes orchestration, and similar technologies
  • Demonstrated experience automating build, release, and configuration management of non-prod and production environments using RPMs and scripting languages like Perl, Python, Bash, and Groovy
  • Good knowledge of querying and analyzing large amounts of data on Hadoop HDFS using Hive and Spark Streaming (see the illustrative sketch after this list), and of working with systems such as HDFS, YARN, Hive, HBase, Spark, Kafka, RabbitMQ, Impala, Kudu, Redis, Tableau, Grafana, and Prometheus
  • Experience securing the Hadoop stack with Sentry, Ranger, LDAP, and Kerberos KDC
  • Experience supporting CI/CD pipelines on Cloudera in native cloud and Azure/AWS environments
  • In-depth knowledge of Linux internals (CentOS 7.x) and shell scripting
  • Ability to learn quickly in a fast-paced, dynamic team environment
  • BS degree in Computer Science or equivalent experience
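
As an illustration of the Spark Streaming experience called for above, here is a minimal PySpark Structured Streaming sketch that reads a Kafka topic and lands it on HDFS as Parquet. The broker addresses, topic name, and HDFS paths are hypothetical, and the job assumes the spark-sql-kafka connector package is available on the Spark classpath:

from pyspark.sql import SparkSession
from pyspark.sql.functions import col

# Hypothetical job: stream platform events from Kafka into HDFS as Parquet.
spark = (SparkSession.builder
         .appName("platform-events-ingest")
         .getOrCreate())

# Read the topic as an unbounded streaming DataFrame (broker/topic names are placeholders).
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker-1:9092,broker-2:9092")
          .option("subscribe", "platform-events")
          .option("startingOffsets", "latest")
          .load())

# Kafka delivers binary key/value columns; cast the payload to string for downstream parsing.
payload = events.select(col("value").cast("string").alias("payload"),
                        col("timestamp"))

# Append micro-batches to HDFS, with a checkpoint directory for recovery bookkeeping.
query = (payload.writeStream
         .format("parquet")
         .option("path", "hdfs:///data/platform_events")
         .option("checkpointLocation", "hdfs:///checkpoints/platform_events")
         .outputMode("append")
         .start())

query.awaitTermination()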

     

JB0023213

Additional Information

ServiceNow is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, creed, religion, sex, sexual orientation, national origin or nationality, ancestry, age, disability, gender identity or expression, marital status, veteran status or any other category protected by law.

At ServiceNow, we lead with flexibility and trust in our distributed world of work. Click here to learn about our work personas: flexible, remote and required-in-office.

If you require a reasonable accommodation to complete any part of the application process, or are limited in the ability or unable to access or use this online application process and need an alternative method for applying, you may contact us at talent.acquisition@servicenow.com for assistance.

For positions requiring access to technical data subject to export control regulations, including Export Administration Regulations (EAR), ServiceNow may have to obtain export licensing approval from the U.S. Government for certain individuals. All employment is contingent upon ServiceNow obtaining any export license or other approval that may be required by the U.S. Government.

Please Note: Fraudulent job postings/job scams are increasingly common. Click here to learn what to watch out for and how to protect yourself. All genuine ServiceNow job postings can be found through the ServiceNow Careers site.

Tags: Ansible AWS Azure Big Data CI/CD Computer Science Data Analytics Data governance DevOps Docker Engineering Git GitLab Grafana Hadoop HBase HDFS Kafka Kubernetes Linux Machine Learning Maven Node.js Perl Pipelines Python RabbitMQ Security Shell scripting Spark Streaming Tableau Terraform

Perks/benefits: Career development Flex hours

Region: North America
Country: United States