BigData Multi-Cloud DevOps Platform Engineer
Hyderabad, India
Experian
Company Description
Experian is the world’s leading global information services company. During life’s big moments — from buying a home or a car to sending a child to college to growing a business by connecting with new customers — we empower consumers and our clients to manage their data with confidence. We help individuals to take financial control and access financial services, businesses to make smarter decisions and thrive, lenders to lend more responsibly, and organizations to prevent identity fraud and crime.
We have 17,800 people operating across 44 countries, and every day we’re investing in new technologies, talented people and innovation to help all our clients maximize every opportunity. We are listed on the London Stock Exchange (EXPN) and are a constituent of the FTSE 100 Index.
Learn more at www.experianplc.com, or visit our global content hub (our global news blog) for the latest news and insights from the Group.
Job Description
· Deliver innovative CI/CD solutions using a cutting-edge technology stack.
· Automate the deployment, build, and configuration of infrastructure and Big Data technologies using DevOps tools.
· Design and configure monitoring systems (e.g. ELK, Prometheus, Cloudera Manager, Splunk, Dynatrace).
· Manage application containerization using Docker and Kubernetes.
· Apply expert knowledge of delivering cloud solutions on AWS, Azure, or Oracle Cloud.
· Implement and administer Big Data infrastructure on an ongoing basis.
· Lead the thinking on automating repetitive tasks (release, monitoring) to enable a better engineering experience, and guide those efforts across the team.
· Align with the systems engineering team to propose and deploy new hardware and software environments required for Big Data workloads.
· Collaborate with multiple teams to perform updates, patches, and version upgrades when required.
· Apply general operational expertise: strong troubleshooting skills and an understanding of system capacity, bottlenecks, and the basics of memory, CPU, OS, storage, and networking.
· Demonstrate a solid understanding of on-premises and cloud network architectures.
· Configure and support API and open-source integrations.
· Develop solutions in a DevOps environment using Terraform, Ansible, Chef, etc.
· Help shape enterprise solutions that improve serviceability, monitoring, and alerting to meet growing business requirements and demands.
· Track the latest release versions and keep running workloads consistently updated across the stack.
· Work with an agile team to develop, test, and maintain APIs.
· Assist in gathering and documenting user requirements, developing user stories, and preparing estimates.
· Prepare reports, manuals, and other documentation on the status, operation, and maintenance of software.
· Collaborate and communicate with all levels of technical and senior business management.
· Provide 24x7 on-call support for production systems on a rotation basis with other team members.
· Proactively evaluate evolving technologies and recommend solutions to business problems.
Qualifications
· Typically requires a bachelor's degree in Computer Science or a related field, or equivalent experience.
· 3+ years of Linux (Red Hat) system administration experience.
· 3+ years of experience as a DevOps engineer.
· Cloud platforms (IaaS/PaaS): AWS, Azure, Oracle Cloud, GCP, VMware.
· Strong hands-on scripting skills in Bash/Shell and Python (mandatory).
· Experience with DevOps and Continuous Integration tools and technologies, including Docker, Ansible, Bamboo, Jenkins, and Terraform.
· Experience with development tools and applications (Eclipse, IntelliJ, JIRA, Confluence, Bitbucket, Git, and Artifactory).
· Experience with Agile software development and DevOps methodologies, tools, and practices.
· Awareness of operational support needs, with strong documentation and automation skills.
· Strong experience with DevOps tools such as Ansible, and with automating deployments and monitoring/alerting tasks using Terraform.
· Experience working with open-source products.
· Strong problem-solving and creative-thinking skills.
· Effective oral and written communication skills.
· Experience working with geographically distributed teams.
· Bachelor's or master's degree in Computer Science, or equivalent experience.
· Knowledge and understanding of the business strategy and the use of back-office applications.
· Ability to adapt to a multilingual, multicultural environment; additional language skills are a bonus.
· Ability to handle conflicting priorities.
· Ability to learn.
· Adaptability.
· Receptive to change.
· Ability to communicate with business users at all levels.
· Analytical skills.
· Self-motivated and proactive.
Additional Information
Experian Careers - Creating a better tomorrow together
Find out what it's like to work for Experian by clicking here