Deloitte DevOps Engineer (AWS, Azure, GCP) in Gilbert, Arizona

Are you an experienced, passionate pioneer in technology - a solutions builder and roll-up-your-sleeves technologist who wants a collaborative, think-tank environment where you can share new ideas with your colleagues daily - without the extensive demands of travel? If so, consider an opportunity with our US Delivery Center - we are breaking the mold of a typical Delivery Center.

Our US Delivery Centers have been growing since 2014 with significant, continued growth on the horizon. Interested? Read more about our opportunity below ...

The Team

In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment.

The Analytics & Cognitive team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

Analytics & Cognitive will work with our clients to:

  • Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms

  • Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions

  • Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements

    Qualifications (Required)

  • 3+ years of experience developing end-to-end technical solutions involving data pipeline development (ingestion, transformation, and cleansing) in the data analytics space, collaborating with multi-domain business stakeholders.

  • 3+ years of experience orchestrating and automating data analytics pipelines, promoting features to production, and automating quality checks.

  • 3+ years working in agile methodologies, applying DevOps and, increasingly, DataOps principles to data pipelines to improve the communication, integration, reuse, and automation of data flows between data managers and consumers across projects.

  • 3+ years of strong experience working on a cloud platform such as AWS, GCP, or Azure

  • Good hands-on experience with common analytics and scripting languages such as Python and shell scripting

  • 3+ years of infrastructure automation in client environments using Terraform and/or deployment services provided by a cloud platform, such as AWS CodePipeline, CodeBuild, and CodeDeploy

  • 3+ years of working experience with source code repositories such as GitHub, and configuration management tools such as Ansible and Chef.

  • 3+ years of experience designing, building, testing, and maintaining automated deployment/continuous integration processes using tools such as Jenkins or Bamboo

  • Able to perform complex root-cause analysis of problems and subsequently make and implement recommendations to prevent future occurrences or customer impact.

  • Must live a commutable distance, or relocate, to one of the following cities: Atlanta, GA; Austin, TX; Charlotte, NC; Cincinnati, OH; Cleveland, OH; Detroit, MI; Gilbert, AZ; Indianapolis, IN; Kansas City, MO; Lake Mary, FL; Mechanicsburg, PA; Miami, FL; Nashville, TN; Phoenix, AZ; Pittsburgh, PA; St. Louis, MO; Tallahassee, FL; Tampa, FL

  • Travel up to 20% (While 20% of travel is a requirement of the role, due to COVID-19, non-essential travel has been suspended until further notice.)

  • Limited immigration sponsorship may be available.

    Tools Experience (Required)

  • Cloud - AWS, GCP, Azure

  • Source control/version control - Bitbucket, GitHub

  • CI/CD - Jenkins, Bamboo, GitLab, TeamCity, AWS CodePipeline, CodeBuild, CodeDeploy

  • Infrastructure as Code - Terraform, AWS CloudFormation, Azure Resource Manager

  • Configuration Management - Chef, Puppet, Ansible

  • Containers - Docker, Mesos, AWS ECS

  • Container Orchestration - Kubernetes (K8s), AWS EKS

    Preferred Qualifications

  • ELK stack (Elasticsearch, Logstash, Kibana), Fluentd, Fluent Bit, Filebeat

  • Working knowledge of containers (Docker) and orchestration tools (Kubernetes)

All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability or protected veteran status, or any other legally protected basis, in accordance with applicable law.