Deloitte Sr. Big Data Consultant - Hadoop in Gilbert, Arizona

Are you an experienced, passionate pioneer in technology - a solutions builder, a roll-up-your-sleeves technologist who wants a daily collaborative, think-tank environment where you can share new ideas with your colleagues - without the extensive demands of travel? If so, consider an opportunity with our US Delivery Center - we are breaking the mold of a typical Delivery Center.

Our US Delivery Centers have been growing since 2014 with significant, continued growth on the horizon. Interested? Read more about our opportunity below ...

Responsibilities

Function as an integrator between business needs and technology solutions, helping to create technology solutions that meet clients' requirements. Be responsible for developing and testing solutions that align with clients' systems strategy, requirements, and design, as well as supporting system implementation. Manage the data pipeline process from acquisition to ingestion, storage, and provisioning of data to the point of impact by modernizing and enabling new capabilities. Facilitate data integration on traditional and Hadoop environments by assessing clients' enterprise IT environments. Guide clients to a future IT environment state that supports their long-term business goals. Enhance business drivers through enterprise-scale applications that enable visualization, consumption, and monetization of both structured and unstructured data.
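As an illustrative sketch only (not part of the posting itself), an acquisition-to-provisioning pipeline of the kind described above might look like the following minimal PySpark job; the paths, table names, and columns are hypothetical assumptions.

    # Minimal, hypothetical PySpark sketch of an acquisition -> ingestion -> storage -> provisioning flow.
    # All paths, table names, and columns below are illustrative assumptions, not from the job posting.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("ingest-transactions").getOrCreate()

    # Acquisition: read raw, semi-structured source data from a hypothetical landing zone.
    raw = spark.read.json("hdfs:///landing/transactions/")

    # Ingestion: light cleansing plus a load timestamp.
    clean = (
        raw.dropDuplicates(["transaction_id"])
           .withColumn("load_ts", F.current_timestamp())
    )

    # Storage: persist to a partitioned Hive table for downstream consumers.
    (clean.write
          .mode("append")
          .partitionBy("transaction_date")
          .format("parquet")
          .saveAsTable("analytics.transactions"))

    # Provisioning: expose an aggregated view at the point of impact.
    daily = clean.groupBy("transaction_date").agg(F.sum("amount").alias("total_amount"))
    daily.write.mode("overwrite").saveAsTable("analytics.daily_transaction_totals")

    spark.stop()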

The Team

Analytics & Cognitive

In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment.

The Analytics & Cognitive team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

Analytics & Cognitive will work with our clients to:

  • Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms

  • Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions

  • Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements

Qualifications

Required

  • 5+ years' experience working with the Big Data ecosystem, including tools such as Hadoop, Spark, MapReduce, Sqoop, HBase, Hive, and Impala

    Strong technical expertise in most of the following:

  • Hadoop (Cloudera distribution)

  • Spark with Scala or Python programming

  • Experience building microservices using Java

  • Hive tuning, bucketing, partitioning, UDFs, and UDAFs

  • NoSQL databases such as HBase, MongoDB, or Cassandra

  • Experience and knowledge working with Kafka, Spark Streaming, Sqoop, Oozie, Airflow, and Control-M

  • Experience working in the financial/insurance domain

  • 7+ years of professional work experience

  • Strong technical skills including understanding of software development principles

  • Hands-on programming experience

  • Must be eligible to obtain a US Government security clearance

  • Must be legally authorized to work in the United States without sponsorship, now or any time in the future

  • Must live a commutable distance to one of the following cities: Atlanta, GA; Austin, TX; Boston, MA; Charlotte, NC; Chicago, IL; Cincinnati, OH; Cleveland, OH; Dallas, TX; Detroit, MI; Gilbert, AZ; Houston, TX; Indianapolis, IN; Kansas City, MO; Lake Mary, FL; Los Angeles, CA; Mechanicsburg, PA; Miami, FL; McLean, VA; Minneapolis, MN; Nashville, TN; Orange County, CA; Philadelphia, PA; Phoenix, AZ; Pittsburgh, PA; Rosslyn, VA; Sacramento, CA; St. Louis, MO; San Diego, CA; Seattle, WA; Tallahassee, FL; Tampa, FL; or be willing to relocate to one of the following USDC locations: Gilbert, AZ; Lake Mary, FL; Mechanicsburg, PA.

  • Ability to travel up to 15% (While 15% of travel is a requirement of the role, due to COVID-19, non-essential travel has been suspended until further notice.)

Preferred

  • Proficiency in one or more modern programming languages like Python or Scala

  • Experience with data lake and data hub implementations

  • Knowledge of AWS or Azure platforms

  • Knowledge of techniques for designing Hadoop-based file layouts optimized to meet business needs

  • Ability to translate business requirements into logical and physical file structure designs

  • Ability to build and test solutions in an agile delivery manner

  • Ability to articulate reasons behind the design choices being made

  • Bachelor of Science in Computer Science, Engineering, or MIS, or equivalent experience

  • Any Big Data certification is a plus

All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability or protected veteran status, or any other legally protected basis, in accordance with applicable law.
