Deloitte Machine Learning Developer in Indianapolis, Indiana
Are you an experienced, passionate pioneer in technology - a solutions builder, a roll-up-your-sleeves technologist who wants a collaborative, think-tank environment where you can share new ideas with your colleagues daily - without the extensive demands of travel? If so, consider an opportunity with our US Delivery Center - we are breaking the mold of a typical Delivery Center.
Our US Delivery Centers have been growing since 2014 with significant, continued growth on the horizon. Interested? Read more about our opportunity below.
Analytics & Cognitive
In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment.
The Analytics & Cognitive team leverages the power of data, analytics, robotics, science, and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.
Analytics & Cognitive will work with our clients to:
• Implement large-scale data ecosystems including data management, governance, and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms.
• Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions.
• Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise, and providing As-a-Service offerings for continuous insights and improvements.
Work you'll do/Responsibilities
• Unify Machine Learning (ML) systems development and ML systems deployment to standardize and streamline the continuous delivery of high-performing models in production
• Use statistical and machine learning techniques to create scalable analytics solutions
• Develop end-to-end (Data/Dev/ML) Ops pipelines based on in-depth understanding of cloud platforms, the AI lifecycle, and business problems to ensure analytics solutions are delivered efficiently, predictably, and sustainably
• Facilitate the development of objectives by selecting and employing the appropriate SDLC methodologies and gathering business requirements relevant to ML
• Execute best practices in version control and continuous integration / continuous delivery
• Operationalize and monitor ML models using industry-standard tools and technologies
• Assist in the development and execution of an AI/Data governance framework, with a focus on ensuring that AI technologies are well researched and developed
• Help administer and work within an AI code of ethics to tackle issues such as privacy, discrimination, and data ethics, and promote responsible innovation
• Prototype and demonstrate solutions for clients in customer environments
• Explain model behavior/results to both technical and non-technical audiences
• Collaborate with data scientists, data engineers and other key stakeholders in a fast-paced cross-functional and diverse environment
• Stay current on new products and relevant industry trends in the field of ML
Qualifications
• 3+ years of IT experience; minimum 2 years of relevant experience delivering AI/ML projects
• Solid understanding of the ML lifecycle and concepts
• 3+ years of experience designing and implementing automated cloud solutions (building, containerizing, and deploying end-to-end automated data and ML pipelines)
• Experience with TensorFlow, PyTorch, and other deep learning frameworks
• Experience with version control tools such as Git
• Extensive experience working in an Agile development environment
• Fluency in Python, R, and other common ML Languages
• Fluency with both structured and unstructured data (SQL, NoSQL)
• Production experience with Apache Spark
• Hands-on experience with web APIs, CI/CD for ML, and Serverless Deployment
• Familiarity with Linux OS and Windows servers
• Knowledge of Docker, Jenkins, Kubernetes, and other DevOps tools
• Outstanding analytical and problem-solving skills
• Travel up to 20% (while travel is a requirement of the role, due to COVID-19, non-essential travel has been suspended until further notice)
• Must live within a commutable distance of one of the following cities: Atlanta, GA; Austin, TX; Boston, MA; Charlotte, NC; Chicago, IL; Cincinnati, OH; Cleveland, OH; Dallas, TX; Detroit, MI; Gilbert, AZ; Houston, TX; Indianapolis, IN; Kansas City, MO; Lake Mary, FL; Los Angeles, CA; Mechanicsburg, PA; Miami, FL; McLean, VA; Minneapolis, MN; Nashville, TN; Orange County, CA; Philadelphia, PA; Phoenix, AZ; Pittsburgh, PA; Rosslyn, VA; Sacramento, CA; St. Louis, MO; San Diego, CA; Seattle, WA; Tallahassee, FL; Tampa, FL; or be willing to relocate to one of the following USDC locations: Gilbert, AZ; Lake Mary, FL; Mechanicsburg, PA.
• Limited Immigration sponsorship may be available
Preferred
• Certification from any of the three major cloud platforms (AWS, Azure, or GCP) in Cloud Architecture, Engineering, DevOps, or ML
• Familiarity with Kubeflow or MLflow
• Experience with machine learning pipelines (Azure ML)
• Familiarity with current Natural Language Processing or Computer Vision algorithms
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability or protected veteran status, or any other legally protected basis, in accordance with applicable law.