Deloitte Data Engineer in Madison, Wisconsin
Are you an experienced, passionate pioneer in technology? An industry solutions professional who wants to work in a collaborative environment? As an experienced Data Engineer, you will have the ability to share new ideas and collaborate on projects as a consultant without the extensive demands of travel. If so, consider an opportunity with Deloitte under our Project Delivery Talent Model. The Project Delivery Model (PDM) is a talent model tailored specifically for long-term, onsite client service delivery. PDM practitioners are local to project locations, minimizing extensive travel and providing you with a full career path within the firm.
Work you'll do/Responsibilities
The Analytics team is responsible for collecting, analyzing, and reporting on customer insights. From this data we generate insights into how customers interact with our products and use these insights to drive improvements to user-facing features.
We are looking for an extraordinary engineer to join the worldwide business development and strategy team. This is an opportunity to join a fast-paced team that plays a key role in the overall success of our organization through technology enablement. You'll play a critical part in driving our technology vision forward and ensuring that we execute across multiple initiatives.
As a part of the US Strategy & Analytics Offering Portfolio, the AI & Data Operations offering provides managed AI, Intelligent Automation, and Data DevOps services across the advise-implement-operate spectrum.
Bachelor's degree, preferably in Computer Science, Information Technology, Computer Engineering, or related IT discipline; or equivalent experience
Limited immigration sponsorship may be available
Ability to travel 10%, on average, based on the work you do and the clients and industries/sectors you serve
Lead ML/Data Scientist
• Advanced hands-on experience with Python and Spark (PySpark, Scala)
• Intermediate understanding of OLAP systems and data warehousing and data modeling concepts
• Advanced experience working on Big Data, BI or Analytics related projects as a technical lead and individual contributor
• Advanced knowledge of SQL/Hive/Trino
• Intermediate experience with various performance tuning techniques in Spark/Hive/Teradata
• Advanced experience with job schedulers, developing shell scripts, cron, and Airflow jobs to automate data workflows
• Experienced in leading a team of data engineers or individually developing technical solutions to business problems
• Experienced in leading a team of data engineers or individually implementing data integration requirements, developing pipelines that move data from raw to curation layers, including the cleansing, transformation, derivation, and aggregation of data
• Ability to communicate effectively (written and spoken).
• Ability to work with multi-location development teams and self-manage your own and others' work
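The raw-to-curation pipeline described above (cleansing, transformation, derivation, and aggregation) can be sketched as follows. This is an illustrative sketch only: in practice these stages would run on Spark (PySpark), but plain Python keeps the example self-contained, and the record fields ("region", "amount") and thresholds are hypothetical.

```python
# Sketch of a raw-to-curated data pipeline: cleanse -> transform -> derive -> aggregate.
# Field names and the high-value threshold are illustrative assumptions.
from collections import defaultdict

def cleanse(records):
    """Drop records with missing or malformed required fields."""
    return [
        r for r in records
        if r.get("region") and isinstance(r.get("amount"), (int, float))
    ]

def transform(records):
    """Normalize values (here: standardize region codes)."""
    return [{**r, "region": r["region"].strip().upper()} for r in records]

def derive(records):
    """Add derived columns (here: a high-value flag)."""
    return [{**r, "high_value": r["amount"] >= 1000} for r in records]

def aggregate(records):
    """Roll curated records up into per-region totals."""
    totals = defaultdict(float)
    for r in records:
        totals[r["region"]] += r["amount"]
    return dict(totals)

raw = [
    {"region": " us ", "amount": 1200},
    {"region": "EU", "amount": 300},
    {"region": None, "amount": 50},      # dropped during cleansing
    {"region": "us", "amount": "bad"},   # dropped during cleansing
]
curated = derive(transform(cleanse(raw)))
print(aggregate(curated))  # {'US': 1200.0, 'EU': 300.0}
```

In a Spark job the same stages would typically be expressed as DataFrame operations (filter, withColumn, groupBy/agg) scheduled by Airflow or cron, as the requirements above describe.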