Deloitte Google Cloud Senior Data Engineer in Chicago, Illinois

Google Cloud Sr. Data Engineer-Specialist Master

Deloitte is a long-standing leader in the industry's multi-year journey to cloud. Our clients have turned the corner and are engaging Deloitte for assistance in exploiting cloud at the highest level. We are looking for cloud veterans who have experience designing, developing, and deploying high-value, high-volume cloud-native applications on hyperscaler cloud platforms. We are looking for experienced architects and engineers who can apply the latest in advanced analytics and AI/ML technology to help our clients realize the potential of big data and automation. We are looking for experts in scalable network and security architectures for regulated industries and a cyber-focused world.

Work you'll do

In partnership with other leaders in the firm, you will focus on developing and delivering leading-edge cloud solutions using Google Cloud and, as a key member of our engineering practice, solve tough problems and ensure success in designing and building complex, world-class applications on public cloud for our clients.

  • Enable our clients to tackle tough problems and innovate at scale through hands-on-the-keyboard work architecting and engineering high-QoS solutions.

  • Lead by example, managing complex engagements while supporting others at the same time.

  • Leverage our deep relationships and partner programs to maintain leading-edge skills - participate in trusted tester programs and alpha/beta releases.

  • Assist in developing our community of practice by sharing your knowledge and experience with your peers and apprentices.

  • Collaborate with our partners to build repeatable cloud-native solutions that accelerate our clients' path to value.

  • Author or otherwise contribute to public cloud customer-facing publications such as podcasts, blogs, and whitepapers.

Qualifications:

Required:

  • 3+ years of experience leading teams

  • 5+ years of data architecture experience, including schema design and query optimization

  • 5+ years of experience with data cleansing tools, methods and practices

  • 5+ years of experience building data pipelines in traditional information systems environments

  • 5+ years of experience with Python, Java, and/or Spark

  • 5+ years of experience with relational and NoSQL database technologies

  • Ability to travel 0-50% on average, based on the work you do and the clients and industries/sectors you serve

Preferred:

  • Industry-specific knowledge (healthcare, finance, etc.)

  • Experience with Google Dataflow/Apache Beam

  • Cloud experience and/or certifications; GCP preferred

Sponsorship:

Limited immigration sponsorship may be available

All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability or protected veteran status, or any other legally protected basis, in accordance with applicable law.
