Data Engineer at EY
Toronto, ON M5H 0B3, Canada
Full Time


Start Date

Immediate

Expiry Date

21 Oct, 25

Salary

0.0

Posted On

21 Jul, 25

Experience

0 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Integration, Master Data Management, Ab Initio, Tableau, DataStage, Analytics, Regulatory Compliance, Architecture, Predictive Analytics, Design, Thinking Skills, Enterprise Reporting Solutions, MicroStrategy, Data Quality, SAS, Strategy, Data Warehouse, SPSS, Data Science

Industry

Information Technology/IT

Description

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

THE OPPORTUNITY

We are looking for a Data Engineer. Our AI & Data practice works collaboratively with clients to enhance their ability to use and interpret data and to develop enhanced information management capabilities of their own, supporting better decision making and regulatory compliance within their business. Our clients need the vision to articulate the big picture and the precision to see the smallest of details.

SKILLS AND ATTRIBUTES FOR SUCCESS

  • Strong analytical and creative thinking skills; must be a team player

Responsibilities

YOUR KEY RESPONSIBILITIES

As a Data Engineer, you will:

  • Lead the event stream analytics initiative and create the foundation for event data collection at EY Canada.
  • Design and build a highly scalable, responsive platform to collect data across all brands and devices (mobile apps, desktop …)
  • Work with teams across EY to drive adoption of the new platform

TO QUALIFY FOR THE ROLE YOU MUST HAVE

  • CS Degree or equivalent experience
  • Databricks
  • Data Platform
  • Data Engineering
  • 5-10 years of experience building and shipping highly scalable clickstream data pipelines and analytics systems on distributed data systems and cloud platforms (AWS/Azure/GCP)
  • Experience with Java, Scala, Python, etc.
  • Solid experience with streaming technologies such as Kafka or Spark Streaming.
  • Experience building batch, real-time, and streaming analytics pipelines with data from event data streams, NoSQL stores, and APIs.
  • Experience with batch and stream processing technologies such as Spark, Kafka Streams (KStreams), Samza, etc. (a minimal illustration follows this list)
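To give a sense of the kind of clickstream pipeline work these qualifications describe, the sketch below shows a minimal PySpark Structured Streaming job that reads click events from a Kafka topic and counts them per device over time windows. The broker address, topic name, and event schema are hypothetical placeholders, not details from this posting.

from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col, window
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

# Requires the Spark-Kafka connector on the classpath, e.g. via
# spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.12:<spark version>
spark = SparkSession.builder.appName("clickstream-demo").getOrCreate()

# Hypothetical event schema used only for illustration
schema = StructType([
    StructField("event_id", StringType()),
    StructField("device", StringType()),
    StructField("event_time", TimestampType()),
])

# Read a Kafka topic as a streaming source (broker and topic are placeholders)
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "localhost:9092")
          .option("subscribe", "clickstream")
          .load()
          .select(from_json(col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

# Count events per device over 5-minute windows, tolerating late data up to 10 minutes
counts = (events
          .withWatermark("event_time", "10 minutes")
          .groupBy(window(col("event_time"), "5 minutes"), col("device"))
          .count())

# Emit incremental results to the console; a real pipeline would typically
# write to a Delta table, warehouse, or downstream topic instead
query = (counts.writeStream
         .outputMode("update")
         .format("console")
         .start())

query.awaitTermination()

This only illustrates the shape of such a pipeline; production systems of the kind described here would add schema management, monitoring, and durable sinks.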