Data Engineer (Snowflake/Microsoft Fabric), AI & Data, Technology Consulting at EY
Singapore 048583, Singapore
Full Time


Start Date

Immediate

Expiry Date

14 Jul, 25

Salary

0.0

Posted On

14 Apr, 25

Experience

0 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Data Warehousing, Business Intelligence, ETL, Presentation Skills

Industry

Information Technology/IT

Description

At EY, we develop you with future-focused skills and equip you with world-class experiences. We empower you in a flexible environment, and fuel you and your extraordinary talents in a diverse and inclusive culture of globally connected teams.
We work together across our full spectrum of services and skills powered by technology and AI, so that business, people and the planet can thrive together.
We’re all in, are you?
Join EY and shape your future with confidence.

ABOUT THE OPPORTUNITY

EY AI & Data is the data and advanced analytics capability within EY Asia-Pacific, with over 500 specialist employees working across multiple industry sectors. We implement information-driven strategies, data platforms and advanced analytics solutions that help grow, optimize and protect client organizations. We go beyond strategy to provide end-to-end design, build and implementation of real-life data environments, and we have some of the best architects, project managers, business analysts, data scientists, big data engineers, developers and consultants in the region. We are looking for Data Engineers with experience in Snowflake or Microsoft Fabric to join the AI & Data team in our Singapore office. This role is offered on a flexible full-time basis.

SKILLS AND ATTRIBUTES FOR SUCCESS

  • Experience in ETL, data engineering and scripting
  • Knowledge and experience in end-to-end project delivery, whether through traditional SDLC, agile or hybrid delivery methodologies
  • Experience in a delivery role on business intelligence, data warehousing, big data or analytics projects
  • Exceptional communication, documentation and presentation skills, with stakeholder management experience
  • Experience in business intelligence, data warehousing/platform, and data strategy projects

COMPANY DESCRIPTION

EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets.
Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow.
EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
All in to shape the future with confidence.

Responsibilities

YOUR KEY RESPONSIBILITIES

  • Design, develop, and maintain scalable and efficient data pipelines to extract, transform, and load (ETL) data from various sources into data warehouses and data lakes (a minimal illustrative sketch follows this list).
  • Collaborate with data scientists, data analysts, and business stakeholders to understand data requirements and deliver high-quality data solutions.
  • Implement data quality checks and monitoring to ensure data accuracy, consistency, and reliability.
  • Optimize data pipelines for performance, scalability, and cost-efficiency.
  • Develop and maintain data models, schemas, and metadata to support data analytics and reporting.
  • Work with either Snowflake or Microsoft Fabric.
  • Implement data security and privacy best practices to protect sensitive data.
  • Troubleshoot and resolve data-related issues in a timely manner.
  • Stay up-to-date with the latest trends and best practices in data engineering and big data technologies.
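Neither the posting nor EY prescribes a specific toolchain for these responsibilities. Purely as an illustration of the pipeline, data quality check and load steps listed above, here is a minimal Python sketch using pandas and the snowflake-connector-python package; the file path, table, warehouse, database and schema names are hypothetical placeholders, and credentials are assumed to be supplied via environment variables.

import os
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Extract: read a source file (hypothetical path).
df = pd.read_csv("sales_extract.csv")

# Transform: basic cleaning and typing.
df.columns = [c.strip().upper() for c in df.columns]
df = df.dropna(subset=["ORDER_ID"])               # drop rows missing the business key
df["ORDER_DATE"] = pd.to_datetime(df["ORDER_DATE"])

# Data quality check: fail fast if the key column is not unique.
if df["ORDER_ID"].duplicated().any():
    raise ValueError("Duplicate ORDER_ID values found; aborting load")

# Load: append the frame to a Snowflake table (all connection details are placeholders).
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",    # hypothetical warehouse
    database="ANALYTICS_DB",     # hypothetical database
    schema="STAGING",            # hypothetical schema
)
try:
    success, _, nrows, _ = write_pandas(conn, df, "SALES_ORDERS", auto_create_table=True)
    print(f"Loaded {nrows} rows, success={success}")
finally:
    conn.close()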

TO QUALIFY FOR THE ROLE, YOU MUST HAVE

  • At least 2 years' experience as a Data Engineer, including development and maintenance support
  • Data Migration experience in AWS, Azure, Hadoop, GCP environments
  • Experience in development and maintenance of data processing pipelines
  • Experience developing machine learning workflows
  • Experience using Snowflake or Microsoft Fabric (a Fabric-style PySpark sketch follows this list)
  • Ability to work closely with business analysts to create data components
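For the Microsoft Fabric (or any Spark-based) side of the stack, the same kind of pipeline is typically written in PySpark against a lakehouse. The sketch below is illustrative only: the file path and table name are hypothetical, and it assumes an environment where the Delta format is available, as it is by default in Fabric notebooks (where a spark session is already provided).

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# In a Fabric notebook a `spark` session already exists; this line keeps the
# sketch runnable in any Spark environment with Delta support configured.
spark = SparkSession.builder.getOrCreate()

# Extract: read raw files landed in the lakehouse (hypothetical path).
raw = (spark.read.format("csv")
       .option("header", "true")
       .load("Files/landing/sales/*.csv"))

# Transform: type the columns and drop rows missing the business key.
clean = (raw
         .withColumn("order_date", F.to_date("order_date"))
         .withColumn("amount", F.col("amount").cast("double"))
         .filter(F.col("order_id").isNotNull()))

# Simple data quality check before publishing.
dupes = clean.groupBy("order_id").count().filter("count > 1").count()
if dupes > 0:
    raise ValueError(f"{dupes} duplicate order_id values; aborting write")

# Load: publish as a managed Delta table (hypothetical table name).
clean.write.format("delta").mode("overwrite").saveAsTable("sales_orders")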