Technology Analyst - Azure, Snowflake, Python, ETL at Infosys
Burnaby, BC, Canada
Full Time


Start Date

Immediate

Expiry Date

23 Nov, 2025

Salary

75,000

Posted On

23 Aug, 2025

Experience

2 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

UNIX, Dimensional Modeling, Data Vault, Data Streaming, Data Modeling, Python, Snowflake, Time Travel, SQL, Hadoop, Hive, Data Warehouse Architecture, Data Warehousing, Communication Skills

Industry

Information Technology/IT

Description

Infosys is seeking a Technology Analyst - Azure, Snowflake, Python, ETL Data Engineer.
In this role, you will design, develop, and maintain data processing systems using Azure Data Lake Storage, Azure Synapse Analytics, Snowflake, Python, and SQL. You’ll collaborate with Product Owners, Data engineers, Analysts, and other stakeholders to understand requirements and deliver solutions in an entrepreneurial culture where teamwork is encouraged, excellence is rewarded, and diversity is valued.
The location for this position is Burnaby, BC, Canada.

Basic Qualifications:

  • Candidate must be located within commuting distance of Burnaby, BC or be willing to relocate to the area. This position may require travel.
  • Bachelor’s degree or foreign equivalent required from an accredited institution. Infosys will also consider three years of progressive experience in the specialty in lieu of every year of education.
  • At least 2 years of Information Technology experience.
  • Candidates authorized to work for any employer in Canada without employer-based visa sponsorship are welcome to apply. Infosys is unable to provide immigration sponsorship for this role at this time.
  • Excellent English communication skills.

Mandatory Skillsets:

  • 2-4 years of experience with Python and SQL.
  • 2-4 years designing and developing large-scale data processing pipelines using Azure Databricks and related technologies.
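The Python and SQL work described above typically means normalizing raw records and loading them into a queryable store. As a minimal, hypothetical sketch (the table name, columns, and values below are illustrative, not from this posting; SQLite stands in for a warehouse such as Snowflake):

```python
import sqlite3

# Illustrative raw records; in a real pipeline these would come from a source
# system such as a data lake or an event stream.
rows = [
    {"id": 1, "amount": "19.99", "region": "bc"},
    {"id": 2, "amount": "5.00", "region": "ab"},
]

def transform(row):
    # Normalize types and casing before loading.
    return (row["id"], float(row["amount"]), row["region"].upper())

# Load into an in-memory SQL table and run a simple aggregate.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, amount REAL, region TEXT)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)",
                 [transform(r) for r in rows])
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(round(total, 2))
```

In production the same extract-transform-load shape would run inside an orchestrated pipeline (e.g. Azure Databricks or Azure Data Factory) rather than a single script.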

Good to have:

  • Expertise in Azure Data Lake Storage, Azure Synapse Analytics, and Snowflake.
  • Experience with Azure Data Factory, Azure Functions, and Azure Logic Apps.
  • Familiarity with Azure Data Factory for orchestration and Event Hubs/Service Bus for data streaming.
  • Strong knowledge of modern data warehouse architecture and data modeling.
  • Experience with Snowflake features such as Time Travel, Zero-Copy Cloning, and Snowpipe.
  • Experience with Hadoop, HDFS, Hive and other Big Data technologies.
  • Understanding of dimensional modeling, data vault, and other data warehousing methodologies.
  • UNIX shell scripting will be an added advantage in scheduling/running application jobs.

Other Experience:

  • 2-3 years in project development lifecycle activities and maintenance/support.
  • Experience working in Agile environments.
  • Ability to translate requirements into technical solutions meeting quality standards.
  • Collaboration skills in diverse environments to identify and resolve data issues.
  • Strong problem-solving and analytical abilities.
  • Experience in global delivery environments.
  • Commitment to staying current with industry trends in modern data warehousing.
Responsibilities

Please refer to the job description above for details.
