Start Date: Immediate
Expiry Date: 05 May, 25
Salary: 50.67
Posted On: 05 Feb, 25
Experience: 5 year(s) or above
Remote Job: No
Telecommute: No
Sponsor Visa: No
Skills: Glue, Airflow, Communication Skills, Java, Architecture, Data Modeling, Performance Tuning, French, Python, Snowflake, Scala, Data Engineering
Industry: Information Technology/IT
DURATION: 12 MONTHS WITH POSSIBLE EXTENSION
We are seeking an experienced Senior Data Engineer to design, implement, and optimize scalable data solutions that support our business goals. In this role, you will oversee the development of our cloud data warehouse, lead data pipeline improvements, and establish data architecture standards. This is a unique opportunity to work with a talented team on cutting-edge technologies.
QUALIFICATIONS
Minimum of 5 years of experience in data engineering, with a strong focus on data pipelines, architecture, and cloud-based solutions.
Expertise in AWS services (S3, Lambda, Glue, Airflow) and proficiency in Snowflake.
Advanced SQL skills and proficiency in at least one programming language (Python, Java, or Scala).
Strong understanding of data modeling, governance, ETL processes, and performance tuning in distributed environments.
Proven experience in leading and managing technical teams.
Excellent communication skills, with the ability to articulate complex concepts to both technical and non-technical stakeholders.
Ability to architect and design scalable, high-performance data solutions using technologies such as Snowflake, Airflow, and dbt.
Hands-on experience with Apache Airflow for orchestrating complex data workflows (see the sketch after this list).
Must be analytical, creative, and self-motivated.
Experience with the ETL tool dbt and advanced SQL, along with strong Python scripting, data modeling, and source-to-target mapping skills.
Excellent SQL coding skills.
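For illustration only, the sketch below shows roughly what orchestrating such a pipeline with Apache Airflow and dbt can look like. The DAG name, schedule, script path, and dbt project directory are hypothetical placeholders, not details of this role's actual platform.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Minimal illustrative DAG: land raw data, then build curated models with dbt.
with DAG(
    dag_id="daily_warehouse_refresh",  # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Ingestion step: placeholder for a loader (e.g. an AWS Glue job trigger
    # or a custom script) that lands raw data in the warehouse.
    load_raw = BashOperator(
        task_id="load_raw_data",
        bash_command="python /opt/pipelines/load_raw_data.py",  # hypothetical path
    )

    # Transformation step: run the dbt models that build the curated layer.
    run_dbt = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --project-dir /opt/dbt",  # hypothetical path
    )

    load_raw >> run_dbt

In practice the ingestion step would more likely use a provider operator (for example a Glue or Snowflake operator) rather than a bare shell command; the BashOperator is used here only to keep the sketch self-contained.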
RESPONSIBILITIES
Oversee the development and maintenance of the cloud data warehouse and its associated data pipelines.
Establish and enforce data architecture standards, ensuring long-term maintainability and alignment with best practices.
Monitor platform performance, identify potential bottlenecks, and implement solutions to optimize efficiency (see the monitoring sketch after this list).
Implement robust security protocols to safeguard sensitive information.
Work effectively within a global team environment. Conduct peer code reviews and provide technical mentorship to junior team members.
Contribute to a high-performing data engineering team, fostering a culture of continuous learning, innovation, and collaboration.
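As a rough illustration of the performance-monitoring responsibility above (not this team's actual tooling), the sketch below queries Snowflake's ACCOUNT_USAGE.QUERY_HISTORY view for the slowest queries of the last 24 hours; the account, credentials, and warehouse name are placeholders.

import os

import snowflake.connector

# Connect with placeholder credentials pulled from environment variables.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",  # hypothetical warehouse name
)

# Ten slowest queries in the last 24 hours: a quick way to spot bottlenecks.
SLOW_QUERIES_SQL = """
    SELECT query_id,
           warehouse_name,
           total_elapsed_time / 1000 AS elapsed_seconds,
           query_text
    FROM snowflake.account_usage.query_history
    WHERE start_time >= DATEADD('hour', -24, CURRENT_TIMESTAMP())
    ORDER BY total_elapsed_time DESC
    LIMIT 10
"""

try:
    cur = conn.cursor()
    for query_id, warehouse, elapsed_seconds, query_text in cur.execute(SLOW_QUERIES_SQL):
        print(f"{query_id} on {warehouse}: {elapsed_seconds:.1f}s  {query_text[:80]}")
finally:
    conn.close()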