JOB DESCRIPTION:
We are seeking talented and versatile Data Engineer(s) to join our dynamic team. The ideal candidate(s) will have a strong foundation in data engineering practices, combined with the analytical skills necessary to derive actionable insights from data. This role involves designing, implementing, and maintaining robust data pipelines and architectures, as well as performing detailed data analysis to support business decisions.
Scope of Services
The Data Engineer(s) will be required on a full-time basis, working across two (2) to three (3) projects. Time, location and frequency of work will vary depending on the needs of the particular project. Over each term, the Data Engineer(s) are expected to work a maximum of 1,960 hours, unless otherwise agreed upon with the Province. The Data Engineer(s) may, however, be required to work fewer or more hours depending on the nature and needs of the work, as directed by the Province.
Services and project deliverables should evolve as the work progresses, in response to emerging user and business needs, as well as design and technical opportunities. However, the following must be delivered (iteratively) over the course of the project:
Data Engineering:
- Design, build, and maintain data pipelines on-premises and in the cloud (Azure, GCP, AWS) to ingest, transform, and store large datasets. Ensure pipelines are reliable and support multiple business use cases.
- Create and optimize dimensional models (star/snowflake) to improve query performance and reporting. Ensure models are consistent, scalable, and easy for analysts to use.
- Integrate data from SQL, NoSQL, APIs, and files while maintaining accuracy and completeness. Apply validation checks and monitoring to ensure high-quality data.
- Improve ETL/ELT processes for efficiency and scalability. Redesign workflows to remove bottlenecks and handle large, disparate datasets.
- Build and maintain end-to-end ETL/ELT pipelines with SSIS and Azure Data Factory. Implement error handling, logging, and scheduling for dependable operations.
- Automate deployment, testing, and monitoring of ETL workflows through CI/CD pipelines. Integrate releases into regular deployment cycles for faster, safer updates.
- Manage data lakes and warehouses with proper governance. Apply security best practices, including access controls and encryption.
- Partner with engineers, analysts, and stakeholders to translate requirements into solutions. Prepare curated data marts and fact/dimension tables to support self-service analytics.
Data Analytics:
- Analyze datasets to identify trends, patterns, and anomalies. Use statistical methods, DAX, Python, and R to generate insights that inform business strategies.
- Develop interactive dashboards and reports in Power BI using DAX for calculated columns and measures. Track key performance metrics, share service dashboards, and present results effectively.
- Build predictive or descriptive models using statistical methods and Python- or R-based machine learning techniques. Design and integrate data models to improve service delivery.
- Present findings to non-technical audiences in clear, actionable terms. Translate complex data into business-focused insights and recommendations.
- Deliver analytics solutions iteratively in an Agile environment. Mentor teams to enhance analytics fluency and support self-service capabilities.
- Provide data-driven evidence to guide corporate priorities. Ensure strategies and initiatives are backed by strong analysis, visualizations, and models.
Job Type: Fixed term contract
Contract length: 24 months
Pay: $70.00-$85.00 per hour
Expected hours: 36.25 per week
Work Location: In person