Data and BI Architect

at KPI Digital

Montréal, QC, Canada

Start Date: Immediate
Expiry Date: 21 Dec, 2024
Salary: Not Specified
Posted On: 25 Sep, 2024
Experience: N/A
Skills: PowerDesigner, Analytics, SIPOC, Erwin, Hive, Subject Matter Experts, IT, Spark, Snowflake, Embarcadero, Dimensional Modeling
Telecommute: No
Sponsor Visa: No

Description:

Join a team immersed in a high-tech, dynamic environment where projects and workdays are never the same. We are specialists in data analysis and innovators in digital transformation. We are modelers and practitioners of AI and machine learning.
For 20 years, we’ve loved telling the story of how data makes businesses smarter, and we all share an obsession with customer service and a passion for changing the status quo.
KPI Digital is looking for a Data and BI Architect specialized in data and BI using Azure components, who will provide consulting services to one of our major strategic clients, a company among the top 500.

REQUIRED QUALIFICATIONS AND PROFILE (RELEVANT TO THE PROJECT)

The Data & Analytics group is looking for a Senior Data & BI Architect who will provide consulting services to one of our customers. The role can be filled as a mandate (contractor) or as an employee. The project starts November 22nd, 2021 (with the possibility of multiple extensions). The ideal candidate:

  • Is a senior Data Architect who can rapidly develop a detailed and comprehensive conceptual data model
  • Is a senior BI Architect who can transpose the conceptual data model into dimensional models and go further by physicalizing those dimensional models on Azure Data Lake Store and related DB engines such as Synapse, Hive (potentially within Spark), and others as required. The role is also expected to help the visualization specialist, specifically with the dimensional metadata layer and most likely with the wireframes.
  • The Data Architect will work with the Lead Architect, who will facilitate work sessions with subject matter experts and model live, in real time, with the group. The Lead Architect has done hundreds of projects and modeled more than 6,000 entities to date, so the working sessions will proceed at a fast pace. The expectation is for this role to pick up where the Lead Architect stops and to participate actively in the working sessions. In addition, since we want speed, we want the Data Architect to also create the BRD (Business Requirements Document) and have it reviewed directly with the SMEs. This is the only way to have both a conceptual enterprise data model and a BRD, in the same business language, literally days after the last session.
  • The Data Architect can do ER-style modeling for the Integration Zone of the Cold Layer of the Lakehouse, and can do dimensional modeling with sophisticated dimensions (multiple hierarchies) for self-serve reporting and as the basis for the Consumption Zone of the Lakehouse
  • The Data and BI Architect will also work with the Lead Architect to define all messages required from source systems to be sent to the Lakehouse through a push architecture (usually JSON). Messages will be sent to Event Hubs and then moved to the Cold layer using Databricks streaming (which could also be set to micro-batches via a parameter). So, three zones will be defined (raw, integrated, and consumption). The consumption zone is replicated to the serving layer for speed, where we will most likely use Synapse Analytics (or Snowflake)
  • This is a real-time system for both the hot and cold layers, hence the Lakehouse. Power BI will connect to the serving layer, presumably in import mode for full functionality, ideally with the Premium edition to remove the data-volume constraint of the Power BI Pro version.
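As a rough, hypothetical sketch of the push architecture described above (the event fields, zone functions, and names here are invented for illustration, not part of the actual project spec), a source-system JSON message would land verbatim in the raw zone, be conformed to the enterprise model in the integrated zone, and be reshaped dimensionally for the consumption zone:

```python
import json

# Hypothetical JSON message a source system might push toward Event Hubs.
raw_event = json.dumps({
    "eventType": "SaleCreated",
    "sourceSystem": "ERP",
    "payload": {"orderId": "SO-1001", "customerId": "C-42",
                "amount": 1250.00, "currency": "CAD"},
    "emittedAt": "2024-09-25T14:03:00Z",
})

def land_raw(message: str) -> dict:
    """Raw zone: persist the message as-is; only parse, never transform."""
    return {"zone": "raw", "body": json.loads(message)}

def integrate(raw: dict) -> dict:
    """Integrated zone: conform to the ER-style enterprise model (cold layer)."""
    p = raw["body"]["payload"]
    return {"zone": "integrated",
            "order_id": p["orderId"],
            "customer_id": p["customerId"],
            "amount": p["amount"]}

def consume(integrated: dict) -> dict:
    """Consumption zone: shape into a dimensional (star-schema) fact row."""
    return {"zone": "consumption",
            "fact_sales": {"order_key": integrated["order_id"],
                           "customer_key": integrated["customer_id"],
                           "sales_amount": integrated["amount"]}}

# One event flowing raw -> integrated -> consumption.
fact = consume(integrate(land_raw(raw_event)))
```

In the real pipeline these steps would run as Databricks streaming jobs rather than in-process functions; the sketch only shows the zone-by-zone shape of the data.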

THE FOLLOWING EXPERIENCE/SKILLS ARE RELEVANT TO THE PROJECT MANDATE (OR FOR EMPLOYMENT)

  • Strong understanding of business processes and objectives, combined with their relationships to data (SIPOC, CRUD, etc.)
  • Experience using Data Modeling tools such as PowerDesigner (selected for the project) or others such as Erwin, Embarcadero and similar
  • Experience with AZDLS, ADF, Synapse, Databricks, and Event Hubs, mainly at the level of overall understanding rather than from a deep technical angle; essentially, the candidate has to understand the role of each component in order to interface well with the Data Engineers (dev team)
  • Able to understand rapidly the patterns and directions taken by the Practice Lead (former chief architect)
  • Able to zero in on ambiguous concepts and make them crystal clear rapidly
  • Pragmatic (considers constraints), but not to the point of faulty designs/concepts
  • Ability to discern fundamental and key aspects worth ‘fighting for’ versus the less ‘important’ ones

Responsibilities:

Responsibilities for the Data and BI Architect – on Azure

  • Ensure that all models (conceptual to physical) are complete, coherent, and consistent
  • Ensure that the metamodel layer is aligned with the intent of the dimensional model, which is to support the production of the ‘Dynamic Dashboards’ covering sales metrics across complex multi-party companies (manufacturers, big-box retailers, architects, contractors, end customers, specialized stores, and so on)
  • Define a comprehensive BRD and get feedback and approvals from the customers’ SMEs and the Lead Architect
  • Help the data engineers, via specs and guidance, take the data from ingestion through transformation toward the Cold Zone
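To make the “sophisticated dimensions (multiple hierarchies)” requirement concrete, here is a minimal hypothetical sketch (the dimension attributes and function are invented for illustration) of a single dimension row carrying two independent rollup hierarchies, and how a self-serve report might aggregate along either one:

```python
# Hypothetical product dimension row with two independent hierarchies:
# a merchandising hierarchy (category -> subcategory -> product) and a
# party/channel hierarchy (party type -> account -> product listing).
dim_product = {
    "product_key": 7,
    "product_name": "Thermal Window Frame",
    # Hierarchy 1: merchandising
    "category": "Windows",
    "subcategory": "Frames",
    # Hierarchy 2: sales channel / party
    "party_type": "Big Box",
    "account": "BuildMart",
}

def rollup(rows, levels):
    """Aggregate fact amounts up the requested hierarchy levels."""
    totals = {}
    for row in rows:
        key = tuple(row["dim"][lvl] for lvl in levels)
        totals[key] = totals.get(key, 0) + row["amount"]
    return totals

facts = [{"dim": dim_product, "amount": 100.0},
         {"dim": dim_product, "amount": 50.0}]

by_category = rollup(facts, ["category"])            # merchandising hierarchy
by_party = rollup(facts, ["party_type", "account"])  # channel hierarchy
```

In the actual project the equivalent rollups would be expressed in the dimensional metadata layer consumed by Power BI, not in application code; the point is only that one conformed dimension can serve several drill paths.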


REQUIREMENT SUMMARY

Min: N/A, Max: 5.0 year(s)

Information Technology/IT

Analytics & Business Intelligence

Software Engineering

Graduate

Proficient

1

Montréal, QC, Canada