Data Architect at CPI Security
Charlotte, NC 28273, USA
Full Time


Start Date

Immediate

Expiry Date

01 Aug, 25

Salary

0.0

Posted On

01 May, 25

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Computer Science, Information Systems, Data Engineering

Industry

Information Technology/IT

Description

CPI Security, a national leader in residential and commercial security solutions, is seeking a Data Architect to join us on our data transformation journey. This is an exciting, hands-on opportunity to implement a cloud data platform at a company that has fully embraced the Snowflake platform. This role will work directly with line-of-business leaders and technical users to design and implement our cloud data warehouse. CPI will leverage the data cloud in our data warehouse, machine learning, and AI journeys. The ideal candidate will have extensive experience designing, building, and implementing data warehouses in the cloud. This is an on-site position at our HQ in Charlotte, NC.

OTHER EXPERIENCE REQUIRED

  • Bachelor’s degree in Information Systems, Computer Science, or related field of study preferred or work experience equivalent
  • A minimum of 5 years of experience with Snowflake and configuring declarative solutions
  • Snowflake certifications in data engineering and/or architecture are desired.

Responsibilities

  • Cloud migration: Play an integral role in planning, designing, and implementing the data and digital migration.
  • Modernize finance data systems: Support the data migration and modernization of the ERP system; redesign data structures and build new finance data processes, analytics models, and reporting capabilities.
  • External data integration: Integrate and operationalize data from external systems such as CRM and IVR platforms via secure cloud data sharing, CDC, and APIs.
  • Modernize BI & Analytics: Enhance and streamline reporting and analytics capabilities using leading BI and dashboarding platforms that rely on timely, accurate data.
  • Data Modeling: Apply best practices for data modeling; use knowledge of Data Vault 2.0, Kimball, Inmon, and/or document modeling to build scalable, audit-friendly, business-aligned data structures.
  • Build & Automate Pipelines: Design and implement data transformation models with tools such as VS Code, dbt, and SQL, and build ingestion/processing pipelines in Python.
  • Cloud Data Engineering: Understand the modern data engineering experience and agile practices to deliver data solutions. Utilize cloud-native services to enable scalable, secure, and efficient data processing.
  • DataOps - Enable reliable, scalable, and automated data workflows by implementing DataOps best practices for continuous integration, testing, deployment, and monitoring across the data pipeline lifecycle.