Senior Data Ingestion Engineer at McAfee LLC
Remote, British Columbia, Canada
Full Time


Start Date

Immediate

Expiry Date

18 Mar 2025

Salary

Not specified

Posted On

15 Feb 2025

Experience

10+ years

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Good communication skills

Industry

Information Technology/IT

Description

ABOUT YOU:

  • Bring 10+ years of experience in data engineering, ingestion pipelines, and ETL/ELT
  • Hold a bachelor’s degree in computer science, engineering, statistics, or a related field
  • Have hands-on experience with, and an understanding of, the following (a minimal ingestion sketch follows this list):
      • Spark/Scala
      • SQL/SparkSQL
      • Python/PySpark or a similar programming language
      • Kong API and Kinesis
      • Databricks/AWS
      • Unity Catalog
      • ETL/ELT development, monitoring, and pipelining using tools such as Apache Airflow
      • Ingestion tools such as Dell Boomi
      • Data quality guidelines
      • CI/CD pipelines
      • Agile
      • Git and version control
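For context, a minimal PySpark sketch of this kind of Databricks ingestion work; the bucket, catalog, and table names are hypothetical placeholders, not details from this posting:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-ingestion").getOrCreate()

# Read raw JSON files landed by a (hypothetical) source system.
raw = spark.read.format("json").load("s3://example-landing-bucket/source_system/")

# Stamp ingestion metadata before writing to the bronze layer.
bronze = (
    raw.withColumn("_ingested_at", F.current_timestamp())
       .withColumn("_source_system", F.lit("source_system"))
)

# Append to a Delta table registered under a (hypothetical) Unity Catalog schema.
bronze.write.format("delta").mode("append").saveAsTable(
    "example_catalog.bronze.source_system_events"
)
```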


Responsibilities

ROLE OVERVIEW:

Are you an experienced data engineer ready to join our Enterprise Data Platform team as a Senior Data Ingestion Engineer? Reporting to the Senior Manager of Data Engineering, you’ll be a key member of the team, responsible for data engineering activities on the Databricks platform.

This role requires a deep understanding of:

  • Data Lake ingestion processes and best practices
  • ETL/ELT implementation
  • CI/CD
  • System integration tools
  • Data pipeline management

This is a remote position based in Canada. We will only consider candidates in Canada and are not offering relocation assistance at this time.

ABOUT THE ROLE:

  • Lead the Enterprise Data Platform Ingestion team, providing strategic guidance, mentorship, and leadership.
  • Ingest data from a variety of source systems and tailor ingestion approaches on a per-system basis.
  • Manage, maintain, and oversee ETL/ELT pipelines on the Databricks platform
  • Optimize data pipelines for scalability and speed
  • Document ingestion and integration flows and pipelines
  • Use Airflow to schedule and automate ingestion jobs (a minimal DAG sketch follows this list)
  • Manage metadata and master data in the technical data catalog
  • Ensure ETL/ELT designs meet required security and compliance guidelines, including PII management, flagging, and risk assessment during ingestion
  • Maintain ETL/ELT pipeline infrastructure and implement automated monitoring strategies
  • Ensure adherence to SDLC best practices
  • Demonstrate strong knowledge of SQL and advanced SQL, including window functions (a window-function sketch follows this list)
  • Ingest product data into AWS in real time, demonstrating end-to-end data pipeline development (see the streaming sketch after this list)
  • Experience in the product subscription model domain is a plus
  • Knowledge of Kong API and Kinesis is a plus
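By way of illustration, a minimal Airflow sketch of the scheduling work above, assuming Airflow 2.4+; the DAG id and the trigger logic are hypothetical placeholders:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def trigger_ingestion(ds=None, **_):
    # Placeholder: in practice this might call a Databricks job or an
    # ingestion service; here it only logs the logical run date.
    print(f"Triggering ingestion for {ds}")


with DAG(
    dag_id="example_daily_ingestion",  # hypothetical DAG id
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older releases use schedule_interval
    catchup=False,
) as dag:
    PythonOperator(
        task_id="trigger_ingestion",
        python_callable=trigger_ingestion,
    )
```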
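A hedged SparkSQL sketch of the window-function work mentioned above, keeping only the latest record per key when deduplicating ingested data; table and column names are hypothetical:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# ROW_NUMBER over a per-customer window, ordered by recency, marks the
# newest row as rn = 1; filtering on it deduplicates the table.
latest = spark.sql("""
    SELECT *
    FROM (
        SELECT *,
               ROW_NUMBER() OVER (
                   PARTITION BY customer_id
                   ORDER BY updated_at DESC
               ) AS rn
        FROM example_catalog.bronze.subscriptions
    ) t
    WHERE t.rn = 1
""")
latest.show()
```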
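And a rough Structured Streaming sketch of real-time ingestion from Kinesis, assuming the Kinesis source available on Databricks; the stream name, region, checkpoint path, and target table are hypothetical:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Subscribe to a (hypothetical) Kinesis stream of product events.
events = (
    spark.readStream
    .format("kinesis")
    .option("streamName", "example-product-events")
    .option("region", "us-west-2")
    .option("initialPosition", "latest")
    .load()
)

# Kinesis records carry a binary `data` payload; decode it before landing.
decoded = events.selectExpr(
    "CAST(data AS STRING) AS payload",
    "approximateArrivalTimestamp",
)

# Continuously append to a bronze Delta table with checkpointed progress.
query = (
    decoded.writeStream
    .format("delta")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/product_events/")
    .outputMode("append")
    .toTable("example_catalog.bronze.product_events")
)
```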