Analytics Engineer at Kenvue
Bangalore, Karnataka, India - Full Time


Start Date

Immediate

Expiry Date

14 Jun, 26

Salary

0.0

Posted On

16 Mar, 26

Experience

2 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Microsoft Fabric, Data Pipelines, Lakehouse, Warehouse Management, PySpark, Python, SQL, Delta Lake, Parquet, Power BI, DAX, Git, CI/CD, Azure Networking, Entra ID, Data Governance

Industry

Personal Care Product Manufacturing

Description
Kenvue is currently recruiting for an: Analytics Engineer

What we do
At Kenvue, we realize the extraordinary power of everyday care. Built on over a century of heritage and rooted in science, we’re the house of iconic brands - including NEUTROGENA®, AVEENO®, TYLENOL®, LISTERINE®, JOHNSON’S® and BAND-AID® - that you already know and love. Science is our passion; care is our talent.

Who We Are
Our global team is ~22,000 brilliant people with a workplace culture where every voice matters and every contribution is appreciated. We are passionate about insights and innovation, and committed to delivering the best products to our customers. With expertise and empathy, being a Kenvuer means having the power to impact millions of people every day. We put people first, care fiercely, earn trust with science and solve with courage - and have brilliant opportunities waiting for you! Join us in shaping our future - and yours.

Role reports to: Manager
Location: Asia Pacific, India, Karnataka, Bangalore
Work Location: Hybrid

What you will do

Job Overview
We are looking for an Analytics Engineer with 4+ years of experience to help build and optimize our next-generation data platform using Microsoft Fabric. You will be responsible for designing scalable data pipelines, managing Lakehouses and Warehouses, and ensuring our data architecture supports a "Plug-and-Play" model for downstream business projects. The ideal candidate is a hands-on engineer who can translate complex data requirements into efficient, governed, and reusable data assets.

Key Responsibilities

1. Fabric Ecosystem Engineering
· End-to-End Pipeline Development: Design and implement data ingestion and transformation workflows using Fabric Data Factory (Pipelines & Dataflows Gen2).
· Lakehouse & Warehouse Management: Architect and maintain OneLake storage structures, including Lakehouse (Delta Parquet) and Data Warehouse environments, to ensure a single source of truth.
· KPI Hub Integration: Contribute to the development of the centralized KPI Hub by building reusable datasets that reduce data redundancy across the enterprise.

2. Data Modeling & Transformation
· Advanced Spark/Python: Utilize Fabric Notebooks and PySpark for complex data engineering tasks, optimization, and large-scale processing.
· Semantic Layer Design: Build and optimize Power BI Semantic Models (Direct Lake mode) to provide high-performance reporting without data duplication.
· Data Quality & Governance: Implement automated data validation checks and metadata management to ensure compliance with GxP, SOX, and PII standards.

3. Automation & Platform Operations
· Fabric CI/CD: Implement version control and deployment automation for Fabric items using Git integration and deployment pipelines.
· Cost & Performance Optimization: Monitor capacity utilization and optimize queries and storage to keep the platform cost-efficient.
· Technical Documentation: Create detailed data lineage maps, schema definitions, and deployment runbooks to support team collaboration.

Required Qualifications
· Experience: 4+ years of total experience in Data Engineering, with at least 1 year of hands-on experience in Microsoft Fabric or deep expertise in the Azure Data Stack (Lakehouse, Data Factory, Databricks).
· Technical Proficiency:
o Strong SQL and PySpark skills for data manipulation.
o Hands-on experience with Delta Lake and Parquet formats.
o Experience with Power BI and DAX is highly preferred.
· Cloud Knowledge: Firm understanding of Azure networking, security (Managed Identities, Service Principals), and Entra ID (Azure AD) integration.
· Problem Solving: Ability to work in a fast-paced environment and deliver scalable solutions that align with a broader Solution Architecture.

Desired Qualifications
· Certification: DP-600 (Microsoft Fabric Analytics Engineer Associate) is a significant plus.
· Architectural Mindset: Familiarity with broader solution architecture principles is a strong plus.

If you are an individual with a disability, please check our Disability Assistance page for information on how to request an accommodation.

At Kenvue, we foster a culture of belonging. In 2024, Kenvue was selected as one of Seramount’s 100 Best Companies for working parents and caregivers, recognizing our ongoing commitment to providing inclusive benefits for Kenvuers and families. Our community attracts curious and collaborative team members motivated by always striving to improve. Our team offers you plenty of opportunities to deepen your existing expertise and build on new skills by broadening your exposure outside of your own category. By working across the business, we harness the power of data and technology in new ways to better understand human insights and drive better health outcomes.
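To illustrate the kind of automated data-validation check mentioned under Data Quality & Governance, here is a minimal sketch in plain Python. All field names (order_id, customer_email, etc.) are hypothetical; a real Fabric implementation would run this logic as a PySpark job over Delta tables rather than over Python dicts:

```python
# Minimal data-validation sketch (hypothetical schema; illustrative only).
# Flags missing required fields, non-numeric amounts, and unmasked PII.

REQUIRED_FIELDS = {"order_id", "order_date", "amount"}  # assumed schema
PII_FIELDS = {"customer_email"}                         # fields to flag

def validate_row(row: dict) -> list:
    """Return a list of human-readable issues for one record."""
    issues = []
    missing = REQUIRED_FIELDS - row.keys()
    if missing:
        issues.append("missing fields: %s" % sorted(missing))
    if "amount" in row and not isinstance(row["amount"], (int, float)):
        issues.append("amount is not numeric")
    exposed = PII_FIELDS & row.keys()
    if exposed:
        issues.append("unmasked PII fields: %s" % sorted(exposed))
    return issues

rows = [
    {"order_id": 1, "order_date": "2026-03-16", "amount": 19.99},
    {"order_id": 2, "amount": "oops", "customer_email": "a@b.com"},
]
report = {r.get("order_id"): validate_row(r) for r in rows}
```

Here the first record passes cleanly while the second surfaces three issues, which is the shape of output a validation step would feed into metadata management or alerting.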
Responsibilities
The role involves designing and implementing end-to-end data ingestion and transformation workflows within the Microsoft Fabric ecosystem, focusing on Lakehouse and Data Warehouse management. Key tasks include building reusable datasets for a centralized KPI Hub and utilizing advanced Spark/Python for complex data engineering.