Data Engineer – Specialist at Carrier
Bengaluru, Karnataka, India
Full Time


Start Date

Immediate

Expiry Date

03 Jun, 26

Salary

0.0

Posted On

05 Mar, 26

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

SQL, Python, Data Modelling, Snowflake, AWS, GCP, Ataccama, Atlan, Data Governance, Data Pipelines, ETL, Dimensional Modelling, CI/CD, DevOps, Infrastructure-as-Code, Performance Tuning

Industry

Wholesale Building Materials

Description
Role: Data Engineer - Specialist
Location: Bangalore, India
Full/Part-time: Full time

Build a career with confidence

Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do.

About the role:

The Data Engineer is a highly technical, hands-on senior individual contributor responsible for building scalable, enterprise-grade data pipelines and reusable data products across the Snowflake, AWS, and GCP platforms. The role demands advanced skills in SQL and Python, hands-on data modelling, and experience with data governance tools such as Ataccama and Atlan. The engineer will actively build, optimise, debug, and manage production-scale data pipelines, integrating over 70 ERP systems (SAP ECC, Atlas, JDE, Baan, etc.). The ideal candidate is passionate about deep data engagement, solving complex modelling problems, and writing high-performance code at scale.

Key Responsibilities:

Data Engineering & Pipeline Development
- Design and build scalable ingestion, transformation, and serving pipelines.
- Develop highly optimised, advanced SQL transformations (heavy daily usage).
- Build Python-based data processing frameworks and reusable components.
- Work extensively across the Snowflake, AWS, and GCP ecosystems.
- Handle ingestion and harmonisation from 70+ ERP systems.
- Optimise large-scale joins, aggregations, and complex transformation logic.
- Ensure performance, scalability, and cost efficiency.

Advanced Data Modelling
- Design and implement conceptual, logical, and physical data models.
- Build enterprise-grade master data and finance data models.
- Develop reusable, consumption-ready datasets.
- Apply dimensional modelling and modern data platform patterns.
- Align models with data product and enterprise architecture principles.
- Optimise modelling decisions for performance in Snowflake and cloud platforms.
This role requires hands-on modelling, not just conceptual knowledge.

Technical Ownership & Deep Engineering
- Own the end-to-end development lifecycle of pipelines and models.
- Perform coding, debugging, refactoring, and optimisation.
- Address complex data inconsistencies across fragmented ERP systems.
- Write production-grade, maintainable SQL and Python code.
- Actively troubleshoot performance bottlenecks and transformation issues.

Data Quality, Governance & Metadata
- Implement and manage data quality rules using Ataccama (required experience).
- Work with Atlan or similar data catalog/governance tools (required).
- Ensure lineage, metadata documentation, and governance standards.
- Embed validation, monitoring, and observability into pipelines.
- Support enterprise data standardisation initiatives.

Business Collaboration & Requirement Translation
- Partner with Finance and Corporate stakeholders.
- Understand KPIs, business definitions, and data nuances.
- Translate business logic into scalable SQL transformations and models.
- Challenge ambiguous requirements using data reasoning.

Engineering Excellence & Platform Collaboration
- Work closely with Tech Leads, architects, and platform teams.
- Follow CI/CD, DevOps, and Infrastructure-as-Code practices.
- Adopt framework-first and reusable design principles.
- Continuously improve data product quality and engineering standards.

Required Qualifications
- 7–12 years of strong data engineering experience.
- Advanced SQL expertise (mandatory).
- Strong Python programming skills.
- Hands-on data modelling expertise.
- Experience with Snowflake.
- Strong experience with AWS and GCP.
- Experience with Ataccama.
- Experience with Atlan or similar data catalog/governance tools.
- Proven experience building and optimising large-scale data pipelines.
- Experience handling complex, multi-source enterprise data environments.
- Strong understanding of reusable data product design.
- Experience working in modern data platform or Lakehouse environments.
- Demonstrated experience constructing and optimising complex transformations at scale.

Preferred Qualifications
- ERP exposure (SAP ECC, JDE, Baan, etc.).
- Master Data or Finance domain experience.
- Experience supporting AI/ML data preparation use cases.
- Strong performance-tuning experience in Snowflake.
- Exposure to automation and advanced pipeline orchestration patterns.

Benefits
We are committed to offering competitive benefits programs for all of our employees and enhancing our programs when necessary.
- Enjoy your best years with our retirement savings plan.
- Have peace of mind and body with our health insurance.
- Make yourself a priority with flexible schedules, parental leave and our holiday purchase scheme.
- Drive your career forward through professional development opportunities.
- Achieve your personal goals with our Employee Assistance Programme.

Our commitment to you
Our greatest assets are the expertise, creativity and passion of our employees. We strive to provide a great place to work that attracts, develops and retains the best talent, promotes employee engagement, fosters teamwork and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine of growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback and always challenging ourselves to do better. This is The Carrier Way.

Join us and make a difference. Apply Now!

Carrier is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age or any other federally protected class.

Job Applicant's Privacy Notice: Click on this link to read the Job Applicant's Privacy Notice.

At Carrier we make modern life possible by delivering groundbreaking systems and services that help homes, buildings and shipping become safer, smarter and more sustainable. We exceed the expectations of our customers by anticipating industry trends and working tirelessly to master and revolutionise them. Our team of approximately 56,000 dedicated individuals continues to mold industry standards by pursuing the latest research and developments to improve the lives of our customers. We're constantly growing, seeking out talented, like-minded people who are committed to our primary duty: to be the world's first choice in security, shipping and HVAC technology.
Responsibilities
The Data Engineer will be responsible for designing, building, and optimizing scalable, enterprise-grade data ingestion, transformation, and serving pipelines across Snowflake, AWS, and GCP ecosystems, integrating data from over 70 ERP systems. Key tasks include writing high-performance SQL and Python code, developing complex data models, and ensuring data quality and governance using tools like Ataccama and Atlan.
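To give candidates a concrete feel for the ERP-harmonisation work described above, here is a minimal, hypothetical Python sketch of conforming customer records from two source systems into one shared schema. The source systems, field mappings, and `Customer` schema are illustrative assumptions, not details from this posting; production pipelines at this scale would run in SQL/Snowflake or a framework, not plain Python.

```python
# Hypothetical sketch: conforming customer records from two ERP extracts
# into a shared schema. All field names and mappings are illustrative.
from dataclasses import dataclass


@dataclass(frozen=True)
class Customer:
    source_system: str  # e.g. "SAP_ECC", "JDE"
    source_id: str      # key in the source ERP
    name: str
    country: str


def conform_customers(extracts: dict[str, list[dict]]) -> list[Customer]:
    """Harmonise per-source extracts into one conformed customer list.

    Each source uses its own field names; a per-source mapping translates
    them into the shared schema, deduplicating on (source_system, source_id).
    """
    field_maps = {
        "SAP_ECC": {"id": "KUNNR", "name": "NAME1", "country": "LAND1"},
        "JDE":     {"id": "AN8",   "name": "ALPH",  "country": "CTR"},
    }
    seen: set[tuple[str, str]] = set()
    conformed: list[Customer] = []
    for system, rows in extracts.items():
        fmap = field_maps[system]
        for row in rows:
            key = (system, str(row[fmap["id"]]))
            if key in seen:  # drop duplicate source records
                continue
            seen.add(key)
            conformed.append(Customer(
                source_system=system,
                source_id=key[1],
                name=row[fmap["name"]].strip().upper(),
                country=row[fmap["country"]].strip().upper(),
            ))
    return conformed


extracts = {
    "SAP_ECC": [{"KUNNR": "1001", "NAME1": "Acme Corp ", "LAND1": "in"}],
    "JDE":     [{"AN8": 7, "ALPH": "Beta Ltd", "CTR": "US"},
                {"AN8": 7, "ALPH": "Beta Ltd", "CTR": "US"}],  # duplicate
}
customers = conform_customers(extracts)
```

The real role spans 70+ such systems, where this per-source mapping pattern is typically driven by configuration rather than hard-coded dictionaries.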