Senior Software Engineer at NielsenIQ
Chennai, Tamil Nadu, India
Full Time


Start Date

Immediate

Expiry Date

20 May, 26

Salary

0.0

Posted On

19 Feb, 26

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Apache Spark, Scala, Python, PostgreSQL, Azure Databricks, Data Pipelines, Data Processing, Azure Services, ADLS, Key Vault, Git, CI/CD, DevOps, Data Modeling, Airflow

Industry

Software Development

Description
Company Description

NielsenIQ is a global measurement and data analytics company that provides the most complete and trusted view available of consumers and markets worldwide. We provide consumer packaged goods manufacturers and retailers with accurate, actionable information and insights, and a complete picture of the complex and changing marketplace that companies need to innovate and grow. Our approach marries proprietary Nielsen data with other data sources to help clients around the world understand what’s happening now, what’s happening next, and how to best act on this knowledge. We like to be in the middle of the action. That’s why you can find us at work in over 90 countries, covering more than 90% of the world’s population. For more information, visit www.niq.com.

Job Description

We are looking for an experienced Senior Data Engineer with strong expertise in Apache Spark, Scala, Python, PostgreSQL, and Azure Databricks. In this role, you will architect, build, and enhance large-scale data pipelines and analytical solutions that drive actionable business insights. If you are passionate about designing scalable data systems and thrive in a fast-paced engineering environment, we invite you to join our Chennai Technology Hub.

Responsibilities

- Design, develop, and optimize scalable data solutions using Apache Spark and Scala on Azure Databricks.
- Write clean, modular, production-grade Python and Scala code for data processing and transformation.
- Follow and implement best practices for code quality, modular development, and performance tuning.
- Develop and maintain Databricks assets, including notebooks, jobs, and workflows.
- Work extensively with Azure services such as ADLS and Key Vault.
- Collaborate with cross-functional Agile teams to deliver high-quality engineering solutions.
- Design, develop, and maintain PostgreSQL databases for application and analytical workloads.
- Use Git and other configuration management tools effectively.
Qualifications

- Bachelor’s or Master’s degree in Computer Science or a related field.
- 6-10+ years of professional experience as a Data Engineer.
- Hands-on experience with Apache Spark, including performance tuning.
- Strong programming expertise in Scala and Python.
- Proficiency in SQL, with hands-on experience in PostgreSQL.
- Extensive experience with Azure Databricks and the Azure data ecosystem.
- Experience with CI/CD pipelines, Git, DevOps practices, and automated testing.
- Solid understanding of data modeling, distributed systems, and big data processing.
- Hands-on experience with Airflow.

Additional Information

You are passionate about coding, highly adaptable, and eager to continuously learn. You possess strong analytical, problem-solving, and communication skills. You excel in collaborative team environments and can work with multiple stakeholders to build high-quality applications.

Our Benefits

- Flexible working environment
- Volunteer time off
- LinkedIn Learning
- Employee Assistance Program (EAP)

NIQ may use artificial intelligence (AI) tools at various stages of the recruitment process, including résumé screening, candidate assessments, interview scheduling, job matching, communication support, and certain administrative tasks that help streamline workflows. These tools are intended to improve efficiency and support fair, consistent evaluation based on job-related criteria. All use of AI is governed by NIQ’s principles of fairness, transparency, human oversight, and inclusion. Final hiring decisions are made exclusively by humans. NIQ regularly reviews its AI tools to help mitigate bias and ensure compliance with applicable laws and regulations. If you have questions, require accommodations, or wish to request human review where permitted by law, please contact your local HR representative. For more information, please visit NIQ’s AI Safety Policies and Guiding Principles: https://www.nielseniq.com/global/en/ai-safety-policies.
About NIQ

NIQ is the world’s leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights, delivered with advanced analytics through state-of-the-art platforms, NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world’s population. For more information, visit NIQ.com.

Want to keep up with our latest updates? Follow us on: LinkedIn | Instagram | Twitter | Facebook

Our commitment to Diversity, Equity, and Inclusion

At NIQ, we are steadfast in our commitment to fostering an inclusive workplace that mirrors the rich diversity of the communities and markets we serve. We believe that embracing a wide range of perspectives drives innovation and excellence. All employment decisions at NIQ are made without regard to race, color, religion, sex (including pregnancy, sexual orientation, or gender identity), national origin, age, disability, genetic information, marital status, veteran status, or any other characteristic protected by applicable laws. We invite individuals who share our dedication to inclusivity and equity to join us in making a meaningful impact. To learn more about our ongoing efforts in diversity and inclusion, please visit: https://nielseniq.com/global/en/news-center/diversity-inclusion

Team: Technology
Responsibilities
The role involves designing, developing, and optimizing scalable data solutions using Apache Spark and Scala on Azure Databricks to drive actionable business insights. Responsibilities include writing production-grade Python and Scala code, maintaining Databricks assets, and working extensively with Azure services like ADLS and Key Vault.