Data Engineer at Entain
Telangana, India - Full Time


Start Date

Immediate

Expiry Date

07 Apr, 26

Salary

0.0

Posted On

07 Jan, 26

Experience

2 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

SQL, Python, Snowflake, BigQuery, Databricks, Google Analytics, Posthog, Qualtrics, Mixpanel, DBT, Airflow, Git, Data Quality Validation, Data Governance, Machine Learning, Docker

Industry

Entertainment Providers

Description
Company Description

We’re Entain. Our vision is to be the world leader in sports betting and gaming entertainment by creating the most exciting and trusted experience for our customers, revolutionizing the gambling space as we go. We're home to a global family of more than 25 well-known brands, and with a focus on sustainability and growth, we will transform our sector for our players, for ourselves and for the industry.

Job Description

Data intelligence and AI play a transformational role in how we create value today and how we shape our future. We have a bold ambition to be a world-class data- and AI-driven business for our customers across all our regions and business functions. This is a hands-on engineering role within our ASE Data & AI function. You will support cross-functional teams by building and maintaining data infrastructure that enables efficient analysis, insights generation, and generative AI and ML applications.

The purpose of this role is to develop and maintain robust data pipelines and systems that serve our diverse data consumers across the business, spanning our brands in the Americas and Southern Europe. By leveraging your data engineering skills, you'll enable data-driven decision making throughout the organization, supporting business growth and operational excellence.

Key Responsibilities

- Develop and test data pipelines and ETL processes to support the ASE data consumer functions, ensuring data flows efficiently and is accessible for reporting, analytics, and AI teams.
- Implement integrations with external and internal data sources, working with established connectors and APIs to collect, process, and store datasets.
- Work with Snowflake (or an equivalent cloud data warehousing platform) to organize and optimize data storage and retrieval for marketing analytics.
- Use DBT to create reliable, maintainable data workflows that support the team's analytics needs.
- Monitor data quality and implement processes to ensure data integrity throughout the pipeline.
- Participate in code reviews and contribute to best practices in data engineering within the team.
- Document data pipelines, models, and processes to ensure knowledge sharing across the team.
- Troubleshoot and resolve data-related issues as they arise, working with cross-functional teams when necessary.
- Stay current with industry tools and practices in data engineering, suggesting improvements to existing processes when appropriate.

Qualifications

Essential:

- Bachelor's degree in Computer Science, Software Engineering, or a related field, or equivalent practical experience
- Proficiency in SQL and Python programming for data manipulation, transformation, and pipeline development
- Experience with Snowflake, BigQuery, Databricks, or equivalent cloud data warehousing platforms
- Experience working with behavioral datasets (Google Analytics, Posthog, Qualtrics, Mixpanel, etc.)
- Familiarity with data visualization tools (e.g., Power BI, Tableau)
- Familiarity with modern data pipeline orchestration tools such as DBT, Airflow, or Composer
- Familiarity with version control systems such as Git
- Familiarity with data quality validation, testing, or monitoring frameworks
- Problem-solving mindset with strong attention to detail

Desired:

- Understanding of data modeling concepts and best practices for analytics
- Understanding of data governance principles and practices
- Understanding of how genAI models can be used in data engineering
- Experience with event-driven or streaming data architectures (e.g., Kafka, Kinesis)
- Experience with containerization and orchestration tools (Docker, Kubernetes)
- Experience with CI/CD practices for data workflows (e.g., GitHub Actions, GitLab, Jenkins)
- Experience using modern observability tools for data reliability (e.g., Monte Carlo, Great Expectations)
- Basic understanding of machine learning workflows and how data engineering supports them

Additional Information

#LI-Hybrid

We’re looking for someone who:

- Approaches complex technical challenges with creativity and persistence
- Communicates technical concepts clearly to both technical and non-technical stakeholders
- Demonstrates adaptability when working with emerging technologies and changing requirements
- Takes initiative to improve system performance and identify innovative solutions
- Balances technical excellence with pragmatic delivery to meet business needs
- Collaborates effectively across multidisciplinary teams, especially with data engineers and AI SMEs
- Takes ownership of assigned projects from concept through implementation and maintenance
- Shows curiosity about AI advancements and applies relevant learnings to solve business problems
- Sets, achieves, and is motivated by goals
- Collaborates and works well with others

At Entain India, we do what’s right. It’s one of our core values, and that’s why we're taking the lead when it comes to creating a diverse, equitable and inclusive future - for our people, and the wider global sports betting and gaming sector. However you identify, across any protected characteristic, our ambition is to ensure our people across the globe feel valued, respected and their individuality celebrated. We comply with all applicable recruitment regulations and employment laws in the jurisdictions where we operate, ensuring ethical and compliant hiring practices globally. Should you need any adjustments or accommodations to the recruitment process, at either application or interview, please contact us.

Advertising Department: Data & Analytics
Responsibilities
The Data Engineer will develop and maintain data pipelines and systems to support data analysis and insights generation. This role involves building integrations with data sources and ensuring data quality for various business functions.