Data Engineer at Electronic Arts
Southam, United Kingdom
Full Time


Start Date

Immediate

Expiry Date

21 Nov, 25

Salary

0.0

Posted On

21 Aug, 25

Experience

4 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Analytics, Programming Languages, C++, Statistics, Azure, Data Governance, Data Strategies, Monte Carlo, Python, Informatics, Dashboards, Data Quality, Code, Jira, Aws, Version Control, Trello, Tableau, Information Systems, Power Bi, Automated Processes, Airflow

Industry

Information Technology/IT

Description

GENERAL INFORMATION

Locations: Southam, Warwickshire, United Kingdom
Role ID
210096
Worker Type
Regular Employee
Studio/Department
CT - Data & Insights
Work Model
Hybrid

DESCRIPTION & REQUIREMENTS

Electronic Arts creates next-level entertainment experiences that inspire players and fans around the world. Here, everyone is part of the story. Part of a community that connects across the globe. A place where creativity thrives, new perspectives are invited, and ideas matter. A team where everyone makes play happen.
We are looking for an experienced Data Engineer with broad technical skills and the ability to work with large amounts of data. You will collaborate with the Game and Product teams to implement data strategies and develop complex ETL pipelines that support dashboards for promoting a deeper understanding of our games.
You will have experience developing and establishing scalable, efficient, automated processes for large-scale data analyses. You will also stay informed of the latest trends and research on all aspects of data engineering and analytics. You will work with leaders from an internal Game Studio, providing them with data for understanding game and player insights, and you will report to the Technical Lead for this group. This is a hybrid role based in our Southam office.

REQUIRED QUALIFICATIONS:

  • 4+ years of relevant industry experience in a data engineering role and a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field
  • Proficiency in writing SQL queries and knowledge of cloud-based databases such as Snowflake, Redshift, BigQuery, or other big data solutions
  • Experience with data modelling and tools such as dbt, ETL processes, and data warehousing
  • Experience with at least one programming language such as Python, C++, or Java
  • Experience with version control and code review tools such as Git
  • Knowledge of modern data pipeline orchestration tools such as Airflow
  • Experience with cloud platforms (AWS, GCP, or Azure) and infrastructure-as-code tools (e.g., Docker, Terraform, CloudFormation)
  • Familiarity with data quality, data governance, and observability tools (e.g., Great Expectations, Monte Carlo)
  • Experience with BI and data visualization tools (e.g., Looker, Tableau, Power BI)
  • Experience working in an Agile development environment and familiarity with process management tools such as JIRA, Targetprocess, Trello, or similar

How To Apply:

If you would like to apply to this job directly from the source, please click here

Responsibilities
  • As a Data Engineer, you will be involved in the entire development life cycle, from brainstorming ideas to implementing elegant solutions that deliver data insights.
  • Gather requirements, then model and design solutions to support product analytics, business analytics, and advanced data science.
  • Design efficient and scalable data pipelines using cloud-native and open-source technologies.
  • Develop and improve ETL/ELT processes to ingest data from diverse sources.
  • Work with analysts to understand requirements and develop technical specifications for ETLs, including documentation.
  • Support production code to produce comprehensive and accurate datasets.
  • Automate deployment and monitoring of data workflows using CI/CD best practices.
  • Promote strategies to improve our data modelling, quality, and architecture.
  • Participate in code reviews, mentor junior engineers, and contribute to team knowledge sharing.
  • Document data processes, architecture, and workflows for transparency and maintainability.
  • Work with big data solutions, data modelling, ETL pipelines, and dashboard tools.
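To give candidates a feel for the ETL/ELT and data-quality work described above, here is a minimal illustrative sketch, not EA's actual code: all names and the sample data are hypothetical, and a production pipeline would use warehouse and orchestration tools such as those listed in the qualifications.

```python
# Hypothetical miniature ETL flow: extract raw CSV, apply a basic
# data-quality check during transform, and load into an in-memory "warehouse".
import csv
import io

RAW = "player_id,minutes_played\n1,30\n2,not_a_number\n3,45\n"

def extract(raw: str) -> list[dict]:
    """Read raw CSV text into row dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Cast types and drop rows that fail a simple quality check."""
    clean = []
    for row in rows:
        try:
            clean.append({"player_id": int(row["player_id"]),
                          "minutes_played": int(row["minutes_played"])})
        except ValueError:
            continue  # a real pipeline would quarantine bad records instead
    return clean

def load(rows: list[dict]) -> dict[int, int]:
    """Store clean rows keyed by player_id."""
    return {r["player_id"]: r["minutes_played"] for r in rows}

warehouse = load(transform(extract(RAW)))
print(warehouse)  # {1: 30, 3: 45}
```

In practice each stage would be a task in an orchestrator like Airflow, with the quality check handled by a tool such as Great Expectations.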