Data Engineering Principal at BT
London, England, United Kingdom
Full Time


Start Date

Immediate

Expiry Date

19 Sep, 25

Salary

Not specified

Posted On

18 Aug, 25

Experience

0 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Coding Practices, Testing, Data Engineering, Distributed Systems, Data Acquisition, Data Architecture, Unstructured Data, Cloud Computing, Design, Technical Direction, Data Services, Cloud Services, Data Processing, Talent Management, Apache Spark, Athena, Python

Industry

Information Technology/IT

Description

DATA ENGINEERING PRINCIPAL

Job Req ID: 50036
Posting Date: 11 Aug 2025
Function: Data & AI
Unit: Networks
Location: 1 Braham Street, London, United Kingdom
Salary: Not specified
Recruiter: Daniel McCarthy
Career Grade: C
Internal Closing Date: 20/8/25

THE SKILLS YOU’LL NEED

DevOps
Data Storage
Data Engineering
Data Integration
Data Architecture
Programming/Scripting
Data Quality
Big Data Processing
Cloud Computing
Performance Monitoring
Agile Methodologies
Data Management
Data Acquisition
Data Risk
Data Model Management
Talent Management
Decision Making
Growth Mindset
Performance Management
Inclusive Leadership

SKILLS REQUIRED:

  • Possess deep technical expertise in data engineering, with a strong command of modern practices and methodologies.
  • Recognised as an expert in AWS cloud services, particularly in designing and implementing scalable data engineering solutions.
  • Bring extensive experience in software architecture and solution design, ensuring robust and future-proof systems.
  • Hold specialised proficiency in Python and Apache Spark, enabling efficient processing of large-scale data workloads.
  • Demonstrate the ability to set technical direction, uphold high standards for code quality, and optimise performance in data-intensive environments.
  • Adept at using automation tools and CI/CD pipelines to streamline development, testing, and deployment processes.
  • An exceptional communicator, capable of translating complex technical concepts for diverse audiences including engineers, product managers, and senior leadership.
  • Provide thought leadership within engineering teams, fostering a culture of quality, efficiency, and collaboration.
  • Experienced in mentoring engineers, guiding them in advanced coding practices, architectural thinking, and strategic problem-solving to elevate team capabilities.

EXPERIENCE YOU’D BE EXPECTED TO HAVE

  • Former Principal Engineer with a proven track record of leading teams in best practices across design, development, and implementation. Known for mentoring engineers and cultivating a culture of continuous learning and innovation.
  • Extensive background in software architecture and solution design, with deep expertise in microservices, distributed systems, and cloud-native architectures.
  • Advanced proficiency in Python and Apache Spark, with a strong focus on ETL data processing and scalable data engineering workflows.
  • In-depth technical knowledge of AWS data services, with hands-on experience implementing data pipelines using tools such as EMR, AWS Glue, AWS Lambda, Step Functions, API Gateway, and Athena.
  • Proven experience in designing and delivering Lakehouse architectures, enabling unified analytics across structured and unstructured data.
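To give a flavour of the ETL data-processing work described above, here is a minimal sketch of a row-level cleansing step. The record shape, field names, and quarantine policy are illustrative assumptions, not taken from any real BT schema; plain Python is used so the sketch is self-contained, though in practice this logic would typically live inside a Spark transformation (e.g. a DataFrame filter/withColumn chain).

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Iterable, Iterator


# Hypothetical record shape for a usage-event feed.
@dataclass
class UsageEvent:
    device_id: str
    bytes_sent: int
    ts: str  # normalised UTC ISO-8601 timestamp


def clean_events(raw: Iterable[dict]) -> Iterator[UsageEvent]:
    """Drop malformed rows and normalise timestamps to UTC.

    Skipping bad rows instead of failing the whole batch is one common
    data-quality policy; a real pipeline might instead route them to a
    quarantine table for inspection.
    """
    for row in raw:
        try:
            ts = datetime.fromisoformat(row["ts"]).astimezone(timezone.utc)
            bytes_sent = int(row["bytes_sent"])
        except (KeyError, ValueError, TypeError):
            continue  # malformed row: skip rather than fail the batch
        if bytes_sent < 0:
            continue  # negative byte counts are treated as invalid here
        yield UsageEvent(row.get("device_id", "unknown"), bytes_sent, ts.isoformat())
```

The same shape (validate, normalise, filter) scales up directly to Spark, where each step becomes a column expression evaluated across the cluster rather than a per-row Python loop.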

DON’T MEET EVERY SINGLE REQUIREMENT?

Studies have shown that women and people who are disabled, LGBTQ+, neurodiverse or from ethnic minority backgrounds are less likely to apply for jobs unless they meet every single qualification and criterion. We're committed to building a diverse, inclusive, and authentic workplace where everyone can be their best, so if you're excited about this role but your past experience doesn't align perfectly with every requirement on the Job Description, please apply anyway; you may just be the right candidate for this or other roles in our wider team.

How To Apply:

If you would like to apply to this job directly from the source, please click here.

Responsibilities
  • Lead the design and implementation of robust, scalable, and secure data solutions using AWS services such as S3, Glue, Lambda, Redshift, EMR, Kinesis, and more—covering data pipelines, warehousing, and lakehouse architectures.
  • Drive the migration of legacy data workflows to Lakehouse architectures, leveraging Apache Iceberg to enable unified analytics and scalable data management.
  • Operate as a subject matter expert across multiple data projects, providing strategic guidance on best practices in design, development, and implementation.
  • Build and optimise data pipelines for ingestion, transformation, and loading from diverse sources, ensuring high standards of data quality, reliability, and performance.
  • Own the development of automation and monitoring frameworks that capture operational KPIs and pipeline health metrics, enabling proactive performance management.
  • Identify and resolve performance bottlenecks in data workflows, ensuring optimal resource utilisation and cost-efficiency.
  • Collaborate closely with architects, Product Owners, and development teams to decompose solutions into Epics, leading the design and planning of technical components.
  • Mentor and coach engineering professionals, fostering a culture of continuous learning, innovation, and technical excellence.
  • Champion inclusive and open team culture, leading complex projects autonomously and facilitating high-impact technical discussions.
  • Define and manage service level agreements (SLAs) for data products and production processes, ensuring reliability and accountability.
  • Develop and optimise data science procedures, including storage strategies using distributed structures, databases, and other scalable technologies.
  • Lead the implementation of continuous improvement initiatives, enhancing team processes and delivery capabilities.
  • Serve as a trusted advisor to internal stakeholders, including data science and product teams, translating complex technical concepts into actionable solutions.
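The monitoring and SLA responsibilities above can be sketched with a small KPI tracker. The metric names and the drop-rate threshold below are illustrative assumptions, not an actual BT framework; in production these figures would typically be pushed to a monitoring service (e.g. CloudWatch) rather than held in memory.

```python
from dataclasses import dataclass


@dataclass
class PipelineHealth:
    """Accumulates per-batch pipeline-health KPIs (illustrative sketch)."""
    rows_in: int = 0
    rows_out: int = 0
    failures: int = 0

    def record_batch(self, rows_in: int, rows_out: int, failed: bool = False) -> None:
        """Record one batch run's input/output counts and failure status."""
        self.rows_in += rows_in
        self.rows_out += rows_out
        self.failures += int(failed)

    @property
    def drop_rate(self) -> float:
        """Fraction of input rows lost across all batches, a common data-quality KPI."""
        return 0.0 if self.rows_in == 0 else 1 - self.rows_out / self.rows_in

    def breaches_sla(self, max_drop_rate: float = 0.01) -> bool:
        """Flag an SLA breach on any hard failure or excessive row loss.

        The 1% default threshold is a placeholder; real SLAs would be
        agreed with the data product's consumers.
        """
        return self.failures > 0 or self.drop_rate > max_drop_rate
```

Capturing KPIs as explicit, testable values like this is what makes "proactive performance management" possible: an alert can fire on `breaches_sla()` before downstream consumers notice degraded data.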