Apprentice Software Engineer at Pearson Plc
Chennai, Tamil Nadu, India
Full Time


Start Date

Immediate

Expiry Date

28 Jun, 26

Salary

0.0

Posted On

30 Mar, 26

Experience

0 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Data Lakehouse, BigQuery, GCP, Azure, AWS, Monitoring, Data Pipelines, ETL/ELT, SQL, Data Validation, Documentation, Incident Handling, AI-Assisted Tools, Data Governance, Cloud Fundamentals, Batch Processing

Industry

Education

Description
Job Description - Apprentice Software Engineer (Global Data)

Role Title: Apprentice Software Engineer (Global Data)
Career Level / Tier: IC10 (Apprenticeship)
Location: Chennai, Tamil Nadu (On-site / Hybrid as per team norms)
Employment Type: Full-time Apprenticeship (Fixed Term)
Duration: 12 months (as per Apprenticeship Program guidelines)

About the Apprenticeship Program

The Apprenticeship Program is designed to build a future-ready technology talent pipeline by providing structured on-the-job learning, mentoring, and exposure to real enterprise data and technology environments. Apprentices gain hands-on experience while developing the foundational technical, operational, and professional skills required for long-term roles in Global Data and Technology Engineering.

About the Team

The Global Data (Data Lakehouse & Data Operations) team supports modern, cloud-native data platforms that enable analytics, reporting, and business insights across the organization. The team operates in a multi-cloud environment (GCP, Azure, AWS) with heavy use of BigQuery for analytics workloads. The team actively leverages AI-assisted tools such as Claude, ChatGPT, and Microsoft Copilot to enhance productivity, documentation, analysis, and operational efficiency.

Role Overview

As an Apprentice Software Engineer (Global Data), you will work closely with experienced engineers and data operations specialists to support day-to-day activities across data platforms. The role provides hands-on exposure to Data Lakehouse architecture, BigQuery-based analytics, multi-cloud platforms, monitoring systems, and operational best practices in an enterprise environment.

Key Responsibilities

* Support daily Global Data and Data Lakehouse operations, including monitoring data pipelines, scheduled jobs, and platform health.
* Assist in monitoring and supporting BigQuery workloads, including job execution status, data availability checks, and basic validations.
* Help track, triage, and support resolution of data pipeline or job failures under supervision.
* Monitor dashboards, alerts, and batch schedules across GCP, Azure, and AWS environments.
* Perform basic data validation checks such as row counts, freshness checks, and completeness verification.
* Assist in documenting operational procedures, runbooks, and knowledge articles.
* Use AI tools (Claude, ChatGPT, Microsoft Copilot) responsibly to assist with analysis, learning, and documentation.
* Participate in shift handovers, team meetings, learning sessions, and continuous improvement initiatives.
* Follow security, compliance, and data governance standards while handling enterprise data.

Learning & Experience You Will Gain

* Practical exposure to Global Data platforms and Data Lakehouse architecture.
* Hands-on experience with BigQuery and analytics-focused data operations.
* Understanding of multi-cloud data engineering and operations (GCP, Azure, AWS).
* Foundational knowledge of data ingestion, transformation (ETL/ELT), monitoring, and incident handling.
* Experience using AI-assisted engineering tools in a real enterprise setting.
* Development of professional skills such as communication, documentation, collaboration, and operational discipline.
* Structured mentoring and regular feedback throughout the apprenticeship period.

Eligibility & Mandatory Criteria

* Graduate or final-year student with 0-2 years of experience (as per Apprenticeship norms).
* Must meet eligibility requirements under the Apprenticeship Act, 1961.
* Must be eligible and willing to register on the NATS/NAPS portal as mandated.
* Legally eligible to work in India.
* Not previously engaged as a full-time employee drawing PF (as per apprenticeship guidelines).

Preferred Skills (Good to Have)

* Basic knowledge of SQL and data querying concepts (BigQuery exposure is a plus).
* Familiarity with cloud fundamentals (GCP, Azure, AWS).
* Introductory understanding of Data Lake / Data Lakehouse / Analytics platforms.
* Awareness of monitoring, batch processing, or incident management concepts.
* Comfort using AI tools for productivity and learning.
* Strong learning mindset, attention to detail, and willingness to work in a data operations or support environment (including shifts, if required).

Stipend

As per Apprenticeship Program guidelines.

Equal Opportunity Statement

We are committed to building an inclusive workplace and encourage applications from diverse backgrounds.
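The "basic data validation checks" named in the responsibilities (row counts, freshness checks, completeness verification) can be sketched in plain Python. This is a minimal illustration of the concepts only: the sample records, thresholds, and helper names are assumptions for demonstration, not the team's actual tooling, and in practice the inputs would come from a warehouse query (e.g. against BigQuery).

```python
from datetime import datetime, timedelta, timezone

# Illustrative records standing in for a query result (hypothetical schema).
rows = [
    {"id": 1, "email": "a@example.com", "loaded_at": datetime.now(timezone.utc)},
    {"id": 2, "email": None, "loaded_at": datetime.now(timezone.utc) - timedelta(hours=2)},
]

def check_row_count(rows, minimum):
    """Row-count check: did the batch deliver at least the expected rows?"""
    return len(rows) >= minimum

def check_freshness(rows, max_age):
    """Freshness check: is the newest record within the allowed lag?"""
    newest = max(r["loaded_at"] for r in rows)
    return datetime.now(timezone.utc) - newest <= max_age

def check_completeness(rows, column, max_null_fraction):
    """Completeness check: are missing values in a required column rare enough?"""
    nulls = sum(1 for r in rows if r[column] is None)
    return nulls / len(rows) <= max_null_fraction

results = {
    "row_count": check_row_count(rows, minimum=1),
    "freshness": check_freshness(rows, max_age=timedelta(hours=24)),
    "completeness": check_completeness(rows, "email", max_null_fraction=0.1),
}
print(results)  # completeness fails here: half the sample emails are missing
```

In a real pipeline, each failed check would typically raise an alert or mark the batch for triage rather than just print a result.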


Responsibilities
The apprentice will support daily Global Data and Data Lakehouse operations, including monitoring data pipelines, platform health, and BigQuery workloads under supervision. Key tasks involve tracking and assisting in the resolution of data failures, monitoring alerts across multi-cloud environments, and performing basic data validation checks.