Principal Data Engineer
at HSBC
Sheffield, England, United Kingdom
Start Date | Expiry Date | Salary | Posted On | Experience | Skills | Telecommute | Sponsor Visa
---|---|---|---|---|---|---|---
Immediate | 30 Jan, 2025 | Not Specified | 30 Oct, 2024 | N/A | Storage, SQL, Programming Languages, Kubernetes, Bash, Data Science, Infrastructure Technologies, Computer Science, PowerShell, Web, DevOps, Learning, Technology Services, Azure, Cloud, Azure Active Directory, Code, Linux | No | No
Required Visa Status:
Citizen | GC | US Citizen | Student Visa | H1B | CPT | OPT | H4 Spouse of H1B | GC Green Card
Employment Type:
Full Time | Part Time | Permanent | Independent - 1099 | Contract – W2 | C2H Independent | C2H W2 | Contract – Corp 2 Corp | Contract to Hire – Corp 2 Corp
Description:
Principal Data Engineer
Join a digital-first bank that’s powered by people.
Our technology team builds innovative digital solutions rapidly and at scale to deliver the next generation of banking services for our customers around the world.
In our cybersecurity team you’ll be helping to safeguard the financial system on which millions of people depend.
You’ll be making banking more secure by designing, implementing, and operating controls to manage cybersecurity risk. You’ll help define HSBC Group cybersecurity standards, deliver Global Security Operations and Threat Management services, provide round-the-clock monitoring and security incident response services, and oversee Network/Application/Infrastructure Security. The work you do will provide assurance of the adequacy and effectiveness of security controls to Business Risk Owners.
The position is a senior technical, hands-on delivery role, requiring knowledge of data engineering, cloud infrastructure and platform engineering, platform operations and production support.
The Principal Cybersecurity Analytics Data Engineer role is a key technical role within the Platform & Data Engineering Team, contributing to, coordinating, and leading data engineering, data acquisition, cloud infrastructure and platform engineering, platform operations, and production support activities using ground-breaking cloud and big data technologies. Cybersecurity-specific knowledge is preferred for the role, but exceptional candidates from other technical disciplines are also encouraged to apply.
The ideal candidate will possess strong technical skills, an eagerness to learn, a keen interest in Cybersecurity, the ability to work collaboratively in a fast-paced environment, and an aptitude for picking up new tools and techniques on the job, building on existing skillsets as a foundation.
As an HSBC employee in the UK, you will have access to tailored professional development opportunities and a competitive pay and benefits package. This includes private healthcare for all UK-based employees, enhanced maternity and adoption pay and support when you return to work, and a contributory pension scheme with a generous employer contribution.
In this role you will:
- Ingesting and provisioning raw datasets, enriched tables, and/or curated, re-usable data assets to enable Cybersecurity use cases.
- Driving improvements in the reliability and frequency of data ingestion, including increasing real-time coverage.
- Supporting and enhancing data ingestion infrastructure and pipelines.
- Designing and implementing data pipelines that collect data from disparate sources across the enterprise and from external sources, transport that data, and deliver it to our data platform.
- Building Extract, Transform, and Load (ETL) workflows, using both advanced data manipulation tools and programmatic data manipulation throughout our data flows, ensuring data is available at each stage of the data flow, and in the form needed by each system, service, and customer along that flow.
- Identifying and onboarding data sources using existing schemas and, where required, conducting exploratory data analysis to investigate and determine new schemas.
To be successful in this role you should meet the following requirements:
- Experience with SRE and Azure DevOps
- Ability to script (Bash/PowerShell, Azure CLI), code (Python, C#, Java), and query (SQL, Kusto Query Language), along with experience using software version control systems (e.g., GitHub) and CI/CD systems.
- Programming experience in the following: PowerShell, Terraform, Python, the Windows command prompt, and object-oriented programming languages.
- Technical knowledge and breadth of Azure technology services (Identity, Networking, Compute, Storage, Web, Containers, Databases)
- Cloud & Big Data technologies such as Azure Cloud, Azure IAM, Azure Active Directory (Azure AD), Azure Data Factory, Azure Databricks, Azure Functions, Azure Kubernetes Service, Azure Logic Apps, Azure Monitor, Azure Log Analytics, Azure Compute, Azure Storage, Azure Data Lake Store, S3, Synapse Analytics, and/or Power BI
- Experience with server, operating system, and infrastructure technologies such as Nginx/Apache, CosmosDB, Linux, Bash, PowerShell, Prometheus, Grafana, and Elasticsearch
- Positive attitude, strong work ethic and passion for learning.
- Bachelor’s degree in Computer Science, Software Engineering, Data Science, or a related field is advantageous
This role is based in Sheffield or Edinburgh.
Opening up a world of opportunity
Being open to different points of view is important for our business and the communities we serve. At HSBC, we’re dedicated to creating diverse and inclusive workplaces. Our recruitment processes are accessible to everyone - no matter their gender, ethnicity, disability, religion, sexual orientation, or age.
We take pride in being part of the Disability Confident Scheme. This helps make sure you can be interviewed fairly if you have a disability, long term health condition, or are neurodiverse.
If you’d like to apply for one of our roles and need adjustments made, please get in touch with our Recruitment Helpdesk:
Email: hsbc.recruitment@hsbc.com
Telephone: +44 207 832 850
REQUIREMENT SUMMARY
Min: N/A | Max: 5.0 year(s)
Information Technology/IT
IT Software - Other
Software Engineering
Graduate
Computer Science, Software Engineering, Data Science, or a related field (advantageous)
Proficient
1
Sheffield, United Kingdom