Lead Technical Specialist
at HSBC
Sheffield, England, United Kingdom
Start Date | Expiry Date | Salary | Posted On | Experience | Skills | Telecommute | Sponsor Visa
---|---|---|---|---|---|---|---
Immediate | 06 Feb, 2025 | Not Specified | 06 Nov, 2024 | N/A | Confluence, Apache Kafka, GitHub, Software, Jenkins, Engineers, JIRA | No | No
Description:
This role maps to a Lead Technical Specialist profile, where knowledge of engineering and architecture combines to build strategies and roadmaps aligned with our customers. The individual needs to be confident in understanding the general trends of the technologies in scope and how they are used by our customers, and in anticipating how they need to evolve whilst meeting HSBC architecture standards. This specific role will involve building and deploying Kafka clusters in both on-premises and public cloud Kubernetes environments.
In this role you will:
- Work alongside product owners, technical leads and customers to strategize 3–5-year plans for streaming software.
- Produce and maintain product and platform architecture in line with the central architecture governance functions.
- Implement robust deployment patterns that meet HSBC service level tiers.
- Ensure that platform and product architectures remain cost effective.
- Stand up and administer on-premises clusters.
- Provide expertise in Kafka brokers, ZooKeeper, Kafka Connect, Schema Registry, KSQL, REST Proxy and Confluent Control Center.
- Provide expertise and hands-on experience with Kafka Connect and Schema Registry in very high-volume environments.
- Administer and operate the Kafka platform: provisioning, access control lists (ACLs), Kerberos and SSL configurations. Provide expertise and hands-on experience with Kafka connectors such as MQ, Elasticsearch, JDBC, FileStream and JMS source connectors, as well as tasks, workers, converters and transforms.
- Provide expertise and hands-on experience building custom connectors using Kafka core concepts and APIs (a connector skeleton sketch follows this list).
- Take part in design and capacity review meetings to provide guidance on Kafka usage.
- Participate in work planning and estimation to ensure optimum performance, high availability, and stability of solutions.
- Create topics, set up redundant clusters, deploy monitoring tools and alerts, and apply best practices. Create stubs for producers, consumers, and consumer groups to help onboard applications from different languages/platforms (a minimal Java sketch follows this list).
- Use automation tools such as Docker, Jenkins, and GitLab for provisioning.
- Set up security on Kafka; monitor, prevent and troubleshoot security-related issues.
- Perform data-related benchmarking, performance analysis and tuning.
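To illustrate the topic administration, producer stubs and security configuration mentioned above, here is a minimal sketch using the standard Apache Kafka Java clients. The broker address, truststore path, topic name, sizing values and security settings are placeholders for illustration only, not values taken from this role.

```java
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.NewTopic;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class TopicAndProducerStub {
    public static void main(String[] args) throws Exception {
        // Placeholder connection and security settings (SASL_SSL with Kerberos).
        // In practice a JAAS/Kerberos login configuration is also required.
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka-broker.example.internal:9093");
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "GSSAPI");
        props.put("sasl.kerberos.service.name", "kafka");
        props.put("ssl.truststore.location", "/etc/kafka/truststore.jks");
        props.put("ssl.truststore.password", "changeit");

        // Create a topic with an explicit partition count and a replication
        // factor of 3, assuming a multi-broker, redundant cluster.
        try (AdminClient admin = AdminClient.create(props)) {
            NewTopic topic = new NewTopic("example-events", 6, (short) 3);
            admin.createTopics(Collections.singleton(topic)).all().get();
        }

        // Minimal producer stub of the kind used to onboard new applications.
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("example-events", "key-1", "hello")).get();
        }
    }
}
```

A consumer stub would follow the same pattern with a KafkaConsumer, a `group.id` and matching deserializers.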
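For the custom connector work, the sketch below outlines a skeletal source connector on the Kafka Connect Java API. The class names and the "target.topic" setting are hypothetical examples; a real connector would read from an external system (MQ, JDBC, files, etc.) and track offsets properly.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Map;
import org.apache.kafka.common.config.ConfigDef;
import org.apache.kafka.connect.connector.Task;
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.source.SourceConnector;
import org.apache.kafka.connect.source.SourceRecord;
import org.apache.kafka.connect.source.SourceTask;

// Hypothetical example connector; not part of any shipped HSBC component.
public class ExampleSourceConnector extends SourceConnector {
    private Map<String, String> config;

    @Override public void start(Map<String, String> props) { this.config = props; }
    @Override public Class<? extends Task> taskClass() { return ExampleSourceTask.class; }
    @Override public List<Map<String, String>> taskConfigs(int maxTasks) {
        // Every task receives the same configuration in this sketch.
        List<Map<String, String>> configs = new ArrayList<>();
        for (int i = 0; i < maxTasks; i++) configs.add(config);
        return configs;
    }
    @Override public void stop() { }
    @Override public ConfigDef config() {
        return new ConfigDef().define("target.topic", ConfigDef.Type.STRING,
                ConfigDef.Importance.HIGH, "Topic the connector writes to");
    }
    @Override public String version() { return "0.1.0"; }

    public static class ExampleSourceTask extends SourceTask {
        private String topic;

        @Override public void start(Map<String, String> props) { topic = props.get("target.topic"); }
        @Override public List<SourceRecord> poll() throws InterruptedException {
            Thread.sleep(1000); // Placeholder for polling an external system.
            SourceRecord record = new SourceRecord(
                    Collections.singletonMap("source", "example"),  // source partition
                    Collections.singletonMap("offset", 0L),         // source offset
                    topic, Schema.STRING_SCHEMA, "sample payload");
            return Collections.singletonList(record);
        }
        @Override public void stop() { }
        @Override public String version() { return "0.1.0"; }
    }
}
```

Packaged as a plugin and registered through the Connect REST API, a connector like this runs under the same workers, converters and transforms described above.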
To be successful in this role you should meet the following requirements:
- Must be able to communicate at a technical level with engineers and stakeholders.
- Strong problem-solving and analytical skills.
- Experience operating in an infrastructure-as-code, automation-first environment.
- Understand the software development lifecycle process, specifically the standard engineering toolsets available to the team within CTO to design and develop software.
- Messaging technologies – Apache Kafka, Confluent Kafka
- DevOps toolsets – GitHub, JIRA, Confluence, Jenkins
- Automation – Ansible, Puppet
- Monitoring – observability tools such as Datadog, New Relic, Prometheus, Grafana, Instana, AppDynamics
This role is based in Sheffield.
Opening up a world of opportunity
Being open to different points of view is important for our business and the communities we serve. At HSBC, we’re dedicated to creating diverse and inclusive workplaces. Our recruitment processes are accessible to everyone - no matter their gender, ethnicity, disability, religion, sexual orientation, or age.
We take pride in being part of the Disability Confident Scheme. This helps make sure you can be interviewed fairly if you have a disability, long term health condition, or are neurodiverse.
If you’d like to apply for one of our roles and need adjustments made, please get in touch with our Recruitment Helpdesk:
Email: hsbc.recruitment@hsbc.com
Telephone: +44 207 832 850
REQUIREMENT SUMMARY
Min: N/A | Max: 5.0 year(s)
Information Technology/IT
IT Software - Other
Software Engineering
Graduate
Proficient
1
Sheffield, United Kingdom