Big Data DevOps Lead

at Wells Fargo

Charlotte, NC 28202, USA

Start Date: Immediate
Expiry Date: 14 Nov, 2024
Salary: Not Specified
Posted On: 16 Aug, 2024
Experience: 3 year(s) or above
Skills: MongoDB, Cloudera, Hive, Gradle, Artifactory, Jenkins, Kafka, Storage Architecture, Azure, SQL Server, Training, SonarQube, Hadoop, Flat Files, Git, Apache Spark
Telecommute: No
Sponsor Visa: No
Required Visa Status:

  • US Citizen
  • GC (Green Card)
  • Student Visa
  • H1B
  • CPT
  • OPT
  • H4 Spouse of H1B
Employment Type:

  • Full Time
  • Part Time
  • Permanent
  • Independent - 1099
  • Contract – W2
  • C2H Independent
  • C2H W2
  • Contract – Corp 2 Corp
  • Contract to Hire – Corp 2 Corp

Description:

At Wells Fargo, we are looking for talented people who will put our customers at the center of everything we do. We are seeking candidates who embrace diversity, equity and inclusion in a workplace where everyone feels valued and inspired.

APPLICANTS WITH DISABILITIES

To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo.

WELLS FARGO RECRUITMENT AND HIRING REQUIREMENTS:

a. Third-party recordings are prohibited unless authorized by Wells Fargo.
b. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.


Responsibilities:

Wells Fargo is seeking a Big Data DevOps Lead responsible for supporting our corporate risk data services platform within the Enterprise Functions Technology (EFT) group. The EFT group provides technology solutions and support for the Risk, Audit, Finance, Marketing, Human Resources, Corporate Properties, and Stakeholder Relations business lines. In addition, EFT provides unique technology solutions and innovation for Wells Fargo Technology, Enterprise Shared Services, and Enterprise Data Management. This combined portfolio of applications and tools is continually engineered to meet the challenges of stability, security, scalability, and speed.
Technology sets IT strategy; enhances the design, development, and operations of our systems; optimizes the Wells Fargo infrastructure; provides information security; and enables Wells Fargo's global customers to access banking 24 hours a day, 7 days a week, through in-branch, online, ATM, and other channels.

In this role, you will:

  • Lead complex technology initiatives including those that are companywide with broad impact
  • Lead DevOps for a Big Data platform such as Hadoop
  • Enable cloud strategy and migration strategies
  • Act as a key participant in developing standards and companywide best practices for engineering complex and large scale technology solutions for technology engineering disciplines
  • Design, code, test, debug, and document for projects and programs
  • Review and analyze complex, large-scale technology solutions for tactical and strategic business objectives, enterprise technological environment, and technical challenges that require in-depth evaluation of multiple factors, including intangibles or unprecedented technical factors
  • Make decisions in developing standard and companywide best practices for engineering and technology solutions requiring understanding of industry best practices and new technologies, influencing and leading technology team to meet deliverables and drive new initiatives
  • Collaborate and consult with key technical experts, senior technology team, and external industry groups to resolve complex technical issues and achieve goals
  • Lead projects, teams, or serve as a peer mentor

Required Qualifications:

  • 5+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education
  • 5+ years of experience with CI/CD tools such as Jenkins, maintaining healthy application health scores, and supporting Java applications using Gradle or Maven build packs
  • 5+ years of proficiency with Git and various branching strategies, code-scanning tools such as SonarQube, Checkmarx, and Black Duck, and dependency management tools such as Artifactory
  • 3+ years of experience setting up or migrating to a cloud platform (Azure or GCP) and enabling CI/CD

Desired Qualifications:

  • Cloud Certifications
  • Experience with Hadoop, Cloudera, and Hortonworks
  • Experience building a high-performing, scalable data pipeline platform using Hadoop, Apache Spark, MongoDB, and object storage architecture
  • Experience implementing CI/CD pipelines for sourcing from various sources (e.g., Oracle, SQL Server, Hive, Kafka, flat files, NDM endpoints) using a Spark-based framework

Job Expectations:

  • This position is not eligible for Visa sponsorship
  • This position offers a hybrid work schedule
  • Must be able to work on-site at any of the listed locations
  • Relocation assistance is not available for this position


REQUIREMENT SUMMARY

Min: 3.0, Max: 5.0 year(s)

Information Technology/IT

IT Software - Other

Software Engineering

Graduate

Proficient

1

Charlotte, NC 28202, USA