Senior Software Engineer - Big Data at Wells Fargo
Irving, Texas, USA
Full Time


Start Date

Immediate

Expiry Date

29 Jul, 25

Salary

0.0

Posted On

30 Apr, 25

Experience

3 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Communication Skills, Github, Spark, Bitbucket, Data Modeling, Azure, Hive, Computer Science, Analytical Skills, Hadoop, Kafka, Training, Aws, Processing

Industry

Information Technology/IT

Description

APPLICANTS WITH DISABILITIES

To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo.

WELLS FARGO RECRUITMENT AND HIRING REQUIREMENTS:

a. Third-party recordings are prohibited unless authorized by Wells Fargo.
b. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.

Required Qualifications, US:

  • 4+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education
  • 4+ years of hands-on experience in Hadoop
  • 4+ years of experience with Hive and Spark
  • 4+ years of experience with GitHub and/or Bitbucket
  • 4+ years of experience designing & developing ETL processes & data pipelines
  • 3+ years of experience with Python & Unix/Shell scripting

Desired Qualifications:

  • A Bachelor’s degree and/or Master’s degree in Computer Science or related engineering field
  • Experience with Kafka for real-time data ingestion and processing.
  • Strong understanding of data modeling and data warehousing concepts
  • Excellent problem-solving and analytical skills
  • Experience with AWS, GCP, Azure or similar cloud platform
  • 2+ years of API design & development experience
  • Strong verbal, written, and interpersonal communication skills
  • Agile experience

Responsibilities

The Enterprise Functions Technology (EFT) group provides technology solutions and support for the Risk, Audit, Finance, Marketing, Human Resources, Corporate Properties, and Stakeholder Relations business lines. In addition, EFT provides unique technology solutions and innovation for Wells Fargo Technology, Enterprise Shared Services, and Enterprise Data Management. This combined portfolio of applications and tools is continually engineered to meet the challenges of stability, security, scalability, and speed.
Within EFT, Risk Data Services is a horizontal function within the Risk Technology organization and is responsible for delivering data consistently across Risk. The Risk Data Services team is seeking a Senior Software Engineer to design, develop, and implement solutions utilizing big data technology.

In this role, you will:

  • Design, develop, and implement data processing solutions using HDFS, Hive, and Spark
  • Ensure solutions are highly usable, scalable, and maintainable
  • Understand the impacts of data layer performance factors and collaborate with the Data Architect to implement mitigating physical modeling solutions
  • Perform code and design reviews to ensure performance, maintainability, and standards
  • Work with the Data Architect to define data security roles, groups and policies
  • Enforce Hadoop design standards, reusable objects, tools, best practices, and related development methodologies for the organization
  • Partner with business stakeholders and developers to ensure deliverables meet business expectations
  • Tune dataset design for optimal performance
  • Develop, execute, and troubleshoot complex report SQL
  • Lead moderately complex initiatives and deliverables within technical domain environments
  • Collaborate and consult with peers and colleagues to resolve technical challenges and achieve goals

Job Expectations:

  • This position is not available for visa sponsorship