Financial Data Engineer at Equifax
St. Louis, Missouri, USA
Full Time


Start Date

Immediate

Expiry Date

12 Nov, 25

Salary

0.0

Posted On

12 Aug, 25

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Scripting Languages, SQL, Optimization, Python, Programming Languages, Data Models

Industry

Information Technology/IT

Description

Full Time | 8/10/2025 | Job ID: J00170052
Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.
This is an exciting opportunity for a highly motivated and skilled Data Engineer to join the Finance Business Intelligence team at Equifax Workforce Solutions and play a pivotal role in shaping our data landscape. The Data Engineer is instrumental in driving data-driven decisions by designing, developing, and maintaining robust and scalable data pipelines and architectures. This dynamic role involves extracting, transforming, and loading (ETL) large and complex datasets from diverse systems, ensuring the utmost data quality, accuracy, and accessibility for critical reporting and insightful analytics. You’ll collaborate closely with financial analysts and business stakeholders, translating their data requirements into cutting-edge, efficient data solutions. Beyond building new infrastructure, you’ll also be responsible for optimizing existing data infrastructure for peak performance and unwavering reliability, directly supporting the creation of essential financial dashboards and strategic business intelligence. We’re looking for someone who thrives on solving complex data challenges and is passionate about empowering the finance organization with actionable insights.

WHAT EXPERIENCE YOU NEED:

  • BS degree in a STEM major or equivalent discipline; Master’s Degree strongly preferred
  • 5+ years of experience as a data engineer or in a related role
  • Cloud certification strongly preferred
  • Advanced skills using programming languages such as Python or SQL and intermediate level experience with scripting languages
  • Intermediate level understanding of and experience with Google Cloud Platform and overall cloud computing concepts, as well as basic knowledge of other cloud environments
  • Experience building and maintaining moderately complex data pipelines, troubleshooting issues, and transforming and loading data into a pipeline so that the content can be consumed and used in future projects (see the sketch after this list)
  • Experience designing and implementing moderately complex data models, and experience optimizing them to improve performance
  • Advanced Git usage and CI/CD integration skills
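
For context on the skill level the bullets above describe, the following is a minimal Python sketch of the kind of extract-transform-load task this role involves. It assumes a hypothetical CSV export and BigQuery table; the file path, column name, project, dataset, and table names are illustrative only and are not taken from this posting.

```python
# Minimal sketch only; the path, key column, and BigQuery table ID are hypothetical.
import pandas as pd
from google.cloud import bigquery


def load_invoices(source_csv: str, table_id: str) -> None:
    """Extract a CSV export, apply light cleaning, and load it into BigQuery."""
    # Extract: read the raw export into a DataFrame.
    df = pd.read_csv(source_csv)

    # Transform: normalize column names and drop rows missing the key column.
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df = df.dropna(subset=["invoice_id"])  # hypothetical key column

    # Load: append the cleaned rows to the target BigQuery table.
    client = bigquery.Client()  # relies on default application credentials
    client.load_table_from_dataframe(df, table_id).result()  # wait for the load job


if __name__ == "__main__":
    load_invoices("finance_exports/invoices.csv", "my-project.finance_bi.invoices")
```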

WHO IS EQUIFAX?

At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence.
We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best.
Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

How To Apply:

If you would like to apply to this job directly from the source, please click here.

Responsibilities
  • Apply knowledge of data characteristics and data supply patterns to develop rules and tracking processes that support the data quality model (see the sketch after this list).
  • Prepare data for analytical use by building data pipelines to gather data from multiple sources and systems.
  • Integrate, consolidate, cleanse and structure data for use by our clients in our solutions.
  • Perform design, creation, and interpretation of large and highly complex datasets.
  • Stay up-to-date with the latest trends and advancements in GCP and related technologies, actively proposing and evaluating new solutions.
  • Understand best practices for data management, maintenance, reporting and security and use that knowledge to implement improvements in our solutions.
  • Implement security best practices in pipelines and infrastructure.
  • Develop and implement data quality checks and troubleshoot data anomalies.
  • Provide guidance and mentorship to junior data engineers.
  • Review dataset implementations performed by junior data engineers.
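
As a concrete illustration of the data quality rules and checks mentioned above, here is a minimal Python sketch; the column names, rules, and sample data are hypothetical and chosen purely for illustration, not Equifax's actual checks.

```python
# Minimal sketch of rule-based data quality checks; columns and rules are hypothetical.
import pandas as pd


def run_quality_checks(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable failures for basic data quality rules."""
    failures = []

    # Rule 1: the key column must be unique and non-null.
    if df["record_id"].isna().any() or df["record_id"].duplicated().any():
        failures.append("record_id contains nulls or duplicates")

    # Rule 2: amounts should be non-negative.
    if (df["amount"] < 0).any():
        failures.append("amount contains negative values")

    # Rule 3: report dates must not be in the future.
    if (pd.to_datetime(df["report_date"]) > pd.Timestamp.today()).any():
        failures.append("report_date contains future dates")

    return failures


if __name__ == "__main__":
    sample = pd.DataFrame({
        "record_id": [1, 2, 2],
        "amount": [100.0, -5.0, 42.0],
        "report_date": ["2025-01-15", "2025-02-01", "2030-01-01"],
    })
    for issue in run_quality_checks(sample):
        print("FAILED:", issue)
```

In practice, checks like these would run as a step inside the pipeline and feed a tracking process (for example, logging failure counts per run) so data anomalies can be spotted and troubleshot early.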