Professional Data Engineer at Freddie Mac
McLean, Virginia, USA
Full Time


Start Date

Immediate

Expiry Date

14 Nov, 25

Salary

$163,000

Posted On

14 Aug, 25

Experience

1 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Design Principles, Apache Spark, Computer Science, Snowflake, Integration

Industry

Information Technology/IT

Description

At Freddie Mac, our mission of Making Home Possible is what motivates us, and it’s at the core of everything we do. Since our charter in 1970, we have made home possible for more than 90 million families across the country. Join an organization where your work contributes to a greater purpose.
Position Overview:
We are seeking a highly skilled Professional Software Engineer to join our team and enhance our internal data platform. This role requires expertise in modern cloud-based data infrastructure to support data-driven decision-making and modeling across the organization. The ideal candidate will have a strong background in data engineering and software engineering, along with familiarity with AWS.

Our Impact:

  • We manage a critical internal data platform supporting key business operations, including prepayment model development, trading analytics, and securitization.
  • We collaborate with various teams to understand their data requirements and design systems that align with their business objectives.
  • We ensure our systems are robust, scalable, fault-tolerant, and cost-effective.

Your Impact:

  • Design, build, maintain, and support ETL/ELT data pipelines using AWS services (e.g., AWS EMR) and Snowflake.
  • Maintain data ingestion libraries written in Java and Python.
  • Design and develop new code, review existing code changes, and implement automated tests.
  • Actively seek opportunities to continuously improve the product’s technical quality and architecture, increasing its business value.
  • Improve the product’s test automation and deployment practices to enable the team to deliver features more efficiently.
  • Operate the data pipelines in production, including release management and production support.

Qualifications:

  • At least two years of experience developing production software
  • Strong Python skills with at least two years of experience writing production code
  • At least one year of experience in data engineering with either Apache Spark or Snowflake
  • Bachelor’s degree in computer science or equivalent experience
  • Experience writing automated unit, integration, regression, performance, and acceptance tests
  • Solid understanding of software design principles

Keys to Success in this Role:

  • Passionate about hands-on software development
  • A desire to work on all aspects of the software development lifecycle: requirements gathering, design, development, testing and operations
  • Strong collaboration and communication skills (both written and verbal)
  • Desire to continuously improve the team’s technical practices
  • Ability to quickly learn, apply and deploy new technologies to solve emerging problems