Senior Data Engineer at Freddie Mac
Virginia, United States - Full Time


Start Date

Immediate

Expiry Date

24 Apr, 26

Salary

$196,000

Posted On

24 Jan, 26

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Data Engineering, Software Engineering, AWS, Python, Java, Apache Spark, Snowflake, ETL, ELT, Collaboration, Automated Testing, Production Support, Data Pipelines, Software Development Lifecycle, Technical Improvement, Fault-Tolerance

Industry

Financial Services

Description
At Freddie Mac, our mission of Making Home Possible is what motivates us, and it’s at the core of everything we do. Since our charter in 1970, we have made home possible for more than 90 million families across the country. Continue your career journey where your work contributes to a greater purpose.

Position Overview: We are seeking a highly skilled Senior Software Engineer to join our team and enhance our internal data platform. This role requires expertise in modern cloud-based data infrastructure to support data-driven decision-making and modeling across the organization. The ideal candidate will have a strong background in data engineering and software engineering, along with familiarity with AWS.

Our Impact: We manage a critical internal data platform supporting key business operations, including prepayment model development, trading analytics, and securitization. We collaborate with various teams to understand their data requirements and design systems that align with their business objectives. We ensure our systems are robust, scalable, fault-tolerant, and cost-effective.

Your Impact:
- Design, build, maintain, and support ETL/ELT data pipelines using AWS services (e.g., AWS EMR) and Snowflake
- Maintain data ingestion libraries written in Java and Python
- Collaborate with data producers, data scientists/modelers, and data consumers to understand their requirements and design innovative solutions to empower them
- Design and develop new code, review existing code changes, and implement automated tests
- Actively seek opportunities to continuously improve the technical quality and architecture to increase the product’s business value
- Improve the product’s test automation and deployment practices to enable the team to deliver features more efficiently
- Operate the data pipelines in production, including release management and production support
Qualifications:
- At least 5 years of experience developing production software
- Strong Python skills, with at least two years of experience writing production code
- At least two years of experience in data engineering, including Apache Spark
- At least one year of experience with Snowflake
- Exposure to AWS and a willingness to learn more
- Bachelor’s degree in computer science or equivalent experience
- Experience writing automated unit, integration, regression, performance, and acceptance tests
- Solid understanding of software design principles

Keys to Success in this Role:
- A passion for hands-on software development
- A desire to work on all aspects of the software development lifecycle: requirements gathering, design, development, testing, and operations
- Strong collaboration and communication skills (both written and verbal)
- A desire to continuously improve the team’s technical practices
- Ability to quickly learn, apply, and deploy new technologies to solve emerging problems

We consider all applicants for all positions without regard to gender, race, color, religion, national origin, age, marital status, veteran status, sexual orientation, gender identity/expression, physical and mental disability, pregnancy, ethnicity, genetic information, or any other protected categories under applicable federal, state, or local laws. We will ensure that individuals are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Freddie Mac offers a comprehensive total rewards package, including competitive compensation and market-leading benefit programs. Information on these benefit programs is available on our Careers site. This position has an annualized market-based salary range of $130,000 - $196,000 and is eligible to participate in the annual incentive program.
The final salary offered will generally fall within this range and is dependent on various factors including but not limited to the responsibilities of the position, experience, skill set, internal pay equity and other relevant qualifications of the applicant.
Responsibilities
Design, build, maintain, and support ETL/ELT data pipelines using AWS Services and Snowflake. Collaborate with various teams to understand their data requirements and design innovative solutions.