Senior Data Engineer
at Top Remote Talent
Buenos Aires, Buenos Aires, Argentina
| Start Date | Expiry Date | Salary | Posted On | Experience | Skills | Telecommute | Sponsor Visa |
|---|---|---|---|---|---|---|---|
| Immediate | 29 Jan, 2025 | Not Specified | 30 Oct, 2024 | 5 year(s) or above | Interpersonal Skills, Python, Computer Science, Apache Kafka, SOAP, Data Services, REST | No | No |
Required Visa Status:
Citizen, GC (Green Card), US Citizen, Student Visa, H1B, CPT, OPT, H4 (Spouse of H1B)
Employment Type:
Full Time, Part Time, Permanent, Independent - 1099, Contract – W2, C2H Independent, C2H W2, Contract – Corp 2 Corp, Contract to Hire – Corp 2 Corp
Description:
Our client is a leader in the single-family rental (SFR) investment market, offering a comprehensive platform designed to make real estate investing more accessible, cost-effective, and straightforward. They combine a deep passion for helping investors build wealth through real estate with cutting-edge technology that redefines the investment process.
With a dynamic team of over 600 professionals, their collaborative and proactive culture drives their rapid growth. After closing a Series E funding round last year and raising $240 million, the company continues to expand its presence with offices in California, Texas, and New York, alongside numerous remote opportunities. Their growth strategy includes the recent acquisitions of Great Jones (a full-service property management company), Stessa (financial management software), Rent Prep (tenant screening and placement services), and Mynd (a property management platform for both retail and institutional investors).
About the team:
The Data Engineering team is the core of Data, on which everything else relies. The team is responsible for the development and management of the Enterprise Data Platform, which powers the company and all respective Data functions. The Enterprise Data Platform is crucial for integrating, managing, and providing data across the business. There are multiple sub-disciplines within Data Engineering, each contributing to the overall effectiveness and efficiency of data operations. It is a highly cohesive team consisting of four pods: Data Infrastructure, ML & GenAI Ops, Analytics Engineering, and Data Services.
They architect and build the core data infrastructure that supports the entire company: building data ingestions from internal and external applications, supporting infrastructure for ML & GenAI products and applications, merging various data feeds into easy-to-use, valuable data sets that support analytics, and designing and creating scalable, packaged data solutions in the form of various data services.
About the role:
We are looking for a talented Senior Data Engineer to join the Data Services pod in the established Data Engineering team.
As a Senior Data Engineer, you will be instrumental in architecting and constructing a new version of the data services platform, Data Services 2.0!
They operate a modern all-cloud data stack that includes AWS, Airflow, Docker, DBT, Python, Snowflake, Sigma, Java/Kotlin and our old friend SQL.
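As a rough, hypothetical illustration of how such a stack typically hangs together (assuming Airflow 2.4+ with DBT and Snowflake behind it; the DAG, task, and path names below are invented, not the client's actual pipeline), a daily refresh is usually expressed as a DAG that stages new data and then runs DBT models:

```python
# A minimal sketch, assuming Airflow 2.4+, DBT, and Snowflake; the DAG, task,
# and path names are hypothetical, not the client's real pipeline.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def ingest_feed() -> None:
    # Placeholder for an ingestion step, e.g. landing an external feed into a
    # Snowflake staging schema; the real loader would live here.
    print("ingesting feed into staging")


with DAG(
    dag_id="daily_data_services_refresh",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest_feed", python_callable=ingest_feed)

    # DBT transforms the staged data into the curated models that downstream
    # data services and dashboards read from.
    run_dbt = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --project-dir /opt/dbt --target prod",
    )

    ingest >> run_dbt
```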
What you will do:
- Improve and maintain the data services platform.
- Deliver high-quality data services promptly, ensuring data governance and integrity while meeting objectives and maintaining SLAs for data sharing across multiple products.
- Develop effective architectures and produce key code components that contribute to the design, implementation, and maintenance of technical solutions.
- Integrate a diverse network of third-party tools into a cohesive, scalable platform, optimizing code for enhanced scalability, performance, and readability.
- Continuously improve system performance and reliability by diagnosing and resolving unexpected operational issues to prevent recurrence.
- Ensure that your team’s work undergoes rigorous testing through repeatable, automated methods.
- Support data infrastructure and the rest of the data team, which designs, implements, and deploys scalable, fault-tolerant pipelines that ingest and refine large, diverse datasets (structured, semi-structured, and unstructured) into simplified, accessible data models in production (see the sketch after this list).
- Collaborate with cross-functional teams to understand data flows and design, build and test optimal solutions for engineering challenges.
- Operate within an Agile/Scrum framework, working closely with Product and Engineering teams to deliver value across multiple services and products.
- Influence and shape the enterprise data platform and services roadmap, architecture, and design standards. Collaborate with technology leaders and team members to design, adapt, and enhance the architecture to meet evolving business needs.
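As a loose sketch of the refinement and testing points above (the record shape, the Listing model, and the function names are invented for illustration and do not reflect the client's schema), this is the general pattern of flattening semi-structured records into a simplified model and covering it with a repeatable automated test:

```python
# A minimal sketch, assuming plain Python with pytest-style tests; the record
# shape and the Listing model are hypothetical, not the client's schema.
from dataclasses import dataclass
from typing import Any


@dataclass(frozen=True)
class Listing:
    listing_id: str
    city: str
    monthly_rent: float


def refine_listing(raw: dict[str, Any]) -> Listing:
    """Flatten one semi-structured listing record into the simplified model."""
    return Listing(
        listing_id=str(raw["id"]),
        city=raw.get("address", {}).get("city", "unknown"),
        monthly_rent=float(raw.get("rent", {}).get("monthly", 0.0)),
    )


def test_refine_listing_tolerates_missing_fields() -> None:
    # A repeatable, automated check that malformed input degrades gracefully
    # instead of breaking a downstream data service.
    refined = refine_listing({"id": 42})
    assert refined.city == "unknown"
    assert refined.monthly_rent == 0.0
```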
QUALIFICATIONS:
- BS or MS in a technical field: computer science, engineering or similar.
- 8+ years of technical experience working with data.
- 5+ years of strong experience building scalable data services and applications using SQL, Python, or Java/Kotlin, with the interest and aim to learn additional tools and technologies.
- Deep understanding of microservices architecture and API development, including gRPC, REST/SOAP, and GraphQL.
- Experience with AWS services, including messaging such as SQS and SNS, and familiarity with real-time data processing frameworks such as Apache Kafka or AWS Kinesis (see the sketch after this list).
- Significant experience building and deploying data-related infrastructure, robust data pipelines (beyond simple API pulls), and ETL/ELT code encompassing messaging, storage, compute, transformation, and execution.
- Experience in identifying and proposing initiatives aimed at enhancing the performance and efficiency of existing systems, setting the standard for SLAs & SLOs.
- Strong communication and interpersonal skills.
- Experience managing a team or experience working with an on-shore/off-shore model is a plus.
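By way of a hedged example of the messaging experience listed above (the queue URL, region, and payload are placeholders, not the client's infrastructure), this is the general shape of publishing to and draining an SQS queue with boto3:

```python
# A minimal sketch, assuming boto3 and an existing SQS queue; the queue URL
# and message payload are placeholders, not the client's infrastructure.
import json

import boto3

QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/example-events"

sqs = boto3.client("sqs", region_name="us-east-1")

# Publish one event, e.g. emitted after a pipeline finishes loading a dataset.
sqs.send_message(
    QueueUrl=QUEUE_URL,
    MessageBody=json.dumps({"event": "dataset_refreshed", "dataset": "listings"}),
)

# Drain a small batch of events; a real consumer would loop and handle retries.
response = sqs.receive_message(
    QueueUrl=QUEUE_URL,
    MaxNumberOfMessages=10,
    WaitTimeSeconds=5,  # long polling to reduce empty receives
)
for message in response.get("Messages", []):
    print(json.loads(message["Body"]))
    # Delete after successful processing so the message is not redelivered.
    sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=message["ReceiptHandle"])
```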
REQUIREMENT SUMMARY
Experience: Min 5.0, Max 8.0 year(s)
Industry: Information Technology/IT
Category: IT Software - DBA / Data Warehousing
Role: Software Engineering
Education: BSc in Computer Science or Engineering
Proficiency: Proficient
Openings: 1
Location: Buenos Aires, Buenos Aires, Argentina