Data Engineer - Systematic Commodities Hedge Fund at Moreton Capital Partners
Ciudad de México, Mexico
Full Time


Start Date

Immediate

Expiry Date

07 May, 26

Salary

0.0

Posted On

06 Feb, 26

Experience

2 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Python, SQL, Data Pipelines, ETL, Airflow, Cloud Computing, Data Warehousing, Data Quality, Monitoring Systems, Automation, Version Control, Distributed Processing, Financial Data, Time-Series Data, Documentation, Collaboration

Industry

Investment Management

Description
Moreton Capital Partners is a systematic commodities hedge fund preparing to launch live trading across global futures markets. Our research and trading systems rely on robust, scalable data infrastructure. We are looking for Data Engineers to help us design, build, and optimize that infrastructure alongside senior engineers and the CIO.

Key Responsibilities

You'll work on projects such as:

- Designing and maintaining data pipelines to collect, clean, and transform market and alternative datasets (e.g., futures, options, weather, satellite, fundamentals).
- Building ETL workflows using Python (pandas/polars) and orchestration tools such as Airflow or Prefect.
- Structuring data warehouses and APIs (SQL, Snowflake, or similar) for efficient querying and analysis.
- Developing data quality and monitoring systems for latency, completeness, and integrity.
- Assisting in cloud deployments (AWS, Docker) and automation for data ingestion and versioning.
- Collaborating with Quant Researchers to make research datasets reproducible and production-ready.
- Contributing to internal documentation and code standards to ensure long-term maintainability.

Qualifications

- Strong programming skills in Python and familiarity with SQL.
- Understanding of data structures, algorithms, and software engineering best practices.
- Interest in large-scale data systems, cloud computing, or distributed processing.
- Self-starter with curiosity and attention to detail.

Bonus points for:

- Experience with Airflow, Docker, or AWS.
- Familiarity with Snowflake, Polars, or Pandas workflows.
- Exposure to financial or time-series data.
- Understanding of CI/CD, version control, or testing frameworks.

What We Offer

- Real-world impact: Help build data systems that directly feed institutional-grade trading research and live execution.
- Technical depth: Gain hands-on experience with distributed data pipelines, cloud infrastructure, and production data engineering.
- Mentorship: Work closely with senior engineers, the CIO, and Quant Researchers on live projects.
- Collaborative culture: Inclusive, high-trust team that values initiative and learning.
- Compensation: Competitive stipend/salary based on experience.
Responsibilities
You will design and maintain data pipelines to collect and transform various datasets, and build ETL workflows using Python and orchestration tools. Additionally, you will collaborate with Quant Researchers to ensure datasets are production-ready.