Data Engineer Intern - Systematic Commodities Hedge Fund at Moreton Capital Partners
Ciudad de México, Mexico
Full Time


Start Date

Immediate

Expiry Date

19 Jan, 26

Salary

Not specified (paid; see description)

Posted On

21 Oct, 25

Experience

0 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Python, SQL, Data Pipelines, ETL Workflows, Data Warehousing, APIs, Data Quality, Cloud Deployments, AWS, Docker, Collaboration, Documentation, Data Structures, Algorithms, Software Engineering, Curiosity

Industry

Investment Management

Description
Moreton Capital Partners is a systematic commodities hedge fund preparing to launch live trading across global futures markets. Our research and trading systems rely on robust, scalable data infrastructure. We are looking for Data Engineer Interns to help us design, build, and optimize that infrastructure alongside senior engineers and the CIO. This role is ideal for students from computer science, data engineering, or software engineering backgrounds who want to apply their technical skills to financial markets and large-scale data systems.

Key Responsibilities

You'll work on projects such as:

- Designing and maintaining data pipelines to collect, clean, and transform market and alternative datasets (e.g., futures, options, weather, satellite, fundamentals).
- Building ETL workflows using Python (pandas/polars) and orchestration tools such as Airflow or Prefect (see the illustrative sketch after this section).
- Structuring data warehouses and APIs (SQL, Snowflake, or similar) for efficient querying and analysis.
- Developing data quality and monitoring systems for latency, completeness, and integrity.
- Assisting in cloud deployments (AWS, Docker) and automation for data ingestion and versioning.
- Collaborating with Quant Researchers to make research datasets reproducible and production-ready.
- Contributing to internal documentation and code standards to ensure long-term maintainability.

Requirements

- Currently studying or recently graduated in Computer Science, Software Engineering, Data Science, or a related quantitative discipline.
- Strong programming skills in Python and familiarity with SQL.
- Understanding of data structures, algorithms, and software engineering best practices.
- Interest in large-scale data systems, cloud computing, or distributed processing.
- Self-starter with curiosity and attention to detail.

Bonus points for:

- Experience with Airflow, Docker, or AWS.
- Familiarity with Snowflake, Polars, or Pandas workflows.
- Exposure to financial or time-series data.
- Understanding of CI/CD, version control, or testing frameworks.

Why Join Us

- Real-world impact: Help build data systems that directly feed institutional-grade trading research and live execution.
- Technical depth: Gain hands-on experience with distributed data pipelines, cloud infrastructure, and production data engineering.
- Mentorship: Work closely with senior engineers, the CIO, and Quant Researchers on live projects.
- Career growth: Top performers may progress to full-time data engineering or quant dev roles as the fund scales.
- Collaborative culture: Inclusive, high-trust team that values initiative and learning.
- Timing: Flexible start dates; part-time during term or full-time during breaks; multiple cohorts year-round.
- Compensation: Competitive paid internship (stipend/salary based on location & hours).
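To give a concrete flavor of the pipeline and data-quality work described above, here is a minimal sketch in Python using pandas. The file name, column names, and checks are illustrative assumptions, not Moreton Capital's actual schema or tooling.

```python
# Hypothetical sketch: a tiny extract-clean-validate step for a daily
# futures settlement file. Columns and thresholds are assumptions.
import pandas as pd

REQUIRED_COLUMNS = {"symbol", "trade_date", "settle_price", "volume"}

def load_settlements(path: str) -> pd.DataFrame:
    """Read a raw CSV of futures settlements and do basic cleaning."""
    df = pd.read_csv(path)
    missing = REQUIRED_COLUMNS - set(df.columns)
    if missing:
        raise ValueError(f"input is missing columns: {sorted(missing)}")
    df["trade_date"] = pd.to_datetime(df["trade_date"])
    # Drop exact duplicates and rows without a settlement price.
    return df.drop_duplicates().dropna(subset=["settle_price"])

def quality_report(df: pd.DataFrame) -> dict:
    """Completeness/integrity checks a monitoring job might log."""
    return {
        "rows": len(df),
        "symbols": df["symbol"].nunique(),
        "null_volume_pct": float(df["volume"].isna().mean()),
        "non_positive_prices": int((df["settle_price"] <= 0).sum()),
    }

if __name__ == "__main__":
    frame = load_settlements("settlements_2025-10-20.csv")  # hypothetical file
    print(quality_report(frame))
```

In practice a step like this would be one task inside an orchestrated workflow, and the report would feed a monitoring system rather than stdout.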
Responsibilities
You will design and maintain data pipelines to collect, clean, and transform market and alternative datasets. Additionally, you will assist in building ETL workflows and developing data-quality monitoring systems.
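As a companion to the cleaning sketch above, here is a minimal orchestration example, assuming Prefect 2.x (the posting names Airflow as an equally valid alternative). The task names, retry setting, and placeholder URL are hypothetical.

```python
# Hypothetical sketch of an orchestrated daily pipeline using Prefect 2.x.
# Task bodies are placeholders standing in for real extract/transform/load logic.
from prefect import flow, task

@task(retries=2)
def extract(source_url: str) -> list[dict]:
    # A real pipeline would hit a vendor API or object store here.
    return [{"symbol": "CL", "settle_price": 70.1}]

@task
def transform(rows: list[dict]) -> list[dict]:
    # Keep only rows with a positive price (stand-in for real cleaning).
    return [r for r in rows if r["settle_price"] > 0]

@task
def load(rows: list[dict]) -> None:
    # Stand-in for a warehouse write (e.g., Snowflake or SQL).
    print(f"loaded {len(rows)} rows")

@flow
def daily_settlements(source_url: str = "https://example.com/settlements"):
    load(transform(extract(source_url)))

if __name__ == "__main__":
    daily_settlements()
```

Splitting extract, transform, and load into separate tasks lets the orchestrator retry, schedule, and monitor each stage independently.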