Data Engineering Analyst at Tiverton Advisors
Raleigh, North Carolina, United States
Full Time


Start Date

Immediate

Expiry Date

14 Apr, 26

Salary

0.0

Posted On

14 Jan, 26

Experience

0 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Python, SQL, Data Engineering, ETL, Data Visualization, AI, Machine Learning, APIs, Web Scraping, Excel, Data Quality, Problem Solving, Communication, Data Analysis, Version Control, Collaboration

Industry

Investment Management

Description
Company Description

TIVERTON is an investment firm exclusively focused on the food and production agriculture sector. The firm oversees more than $2.2 billion of assets across debt and equity strategies in the US. The team combines deep agricultural operating experience with financial expertise to provide tailored, long-term capital solutions to the space. For more information, please visit www.tiverton.ag.

Job Description

POSITION SUMMARY

Tiverton is seeking a Data Engineering Analyst (Entry Level / Intern) to support our investment process and portfolio operations through data engineering, analytics, and AI-powered automation. This is an entry-level position ideal for recent graduates or current students seeking internship-to-hire opportunities. The role combines data infrastructure development with investment analytics, working across deal sourcing, due diligence, portfolio monitoring, and LP reporting.

The ideal candidate is curious, eager to learn, and excited to build solutions across the full data stack - from pipeline engineering to business intelligence - while applying AI/ML tools to solve real-world problems in agricultural private equity. The role offers broad exposure to both the investment side (deal flow, due diligence, and fund analytics) and the operations side (portfolio company data, reporting automation, and other analytics).

This role is onsite in our Raleigh, NC office. The successful candidate will be self-motivated and energized by working with a group of thoughtful, smart, and skilled colleagues. He or she will enjoy being part of a young, hungry, and collaborative organization focused on becoming the pre-eminent investment firm in US agriculture.
PRIMARY RESPONSIBILITIES

Data Infrastructure & Pipeline Engineering (40%)
- Build and maintain ETL pipelines pulling data from internal and external sources into our Snowflake data warehouse
- Develop Python and SQL automation scripts for recurring data processes
- Manage the Snowflake data warehouse - schema design, query optimization, and data modeling
- Build API integrations for third-party data sources (pricing data, B2B data providers, market intelligence)
- Implement data quality checks, validation rules, and monitoring to ensure pipeline reliability
- Create web scraping solutions for data collection from public sources
- Maintain code repositories with proper version control and documentation

Investment Analytics & Deal Support (30%)
- Support deal pipeline analytics and sourcing workflows in our CRM
- Build models and analytics for sector trends (crop prices, land values, farm credit metrics)
- Extract and analyze data from appraisal documents, financial statements, and industry reports
- Develop due diligence analytical frameworks and data rooms for new investments
- Create LP reporting dashboards and automated quarterly reporting processes
- Support the investment team with ad-hoc analytical requests and data visualization

AI/ML Implementation & Automation (20%)
- Leverage LLMs (OpenAI, Claude) to accelerate document analysis, data extraction, and research workflows
- Build AI-powered automation for deal screening, document processing, and data enrichment
- Implement intelligent solutions for pattern recognition, anomaly detection, and data quality
- Use prompt engineering and AI coding assistants to rapidly prototype analytical tools
- Develop RAG (Retrieval-Augmented Generation) systems for knowledge management

Portfolio Company Support & Reporting (10%)
- Support portfolio company reporting requirements and data requests
- Build dashboards and reporting tools for portfolio operations teams
- Troubleshoot data issues and provide technical support to portfolio companies
- Partner with the investment team to ensure clean, reliable data for portfolio monitoring

QUALIFICATIONS AND SKILLS REQUIRED

Technical Skills
- Proficiency in Python and SQL through coursework or projects; familiarity with pandas, APIs, or automation a plus
- Exposure to data pipelines, ETL concepts, or data engineering workflows through coursework or projects
- Familiarity with cloud platforms or data warehouses (Snowflake, BigQuery, AWS) - exposure through coursework or certifications counts
- Interest in data visualization; experience with any BI tool (Power BI, Tableau, Looker) or willingness to learn
- Solid Excel skills, including formulas, pivot tables, and basic data analysis
- Exposure to APIs, web scraping, or data collection methods (REST APIs, Beautiful Soup, or similar)
- Interest in AI/ML tools and LLMs; experience with ChatGPT, Claude, or similar for productivity is a plus
- Git version control and collaborative development workflows

Business & Analytical Skills
- Ability to translate business problems into technical solutions
- Strong problem-solving skills - can debug data issues independently
- Understanding of financial concepts and private equity metrics helpful but not required
- Strong communication skills - can explain technical concepts to non-technical stakeholders
- Self-directed, with the ability to prioritize and manage multiple projects
- Detail-oriented, with a focus on data quality and reliability

Experience & Background
- Current senior or recent graduate (within one year) pursuing or holding a Bachelor's degree in Computer Science, Data Science, Engineering, Finance, Economics, or a related quantitative field
- Demonstrated interest through coursework, personal projects, hackathons, or prior internships involving data pipelines, analytics, or automation

PREFERRED / NICE TO HAVE
- Experience building LLM-powered applications or automation tools
- Familiarity with CRM systems (Affinity, Salesforce) or investment workflow tools
- Experience with document processing and unstructured data extraction
- Knowledge of ML libraries (scikit-learn, numpy) and model deployment
- Exposure to private equity, venture capital, or investment banking
- Understanding of DevOps practices - testing, monitoring, CI/CD
- Knowledge of agricultural markets, farm credit systems, or commodity data

ADDITIONAL INFORMATION
- Please submit examples of technical projects (GitHub repos, class projects, hackathon submissions, or portfolio sites welcome)
- Open to both summer internships (10-12 weeks) and full-time entry-level positions with competitive compensation
- Interns: mentorship, real project ownership, and potential full-time conversion
- Full-time hires: benefits package including healthcare, dental, vision, group life insurance, 401(k), and generous PTO
- Location: Raleigh, North Carolina

All your information will be kept confidential according to EEO guidelines.
Responsibilities
The Data Engineering Analyst will build and maintain ETL pipelines, develop automation scripts, and support investment analytics and deal workflows. The role also involves leveraging AI/ML tools for automation and supporting reporting for portfolio companies.