Senior Software Engineer, Data at Old Well Labs
Charlotte, North Carolina, United States

Full Time


Start Date

Immediate

Expiry Date

02 Aug, 26

Posted On

04 May, 26

Experience

5+ years

Remote Job

Yes

Sponsor Visa

No

Skills

Python, system design, distributed systems, event-driven systems, data pipelines, dbt, Airflow, SQS, Step Functions, Lambda, Athena, S3, AI agents, architecture, data engineering

Industry

Software Development

Description
About OWL

Old Well Labs is a Charlotte-based financial intelligence startup used by leading allocators and the fund managers they invest with. Our platform turns billions of disclosures into structured data that makes it easy for those in the investment ecosystem to find, monitor, and connect with one another.

About the Role

We're hiring an engineer to define and build the systems that make our data usable across the company. This role owns the architecture for how data is processed, enriched, and operationalized. You'll build the agent systems that collect data across the web and extract structured information, the queues and pipelines that route that data through human-in-the-loop review, and the admin tooling that lets our data ops team work efficiently. You'll work heavily on the underlying data infrastructure (e.g., SQS, dbt, Step Functions, Lambda) to make it all fast and reliable.

We're big proponents of using AI to do our best work, and this role is the most direct expression of that. Every workflow you build removes manual work or makes us faster at it. If you have an idea for a new tool, agent, or pipeline, you'll be able to own it and push it forward.

What You'll Do

Build AI agents that collect and extract data: design and ship agent workflows that collect data from across the web, extract structured information, and feed it into our app.

Design data pipelines and event-driven systems: own how data flows from collection through processing to the platform. Work across SQS, dbt, Step Functions, Lambda, Athena, and S3 to evolve the architecture as volume and complexity grow.

Build admin tooling for data operations: ship the human-in-the-loop interfaces, dashboards, and internal tools our operations team uses every day.

Move quickly in a small, senior team: operate with high ownership in a small engineering team. Take problems from ambiguous requirements through design, implementation, and production, with the autonomy to make the calls that ship the work.

What We're Looking For

5+ years of backend, platform, or systems engineering experience
Strong Python and system design expertise
Experience with distributed and event-driven systems
Hands-on experience with data pipeline tools (dbt, Airflow, or comparable)
Experience defining architecture across multiple systems
Practical experience applying AI in production systems
Ability to operate with high ownership in ambiguous environments

Location

Charlotte (Hybrid: Tuesday–Thursday in office)

Compensation

Competitive salary + equity

Benefits

100% medical, dental & vision coverage
401(k)
Flexible PTO
Parental leave
Office in South End
Bagel Thursdays
Responsibilities
The engineer will design and build AI-driven data pipelines and agent systems to collect, extract, and operationalize structured information. They will also develop internal admin tooling and dashboards to support the data operations team's efficiency.