Senior AI/Python Engineer at Fuel Cycle
Los Angeles, California, USA
Full Time


Start Date

Immediate

Expiry Date

04 Dec 2025

Salary

$180,000

Posted On

04 Sep 2025

Experience

2 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

AWS, Analytics, Flask, Data Processing, Project Work, Cloud Computing, Ownership, Accountability, Deep Learning, Computer Science, Data Engineering, pandas, SQLAlchemy, Interactive Applications, Docker, Architecture, Communication Skills, Microservices, Collaboration

Industry

Information Technology/IT

Description

Location: Los Angeles Office
Employment Type: Full time
Location Type: Hybrid
Department: Engineering
Compensation: $150K – $180K. Your final base salary will be determined based on location, work experience, skills, knowledge, education and/or certifications.
About Fuel Cycle:
Fuel Cycle empowers leading organizations with agile research solutions that deliver decision-ready insights — fast, flexible, and fully integrated. As a market research disruptor, our AI-powered Insights Platform is built for speed, precision, and scale. With cutting-edge tools and seamless audience connectivity, we help brands ditch the guesswork and make smarter, customer-led decisions at lightning speed.

OVERVIEW:

We are seeking a Senior Python Engineer with deep expertise in AI-driven applications who is passionate about applying cutting-edge techniques such as Large Language Models (LLMs), Convolutional Neural Networks (CNNs), and advanced data processing pipelines to real-world information-processing problems.
The ideal candidate is equally comfortable building scalable backend systems and experimenting with the latest AI frameworks to turn research concepts into production-ready solutions. If you thrive at the intersection of engineering and applied AI, this role offers the opportunity to shape the future of decision intelligence technology.
This position follows a hybrid work model and is based out of our Los Angeles HQ, with an on-site presence required 3 days/week.

CORE SKILLS, COMPETENCIES & ATTRIBUTES:

  • Technical Excellence: Strong expertise in Python, FastAPI/Flask, Streamlit, pandas, NumPy, and vector stores.
  • AI/ML Frameworks: Strong background in AI/ML frameworks (PyTorch/TensorFlow, LangChain, Hugging Face, OpenAI APIs).
  • Deep Learning & AI Architectures: Experience building with LLMs, embeddings, RAG, CNNs, or related deep learning architectures.
  • Data Engineering & Analytics: Skilled in SQLAlchemy with PostgreSQL/MySQL and data wrangling with pandas/NumPy (a brief sketch follows this list).
  • Vector Database Expertise: Strong knowledge of vector databases such as Pinecone, FAISS, Weaviate, etc.
  • Applied AI Problem-Solving: Passion for solving real-world problems with AI.
  • Collaboration & Communication: Work effectively with cross-functional teams across different time zones.
  • Adaptability & Continuous Learning: Stay updated with the latest advancements in AI, cloud computing, and data engineering.
  • Ownership & Accountability: Take initiative and responsibility for the success of the projects.
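
To make the data-engineering expectations above more concrete, here is a minimal sketch of the kind of SQLAlchemy + pandas wrangling the role describes. The connection string, the "responses" table, and its column names are illustrative assumptions, not details from the posting.

```python
# Minimal sketch: load rows from PostgreSQL via SQLAlchemy and summarize with pandas.
# The database URL, "responses" table, and column names are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://user:password@localhost:5432/insights")

# Pull raw survey responses into a DataFrame.
df = pd.read_sql(
    "SELECT respondent_id, question_id, answer_text, created_at FROM responses",
    engine,
)

# Example transformation: answer counts per respondent, most recently active first.
summary = (
    df.groupby("respondent_id")
      .agg(answers=("question_id", "count"), last_seen=("created_at", "max"))
      .sort_values("last_seen", ascending=False)
)
print(summary.head())
```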
Responsibilities
  • AI-Driven Application Development: Design and build intelligent applications that leverage LLMs (via LangChain, vector stores, embeddings) and CNNs for text, image, or multimodal data processing (see the sketch after this list).
  • Information Processing & Data Engineering: Develop efficient pipelines for extracting, structuring, and analyzing large, unstructured datasets; optimize transformations with Pandas, NumPy, and modern vector DBs.
  • Scalable Backend Systems: Build robust APIs with FastAPI/Flask, integrate AI models into production, and create interactive prototypes with Streamlit.
  • Cloud & DevOps: Containerize and deploy AI-enabled applications on AWS with Docker, ensuring scalability, resilience, and security.
  • Cross-Functional Collaboration: Work with data scientists, AI researchers, and product leaders to translate business problems into AI-powered solutions.
  • Continuous Innovation: Explore and integrate emerging ML architectures and frameworks (transformers, CNNs, diffusion models, retrieval-augmented generation) to push the boundaries of what’s possible.
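
As a rough illustration of the AI-driven application work outlined above, the sketch below wires a FastAPI endpoint to a FAISS vector index, assuming sentence-transformers for embeddings; the documents, model name, and endpoint are placeholders rather than Fuel Cycle's actual stack.

```python
# Sketch of the retrieval half of a RAG-style service: embed documents, index them
# in FAISS, and expose nearest-neighbor search over FastAPI. All names are illustrative.
import faiss
from fastapi import FastAPI
from sentence_transformers import SentenceTransformer

DOCS = [
    "Panel recruitment best practices",
    "Survey logic for concept testing",
    "Closing the loop on customer feedback",
]

model = SentenceTransformer("all-MiniLM-L6-v2")        # embedding model (assumption)
vectors = model.encode(DOCS).astype("float32")         # shape: (n_docs, dim)

index = faiss.IndexFlatL2(vectors.shape[1])            # exact L2 similarity index
index.add(vectors)

app = FastAPI()

@app.get("/search")
def search(q: str, k: int = 2):
    """Embed the query and return the k nearest documents with their distances."""
    q_vec = model.encode([q]).astype("float32")
    distances, ids = index.search(q_vec, k)
    return [{"doc": DOCS[i], "distance": float(d)} for i, d in zip(ids[0], distances[0])]
```

In production, the in-memory document list and index would typically give way to a managed vector store such as Pinecone or Weaviate, as the posting mentions.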