Multimodal LLMs Research Engineer at Apple
Sunnyvale, California, United States
Full Time


Start Date

Immediate

Expiry Date

16 Mar, 26

Salary

0.0

Posted On

16 Dec, 25

Experience

2 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Multimodal LLM Development, Computer Vision, Machine Perception, Python Programming, PyTorch, JAX, AI Code Development Tools, Model Training, Fine-Tuning, Data Definition, Research, Technical Leadership, Publications, Patents, Agentic AI, Reasoning

Industry

Computers and Electronics Manufacturing

Description
We are actively seeking exceptional individuals who thrive in collaborative environments and are driven to push the boundaries of what is currently achievable with multimodal inputs and large language models. Our centralized applied research and engineering group is dedicated to developing cutting-edge Computer Vision and Machine Perception technologies across Apple products. We balance advanced research with product delivery, ensuring Apple quality and pioneering experiences. A successful candidate will possess deep expertise and hands-on experience across the full lifecycle of Multimodal LLM development, encompassing early ideation, data definition, model training, and fine-tuning.

We are seeking a candidate with a proven track record — demonstrated through academic research, industry contributions, or a combination of both — in developing multimodal LLMs and advanced topics such as agentic AI, reasoning, and large-scale model evaluation. This role offers the opportunity to drive groundbreaking research projects, spanning foundational concepts to practical applications.

Minimum Qualifications

- Ph.D. with relevant research background, or a Master of Science and a minimum of 2 years of relevant industry experience
- Demonstrated track record through publications, patents, and/or shipping relevant features
- Strong Python programming experience
- Strong PyTorch and/or JAX programming experience
- Ability to effectively use AI code development tools to accelerate the development process

Preferred Qualifications

- Strong publication record in relevant venues, such as CVPR, ICCV, ECCV, NeurIPS, ICML, and ICLR
- Technical leadership experience guiding technical efforts across diverse teams and individuals
- Experience shipping MM-LLMs in products
Responsibilities
The candidate will drive groundbreaking research projects in multimodal LLMs, focusing on both foundational concepts and practical applications. They will work within a collaborative environment to push the boundaries of technology across Apple products.