In an exciting move for the AI community, Apple unexpectedly released Ferret, an open source multimodal large language model (LLM), back in October 2023. The news flew under the radar at first, but Ferret’s introduction signals Apple’s commitment to impactful AI research and solidifies its position as a leader in multimodal AI. As conversations heat up about potential uses for local LLMs on devices like iPhones, Ferret provides a glimpse into the future of on-device AI.
While Apple is known primarily for its “walled garden” approach, its willingness to open source Ferret’s code and weights (for research purposes) represents a shift for the tech giant. As excitement builds around new innovations like Anthropic’s Constitutional AI and Google’s forthcoming Pixel Pro features, Apple’s foray into open source LLMs positions the company well to tap into burgeoning opportunities in this space.
The potential is clear: What if complex AI systems like Ferret could someday run efficiently right on your iPhone or iPad? As Apple announces breakthroughs in deploying LLMs locally and enabling more immersive visual experiences through AI, this future doesn’t seem far off.
Ferret’s release sets the stage for Apple to lead the way towards on-device intelligence. We have lots to look forward to!
- Apple open-sourced the multimodal LLM Ferret in October 2023
- Move represents shift towards open source AI for previously “walled garden” company
- Allows Apple to tap into growth of local LLMs and on-device intelligence
- Builds Apple’s leadership in multimodal AI, from 3D avatars to natural language
- Exciting glimpse into future of AI capabilities running locally on iPhones & iPads