Apple is developing new techniques to run advanced artificial intelligence directly on iPhones and other devices, rather than relying solely on rivals' cloud servers. Research published this month details the company's progress in shrinking complex AI models so they run efficiently on battery-powered hardware with limited memory and processing power.
The paper outlines a novel approach to optimizing large language models (LLMs), such as the one behind ChatGPT, for low-power devices. Rather than sending each request over the internet to be processed in an immense data center, Apple wants future iPhone virtual assistants to generate text and converse natively on the device. This would eliminate the lag of waiting for cloud responses, strengthen privacy and reliability, and keep features working without connectivity.
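One generic way to fit a large model into limited device memory is to avoid loading every parameter into RAM up front and instead page weights in from storage on demand. The sketch below is a toy illustration of that general idea only, not Apple's actual method; all file names and dimensions are hypothetical.

```python
import os
import tempfile

import numpy as np

# Hypothetical on-disk parameter store and toy model size.
PARAMS_FILE = os.path.join(tempfile.gettempdir(), "toy_weights.bin")
LAYERS, HIDDEN = 4, 64

# Create a toy parameter file for the demo (LAYERS matrices of HIDDEN x HIDDEN).
rng = np.random.default_rng(0)
rng.standard_normal((LAYERS, HIDDEN, HIDDEN)).astype(np.float32).tofile(PARAMS_FILE)

# Memory-map the file: the OS pages weights in from storage as they are
# touched, so resident memory stays close to one layer's worth of parameters
# instead of the whole model.
weights = np.memmap(PARAMS_FILE, dtype=np.float32,
                    mode="r", shape=(LAYERS, HIDDEN, HIDDEN))

def forward(x: np.ndarray) -> np.ndarray:
    """Run a toy forward pass, touching one layer's weights at a time."""
    for layer in range(weights.shape[0]):
        w = np.asarray(weights[layer])  # only this slice is paged in
        x = np.maximum(x @ w, 0.0)      # linear layer + ReLU
    return x

out = forward(rng.standard_normal(HIDDEN).astype(np.float32))
print(out.shape)  # (64,)
```

The point of the sketch is the access pattern: compute proceeds layer by layer, so only the active layer's parameters need to occupy memory at any moment.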
Apple believes embedding intelligent features directly into devices represents the next major evolution in user experiences. LLMs might help smartphones automatically compose messages or schedule calendar events based on a user's habits, and more capable on-device AI could enable transformative new photo, video, and audio editing tools. The company has fallen behind rivals in cloud offerings but can leverage its strength in designing the AI accelerators inside its A-series chips.
Research head Dr. John Giannandrea summed up Apple's conviction: "Our work not only provides a solution to a current computational bottleneck but also sets a precedent for future research. We believe as LLMs continue to grow in size and complexity, approaches like this work will be essential…"
The paper signals how seriously Apple is taking the AI race as developers prepare apps built on powerful new machine learning models. Generative capabilities represent the next arms race in silicon: Qualcomm estimates over 100 million AI-focused smartphones will ship by 2024. The innovation also promises to reinvigorate a saturated mobile device market, provided Apple nails its execution.
- Apple published research on running advanced AI models directly on devices like iPhones
- Seeks to beat cloud-reliant competitors by embedding intelligence natively into hardware
- Would enable real-time conversing without lag, stronger privacy, and offline use
- Believes on-device AI is key to the next evolution in user experiences and apps
- Paper signals Apple getting serious about competing in the new AI arms race