
The world is watching to see what Apple will do to counter the dominance of Microsoft and Google in generative AI. Most assume the tech giant’s innovations will take the form of neural nets on the iPhone and other iOS devices. Small clues are popping up here and there.
Apple just introduced OpenELM, its own “embedded” large language model (LLM) designed to run on mobile devices, built largely by combining breakthroughs from several research institutions, including Google’s deep learning scholars and academics at Stanford and elsewhere.
All of the code for the OpenELM program is posted on GitHub, along with documentation of the training approach.
Apple’s work, detailed in a paper by Sachin Mehta and team, “OpenELM: An Efficient Language Model Family with Open-source Training and Inference Framework”, posted on the arXiv pre-print server, is focused on mobile devices: the neural net has just 1.3 billion neural weights, or parameters.
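To see why that parameter count matters for a phone, a back-of-envelope calculation helps: each weight must be stored in device memory, so the model's footprint is roughly parameters times bytes per parameter. The sketch below assumes nothing beyond the 1.3-billion figure above; the precisions shown are common industry choices, not formats Apple has stated it uses.

```python
# Rough on-device memory footprint for a 1.3B-parameter model.
# The byte widths below (float32 down to 4-bit quantization) are
# illustrative assumptions, not Apple's published storage formats.

def model_size_gb(num_params: float, bytes_per_param: float) -> float:
    """Raw weight storage in gigabytes (1 GB = 1e9 bytes)."""
    return num_params * bytes_per_param / 1e9

PARAMS = 1.3e9  # parameter count reported in the OpenELM paper

for label, nbytes in [("float32", 4), ("float16", 2), ("int8", 1), ("int4", 0.5)]:
    print(f"{label}: {model_size_gb(PARAMS, nbytes):.2f} GB")
# float32: 5.20 GB
# float16: 2.60 GB
# int8:    1.30 GB
# int4:    0.65 GB
```

Even at half precision, 1.3 billion weights fit in about 2.6 GB, which is why models of this scale are plausible candidates for phones, while tens-of-billions-parameter server models are not.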