
FX.co ★ Apple Releases Open-Source LLMs To Run On-Device


Apple has introduced a family of open-source large language models, named OpenELM (Open-source Efficient Language Models), to support the open research community and contribute to future research.

The OpenELM models, released on the AI code-sharing platform Hugging Face Hub, are designed to run on-device rather than on cloud servers. According to Bloomberg, Apple is aiming for fully on-device operation, with the large language model powered by the iPhone's internal processor instead of cloud infrastructure.

Apple has released eight OpenELM models: four that are pre-trained and four that are instruction-tuned. These come in different sizes, with the largest model containing 3 billion parameters and the others ranging from 270 million to 1.1 billion. The technology giant highlights that all models employ a layer-wise scaling strategy to efficiently allocate parameters within each layer of the transformer model, leading to improved efficiency and accuracy.
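The idea behind layer-wise scaling is that, rather than giving every transformer layer an identical shape, each layer's width (for example, attention head count and feed-forward dimension) is varied across depth so the parameter budget is spent unevenly. The sketch below illustrates this with a simple linear interpolation from the first layer to the last; all hyperparameter values are illustrative assumptions, not OpenELM's actual configuration.

```python
# Illustrative sketch of layer-wise scaling (values are assumptions,
# not OpenELM's real hyperparameters): each layer i gets its own head
# count and FFN width, interpolated linearly across the network depth.

def layerwise_config(num_layers, d_model, min_heads, max_heads,
                     min_ffn_mult, max_ffn_mult):
    configs = []
    for i in range(num_layers):
        # t goes from 0.0 at the first layer to 1.0 at the last
        t = i / max(num_layers - 1, 1)
        heads = round(min_heads + t * (max_heads - min_heads))
        ffn_dim = int(d_model * (min_ffn_mult + t * (max_ffn_mult - min_ffn_mult)))
        configs.append({"layer": i, "heads": heads, "ffn_dim": ffn_dim})
    return configs

cfg = layerwise_config(num_layers=4, d_model=512,
                       min_heads=4, max_heads=8,
                       min_ffn_mult=2.0, max_ffn_mult=4.0)
for c in cfg:
    print(c)
```

Compared with a uniform stack, this lets shallow layers stay narrow while deeper layers grow wider, which is one way to improve accuracy at a fixed parameter count.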

In a departure from previous practice, Apple's release includes not only the model weights and inference code but also the complete framework for training and evaluating the language models on publicly available datasets. This comprises training logs, multiple checkpoints, and pre-training configurations.

In addition, Apple is slated to unveil iOS 18, equipped with AI capabilities, at the highly anticipated Worldwide Developers Conference.
