Thursday, April 25, 2024

Apple releases eight small AI language models aimed at on-device use


Illustration: a robot hand tossing an apple to a human hand. (Credit: Getty Images)

In the world of AI, what might be called "small language models" have been growing in popularity recently because they can be run on a local device instead of requiring data center-grade computers in the cloud. On Wednesday, Apple introduced a set of tiny source-available AI language models called OpenELM that are small enough to run directly on a smartphone. They're mostly proof-of-concept research models for now, but they could form the basis of future on-device AI offerings from Apple.

Apple's new AI models, collectively named OpenELM for "Open-source Efficient Language Models," are currently available on Hugging Face under an Apple Sample Code License. Because the license imposes some restrictions, OpenELM may not meet the commonly accepted definition of "open source," but its source code is available.

On Tuesday, we covered Microsoft's Phi-3 models, which aim to achieve something similar: a useful level of language understanding and processing performance in small AI models that can run locally. Phi-3-mini features 3.8 billion parameters, but some of Apple's OpenELM models are much smaller, ranging from 270 million to 3 billion parameters across eight distinct models.
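To give a sense of how one of these smaller checkpoints might be tried out locally, here is a minimal sketch using the Hugging Face transformers library. The repository name apple/OpenELM-270M, the trust_remote_code flag, and the reuse of a Llama 2 tokenizer are assumptions about how the models are published; check the actual model cards before relying on these details.

from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repository name for the smallest (270-million-parameter) OpenELM checkpoint.
model_id = "apple/OpenELM-270M"
# Assumption: OpenELM reuses an existing Llama 2 tokenizer rather than shipping its own.
tokenizer_id = "meta-llama/Llama-2-7b-hf"

tokenizer = AutoTokenizer.from_pretrained(tokenizer_id)
# trust_remote_code=True is assumed to be needed because the model ships custom modeling code.
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

prompt = "Small language models are useful because"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

Because the smallest checkpoint is only 270 million parameters, this kind of script can run on a laptop CPU, which is the on-device use case the article describes.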


Reference: https://ift.tt/v6rt1Up

