A new small AI model from AI2 outperforms similarly sized models from Google and Meta


It has been a busy week for small AI models.

On Thursday, the nonprofit AI research institute AI2 released Olmo 2 1B, a 1-billion-parameter model that it says beats similarly sized models from Google, Meta, and Alibaba on several benchmarks. (Parameters, sometimes referred to as weights, are the internal values that shape a model's behavior.)

Olmo 2 1B is available under a permissive license on the AI dev platform Hugging Face. Unlike most models, Olmo 2 1B can be replicated from scratch: AI2 has released the code and the data sets (Olmo-Mix-1124 and Dolmino-Mix-1124) used to develop it.

Small models may not match the capabilities of their behemoth counterparts, but importantly, they don't require beefy hardware to run. That makes them far more accessible to developers and hobbyists contending with the limits of lower-end and consumer machines.

A number of small models have launched over the past few days, from Microsoft's Phi 4 reasoning family to Qwen's 2.5 Omni 3B. Most of these, Olmo 2 1B included, can easily run on a modern laptop or even a mobile device.
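For readers who want to try a model of this size locally, here is a minimal sketch using the Hugging Face transformers library. The model id `allenai/OLMo-2-0425-1B` is an assumption on our part; check AI2's Hugging Face page for the exact repository name.

```python
# Sketch: running a ~1B-parameter model on a laptop with Hugging Face transformers.
# The model id below is an assumption -- verify it against AI2's Hugging Face page.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allenai/OLMo-2-0425-1B"  # assumed repository name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Generate a short continuation from a prompt.
inputs = tokenizer("The capital of France is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The first call downloads the weights (a few gigabytes for a 1B-parameter model), after which everything runs offline on CPU or a modest GPU.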

AI2 says Olmo 2 1B was trained on a data set of 4 trillion tokens drawn from publicly available, AI-generated, and hand-created sources. Tokens are the raw bits of data that models ingest and generate; 1 million tokens is equivalent to roughly 750,000 words.
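Using the article's rule of thumb (1 million tokens ≈ 750,000 words), a quick back-of-the-envelope conversion looks like this; the 0.75 words-per-token ratio is just that rough approximation, not a tokenizer-specific figure:

```python
# Rough rule of thumb from the article: 1,000,000 tokens ~= 750,000 words.
WORDS_PER_TOKEN = 750_000 / 1_000_000  # 0.75

def tokens_to_words(tokens: int) -> int:
    """Estimate the English-word equivalent of a token count."""
    return int(tokens * WORDS_PER_TOKEN)

# The 4-trillion-token training set works out to roughly 3 trillion words.
print(tokens_to_words(4_000_000_000_000))  # → 3000000000000
```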

On GSM8K, a benchmark measuring arithmetic reasoning, Olmo 2 1B scores better than Google's Gemma 3 1B, Meta's Llama 3.2 1B, and Alibaba's Qwen 2.5 1.5B. It also eclipses the performance of those three models on IFEval, a test of instruction following.


AI2 warns that Olmo 2 1B carries risks. Like all AI models, it can produce "problematic outputs," including harmful and "sensitive" content. For these reasons, AI2 recommends against deploying Olmo 2 1B in commercial settings.





