However, this still falls significantly short of even four-year-old models like OpenAI's GPT-3, which featured over 175 billion parameters.

This project started out with research on the open-source implementation and scaling of globally distributed AI model training.

Intellect-1

Size isn’t everything though.

This reduces the risk of only a few large companies having access to this advanced technology.

For now, users can only contribute to the project through the company's own platform.

But in the future, you should be able to contribute to the model's training with your own hardware.

The training is carried out by separate clusters of devices, each processing data to train the same AI model.

The training framework can also handle nodes joining or leaving without crashing the system.
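To make the general idea concrete, here is a minimal, purely illustrative Python sketch of data-parallel training: several "nodes" each compute gradients on their own shard of data, the gradients are averaged across nodes, and every copy of the model applies the same update so all copies stay in sync. This is a toy example of the technique, not Prime Intellect's actual training framework.

```python
import numpy as np

# Toy illustration of data-parallel training across independent "nodes".
# Each node holds a full copy of the model weights and its own data shard;
# after every step the gradients are averaged (an all-reduce), so every
# copy of the model stays identical. Conceptual sketch only.

rng = np.random.default_rng(0)
true_w = np.array([2.0, -3.0])

def make_shard(n=256):
    X = rng.normal(size=(n, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=n)
    return X, y

def local_gradient(w, shard):
    X, y = shard
    err = X @ w - y
    return 2 * X.T @ err / len(y)            # gradient of mean squared error

num_nodes = 4
shards = [make_shard() for _ in range(num_nodes)]
w = np.zeros(2)                               # every node starts from the same weights
lr = 0.05

for step in range(100):
    grads = [local_gradient(w, s) for s in shards]   # computed independently per node
    avg_grad = np.mean(grads, axis=0)                # "all-reduce": average across nodes
    w -= lr * avg_grad                               # identical update applied everywhere

print("learned weights:", w)                  # converges to roughly [2, -3]
```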

A new node that joins mid-training has to catch up to the model's current state; delays in this catching-up process have been solved by having new nodes request the latest checkpoints from their peers.
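A hypothetical sketch of that catch-up step is below: a freshly joined node polls known peers for checkpoint metadata and resumes from the most recent one. The endpoint path, response fields, and peer addresses are invented for illustration and are not an actual Prime Intellect API.

```python
import json
import urllib.request

# Hypothetical sketch of the catch-up step: a newly joined node asks known
# peers for their latest checkpoint and resumes from the freshest one.
# Endpoint, fields, and addresses are illustrative only.

PEERS = ["http://10.0.0.2:8000", "http://10.0.0.3:8000"]  # example addresses

def fetch_checkpoint_meta(peer: str) -> dict | None:
    try:
        with urllib.request.urlopen(f"{peer}/latest-checkpoint", timeout=5) as resp:
            return json.load(resp)   # e.g. {"step": 1200, "url": ".../ckpt_1200.pt"}
    except (OSError, ValueError):
        return None                  # peer unreachable or bad reply; skip it

def newest_checkpoint() -> dict | None:
    metas = [m for m in map(fetch_checkpoint_meta, PEERS) if m]
    return max(metas, key=lambda m: m["step"]) if metas else None

ckpt = newest_checkpoint()
if ckpt:
    print(f"resuming from step {ckpt['step']} via {ckpt['url']}")
else:
    print("no peer checkpoint available; starting from scratch")
```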

It's mainly being trained on a Hugging Face dataset called FineWeb-Edu, which contains content from educational web pages.
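For readers who want to peek at the data themselves, the public FineWeb-Edu dataset can be streamed from Hugging Face with the datasets library, which avoids downloading the full corpus. This is just a way to inspect the dataset, not a description of how Intellect-1's training pipeline ingests it.

```python
from datasets import load_dataset  # pip install datasets

# Stream a few records from the public FineWeb-Edu dataset on Hugging Face.
# Streaming avoids downloading the multi-terabyte corpus up front.
ds = load_dataset("HuggingFaceFW/fineweb-edu", split="train", streaming=True)

for i, sample in enumerate(ds):
    # Each record includes the extracted page text (field name per the dataset card).
    print(sample["text"][:200])
    if i == 2:
        break
```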
