Musk open-sources Grok-1: the largest model so far with 314 billion parameters, and calls out OpenAI

Early this morning, Musk's large AI model Grok was announced as officially open source. Replying to a comment saying that OpenAI should do the same (i.e., go open source), Musk wrote: "OpenAI is a lie."

On March 17, local time, Musk's AI startup xAI officially released the Grok-1 source code. With 314 billion parameters, Grok-1 is the open-source large language model with the largest parameter count to date.

It is a 314-billion-parameter Mixture-of-Experts (MoE) model. The project was released 5 hours ago and has already gathered 6.3k stars on GitHub.
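In a Mixture-of-Experts model, a small gating network routes each token to only a few of many expert feed-forward networks, so only a fraction of the total parameters is active per token. The sketch below is a minimal, generic illustration of that routing idea in PyTorch; the class name, layer sizes, expert count, and top-k value are arbitrary placeholders and do not reflect xAI's actual Grok-1 implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleMoE(nn.Module):
    """Toy Mixture-of-Experts layer: a gating network scores all experts,
    each token is sent to its top-k experts, and their outputs are mixed
    by the normalized gate weights. Purely illustrative -- not Grok-1."""

    def __init__(self, d_model=64, d_hidden=256, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(d_model, num_experts)          # router
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, d_hidden),
                nn.GELU(),
                nn.Linear(d_hidden, d_model),
            )
            for _ in range(num_experts)
        )

    def forward(self, x):                                    # x: (tokens, d_model)
        scores = self.gate(x)                                # (tokens, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)       # pick top-k experts per token
        weights = F.softmax(weights, dim=-1)                 # normalize over chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e                     # tokens whose slot-th choice is expert e
                if mask.any():
                    out[mask] += weights[mask, slot:slot + 1] * expert(x[mask])
        return out

if __name__ == "__main__":
    moe = SimpleMoE()
    tokens = torch.randn(10, 64)
    print(moe(tokens).shape)                                 # torch.Size([10, 64])
```

In a real MoE transformer, a layer like this replaces the dense feed-forward block in each transformer layer, which is how the total parameter count can grow into the hundreds of billions while the per-token compute stays comparable to a much smaller dense model.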
