Musk open-sources Grok-1: the largest model so far at 314 billion parameters, with a jab at OpenAI
Early this morning, Musk announced that his AI large language model Grok was officially open-sourced. Replying to a comment that said "OpenAI deserves the same (meaning it should open-source too)," Musk wrote, "OpenAI is a lie."
On March 17, local time, Musk's AI startup xAI officially released the source code of Grok-1. At 314 billion parameters, Grok-1 is the open-source large language model with the largest parameter count to date.
The 314-billion-parameter model uses a Mixture-of-Experts (MoE) architecture; the project was published on GitHub 5 hours ago and has already garnered 6.3k stars.
© Copyright Notice
The copyright of this article belongs to the author; please do not reprint without permission.