ChatGPT set for another major update: insider revelations on GPT-5 training!
Recently, DeepMind co-founder Mustafa Suleyman said in an interview that GPT-5, reportedly under secret training, will eventually be 100 times larger than the current GPT-4. The news has drawn widespread attention and heated debate.
However, OpenAI CEO Sam Altman had previously denied that GPT-5 was being trained. A source suggested that OpenAI may have given GPT-5 a new name, which would explain the denial.
Suleyman is now CEO of Inflection AI, which is building one of the largest supercomputers in the world. He believes that within the next 18 months, the company may be able to conduct a training run 10 or even 100 times larger than the one that produced GPT-4.
GPT-4 was officially released in March this year. Compared with the GPT-3.5 model that ChatGPT initially used, GPT-4 delivers several leaps forward: strong image-understanding capability, a text input limit raised to 25,000 words, significantly more accurate answers, and the ability to generate lyrics and creative text in different styles.
GPT-3.5 has 175 billion parameters. While OpenAI has not published GPT-4's exact specifications, subsequent analysis suggests a mixture-of-experts design with 16 expert models at roughly 111 billion parameters per MLP expert, for a total of about 1.8 trillion parameters, roughly 10 times the size of GPT-3.5.
If scaled up by another factor of that size, GPT-5 would exceed 10 trillion parameters, potentially making it the largest AI model by far, surpassing all rivals. Such progress would undoubtedly open broader space for imagination and exploration in the AI field.
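To put the rumored numbers in perspective, here is a minimal back-of-the-envelope sketch in Python. All GPT-4 and GPT-5 figures below are unconfirmed third-party estimates, not official OpenAI specifications:

    # Back-of-the-envelope parameter math behind the leaked figures.
    # Only the GPT-3.5 size is published; the rest are rumored estimates.
    GPT35_PARAMS = 175e9          # GPT-3.5: 175 billion parameters (published)
    NUM_EXPERTS = 16              # rumored mixture-of-experts count for GPT-4
    PARAMS_PER_EXPERT = 111e9     # rumored parameters per MLP expert

    gpt4_estimate = NUM_EXPERTS * PARAMS_PER_EXPERT   # ~1.78 trillion
    print(f"Estimated GPT-4 size: {gpt4_estimate / 1e12:.2f}T parameters")
    print(f"Ratio to GPT-3.5:     {gpt4_estimate / GPT35_PARAMS:.1f}x")

    # Another 10x jump, as Suleyman suggests, would land well past 10 trillion.
    gpt5_projection = gpt4_estimate * 10
    print(f"Projected GPT-5 size: {gpt5_projection / 1e12:.1f}T parameters")

Running this prints an estimated GPT-4 size of about 1.78T parameters (roughly 10x GPT-3.5) and a projected GPT-5 size of about 17.8T, which is where the "exceed 10 trillion" figure comes from.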