IBL News | New York
Elon Musk’s xAI finally released the base code of its LLM ‘Grok-1’ as open source yesterday, under the Apache 2.0 license, which permits commercial use.
Grok-1 was released without any training code; the package includes only the base model weights and the network architecture.
According to its description, it’s “a 314 billion parameter Mixture-of-Experts model trained from scratch by xAI”, and “not fine-tuned for any specific application, such as dialogue.”
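For readers who want to inspect the release, the weights are distributed via torrent and mirrored on Hugging Face. Below is a minimal sketch of fetching them with the huggingface_hub client; the repository name (xai-org/grok-1) and the checkpoint folder (ckpt-0/) are assumptions for illustration, not details confirmed in this article, and the full checkpoint is several hundred gigabytes.

    # Hypothetical sketch: downloading the open-sourced Grok-1 weights.
    # Assumes the checkpoint is mirrored as "xai-org/grok-1" on Hugging Face
    # and that huggingface_hub is installed (pip install huggingface_hub).
    from huggingface_hub import snapshot_download

    checkpoint_dir = snapshot_download(
        repo_id="xai-org/grok-1",      # repository name is an assumption
        repo_type="model",
        allow_patterns=["ckpt-0/*"],   # checkpoint folder name is an assumption
        local_dir="./checkpoints",
    )
    print(f"Grok-1 weights downloaded to {checkpoint_dir}")

Running the model afterwards requires xAI’s released inference code and hardware with enough GPU memory to hold the 314-billion-parameter checkpoint.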
This release doesn’t include connections to the X social network.
Other large companies, such as Meta and Google, have also released open-source models of their own. LLaMA from Meta and Gemma 2B and Gemma 7B from Google, along with Mistral, Falcon, and AI2’s models, are among the most popular.
Perplexity CEO Aravind Srinivas posted on X that the company will fine-tune Grok for conversational search and make it available to Pro users.
Yep, thanks to @elonmusk and xAI team for open-sourcing the base model for Grok. We will fine-tune it for conversational search and optimize the inference, and bring it up for all Pro users! https://t.co/CGn6cIoivT
— Aravind Srinivas (@AravSrinivas) March 17, 2024