Elon Musk Open-Sources Grok, But Without Training Code

IBL News | New York

Elon Musk’s xAI finally released the base code of its LLM ‘Grok-1’ yesterday as open source under the Apache 2.0 license, which permits commercial use.

Grok-1 was released without any training code, including only the base model weights and network architecture.

According to its description, it’s “a 314 billion parameter Mixture-of-Experts model trained from scratch by xAI” and is “not fine-tuned for any specific application, such as dialogue.”

This release doesn’t include connections to the X social network.

Other large companies, such as Meta and Google, have also released open-source models. LLaMA, from Meta, and Gemma 2B and Gemma 7B, from Google, along with Mistral and Falcon, are among the most popular.

Perplexity CEO Aravind Srinivas posted on X that the company will fine-tune Grok for conversational search and make it available to Pro users.