Hugging Face Hub for Fine-Tuned AI Models Surpasses One Million Listings

IBL News | New York

The Hugging Face hosting platform has surpassed one million listings of AI models, each trained on data to perform specific tasks or make predictions.

The platform started as a chatbot app in 2016 and became an open-source hub for AI models and tools for developers and researchers in 2020.

In a post on X, Hugging Face CEO Clément Delangue wrote about how his company hosts many high-profile AI models, like “Llama, Gemma, Phi, Flux, Mistral, Starcoder, Qwen, Stable diffusion, Grok, Whisper, Olmo, Command, Zephyr, OpenELM, Jamba, Yi,” but also “999,984 others.”

Delangue explained,
“Contrary to the ‘1 model to rule them all’ fallacy, smaller specialized customized optimized models for your use-case, your domain, your language, your hardware and generally your constraints are better. As a matter of fact, something that few people realize is that there are almost as many models on Hugging Face that are private only to one organization—for companies to build AI privately, specifically for their use-cases.”

Hugging Face’s rapid growth into a major AI platform hosting fine-tuned models reflects the increasing interest in the field.

Developers and researchers worldwide have contributed their results, making Hugging Face a large ecosystem.

For example, many different fine-tuned versions of Llama models are optimized for specific applications.

At the top of the most-downloaded category, with a massive lead at 163 million downloads, is MIT’s Audio Spectrogram Transformer, which classifies audio content such as speech, music, and environmental sounds.

Following that, with 54.2 million downloads, is BERT from Google, an AI language model that learns to understand English by predicting masked words and sentence relationships, enabling it to assist with various language tasks.
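BERT’s masked-word training objective can be illustrated with a toy sketch: hide one word in a sentence, then score candidate fills by how often each word appears in the same surrounding context in a small corpus. This is a minimal, self-contained illustration of the masked-language-modeling *idea* only, not BERT’s actual Transformer architecture; the corpus and function name below are invented for this example.

```python
from collections import Counter

# Toy corpus standing in for pretraining data (invented for illustration).
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat sat on the sofa",
    "the cat slept on the mat",
]

def predict_masked(sentence, mask="[MASK]"):
    """Fill the masked position by counting which word most often appears
    between the same (left, right) neighbors across the corpus. This mimics
    the masked-word objective conceptually, not BERT's neural model."""
    tokens = sentence.split()
    i = tokens.index(mask)
    left = tokens[i - 1] if i > 0 else None
    right = tokens[i + 1] if i + 1 < len(tokens) else None
    counts = Counter()
    for line in corpus:
        words = line.split()
        for j, w in enumerate(words):
            l = words[j - 1] if j > 0 else None
            r = words[j + 1] if j + 1 < len(words) else None
            if l == left and r == right:
                counts[w] += 1
    return counts.most_common(1)[0][0] if counts else None

print(predict_masked("the [MASK] sat on the mat"))  # → cat
```

Real BERT learns these predictions with a deep neural network over enormous corpora, which is what lets it generalize far beyond exact context matches like the counting trick above.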

Among the top five AI models, users find OpenAI’s CLIP, which connects images and text, allowing it to classify or describe visual content using natural language.