Nvidia Announces Its Support for a Hugging Face Generative AI Service

IBL News | New York

NVIDIA announced yesterday that its DGX Cloud AI supercomputer will power a new Hugging Face service called Training Cluster as a Service.

This service, set to roll out “in the coming months,” will simplify the creation of new and custom generative AI models for the enterprise.

DGX Cloud includes access to a cloud instance with eight Nvidia H100 or A100 GPUs and 640GB of GPU memory, as well as Nvidia’s AI Enterprise software for developing AI apps and large language models, plus consultations with Nvidia experts.

Companies can subscribe to DGX Cloud on their own at prices starting at $36,999 per instance per month. Training Cluster as a Service, however, integrates DGX Cloud infrastructure with Hugging Face’s platform, a repository for all things related to AI models (over 250,000 models and 50,000 datasets).

“Our collaboration will bring Nvidia’s most advanced AI supercomputing to Hugging Face to enable companies to take their AI destiny into their own hands with open source to help the open-source community easily access the software and speed they need to contribute to what’s coming next,” Hugging Face co-founder and CEO Clément Delangue said.

Hugging Face’s partnership with Nvidia comes as the AI startup is looking to raise funds at a $4 billion valuation.

Meanwhile, Nvidia is pushing into cloud services for training and running AI models as the demand for such services grows. In March, the company launched AI Foundations, a collection of components that developers can use to build custom generative AI models for particular use cases.