IBL News | New York
Groq, a startup that develops chips for high-speed, low-latency AI processing, released an app called Groqbook this month that generates entire books in seconds from a one-line prompt.
The app works well for nonfiction books, producing each chapter within seconds.
Groqbook uses Llama 3 running on Groq. Specifically, it mixes Llama3-8b and Llama3-70b, using the larger model to generate the book's structure and the smaller of the two to write the content.
Currently, the model uses only the section title as context when generating each chapter's content.
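In practice, that two-model pattern looks roughly like the sketch below, written against Groq's Python client. The model names, prompts, and function names are illustrative assumptions, not Groqbook's actual code.

```python
# Illustrative sketch of the split described above: a larger model drafts the
# outline, a smaller/faster model writes each section from its title alone.
# Model IDs and prompts are assumptions, not Groqbook's implementation.
from groq import Groq

client = Groq()  # reads GROQ_API_KEY from the environment


def generate_structure(topic: str) -> str:
    # Larger model: produce the chapter-by-chapter structure of the book.
    resp = client.chat.completions.create(
        model="llama3-70b-8192",
        messages=[
            {"role": "system",
             "content": "Write a chapter-by-chapter outline for a nonfiction book."},
            {"role": "user", "content": topic},
        ],
    )
    return resp.choices[0].message.content


def generate_section(section_title: str) -> str:
    # Smaller model: write the body of one section, given only its title
    # as context (as the article notes).
    resp = client.chat.completions.create(
        model="llama3-8b-8192",
        messages=[
            {"role": "system", "content": "Write the body of a book section."},
            {"role": "user", "content": section_title},
        ],
    )
    return resp.choices[0].message.content
```

Splitting the work this way presumably keeps the larger model on the short outline step while the faster 8b model produces the bulk of the text, which is what allows whole chapters to appear in seconds.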
Groq said that “in the future, this will be expanded to the fuller context of the book to allow Groqbook to generate quality fiction books as well.”
Groq’s CTO demoed the app last week at the Imagine AI Live event in New York, which IBL News covered.