IBL News | New York
Amazon/AWS announced in December a series of AI initiatives: several generative models, a new AI chip called Trainium2, plans for a new supercomputer, and major partnership deals, including doubling its investment in Anthropic to $8 billion.
With these announcements, Amazon signaled its intent to go beyond being a cloud platform and become an end-to-end AI powerhouse.
Trainium2 chips are designed specifically for heavy AI computing workloads and are intended to compete directly with chips from Nvidia and AMD.
Project Rainier, the planned supercomputer, will be built to meet Anthropic's AI training and computational needs. It will rival Elon Musk's Cortex and Colossus AI supercomputers.
To compete with ChatGPT and Gemini, Amazon launched six foundation models under its Nova umbrella.
This new family of multimodal AI models, Nova, includes four text-generating models: Micro, Lite, Pro, and Premier.
In addition, there is an image-generating model, Nova Canvas, and a video-generating model, Nova Reel.
Micro has a 128,000-token context window, enough to process roughly 100,000 words. Lite and Pro have 300,000-token context windows, enough for roughly 225,000 words, 15,000 lines of code, or 30 minutes of video footage.
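Those word estimates line up with the common rule of thumb of roughly 0.75 English words per token; a quick back-of-the-envelope check (the ratio is a general assumption, not an AWS figure):

```python
# Rough rule of thumb of ~0.75 English words per token (an assumption, not an AWS figure).
WORDS_PER_TOKEN = 0.75

for model, context_tokens in [("Micro", 128_000), ("Lite/Pro", 300_000)]:
    approx_words = int(context_tokens * WORDS_PER_TOKEN)
    print(f"{model}: {context_tokens:,} tokens ~= {approx_words:,} words")

# Micro:    128,000 tokens ~= 96,000 words  (the article rounds to about 100,000)
# Lite/Pro: 300,000 tokens ~= 225,000 words
```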
AWS said that in early 2025, certain Nova models’ context windows will expand to support over 2 million tokens.
The Nova models are available in Amazon Bedrock, the company's AI development platform. There, they can be fine-tuned on text, images, and video, and distilled for improved speed and efficiency.
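As an illustration of the Bedrock route, here is a minimal sketch of calling a Nova text model through the boto3 Bedrock runtime Converse API; the model ID and region are assumptions, and some accounts may need an inference-profile identifier instead of the bare model ID.

```python
import boto3

# Bedrock runtime client; the region is an assumption for this sketch.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Invoke a Nova text model via the Converse API.
# "amazon.nova-lite-v1:0" is the assumed model ID; check the Bedrock console
# for the exact identifier enabled in your account.
response = client.converse(
    modelId="amazon.nova-lite-v1:0",
    messages=[
        {
            "role": "user",
            "content": [{"text": "Summarize AWS's December AI announcements in three bullet points."}],
        }
    ],
    inferenceConfig={"maxTokens": 512, "temperature": 0.5},
)

# The reply arrives as a list of content blocks; print the text block.
print(response["output"]["message"]["content"][0]["text"])
```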
These models are optimized to work with customers' proprietary systems and APIs, allowing them to carry out multiple orchestrated, automated steps (agent behavior).
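A minimal sketch of what that orchestration can look like using the Converse API's tool-calling support; the tool name, schema, and model ID below are illustrative assumptions, not part of Amazon's announcement.

```python
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

# A hypothetical tool the model may call; the name and schema are illustrative only.
tool_config = {
    "tools": [
        {
            "toolSpec": {
                "name": "get_order_status",
                "description": "Look up the shipping status of a customer order.",
                "inputSchema": {
                    "json": {
                        "type": "object",
                        "properties": {"order_id": {"type": "string"}},
                        "required": ["order_id"],
                    }
                },
            }
        }
    ]
}

response = client.converse(
    modelId="amazon.nova-pro-v1:0",  # assumed model ID
    messages=[{"role": "user", "content": [{"text": "Where is order 123-456?"}]}],
    toolConfig=tool_config,
)

# When the model decides to call the tool, stopReason is "tool_use" and the content
# blocks carry the tool name and arguments; the calling application runs the tool
# and returns the result in a follow-up converse() call to continue the chain.
if response["stopReason"] == "tool_use":
    for block in response["output"]["message"]["content"]:
        if "toolUse" in block:
            print(block["toolUse"]["name"], block["toolUse"]["input"])
```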
“In 2025, we will bring even more capable models while driving cost down through algorithmic and computing innovations. And we will continue to invest in the talent, technology, and infrastructure necessary to offer world-class models to our customers for years to come,” Amazon said.