Anthropic, the maker of Claude, expects to break even in 2028, The Wall Street Journal reported yesterday. By contrast, OpenAI forecasts operating losses of $74 billion that year, roughly three-fourths of its revenue, and does not expect to turn a profit until 2030. The ChatGPT maker expects to burn 14 times as much cash as Anthropic. With the goal of becoming a trillion-dollar company, OpenAI, which is in constant fundraising mode, says it is investing significantly more in chips and data centers and doling out more stock-based compensation to attract top researchers. “Demand for AI exceeds available compute supply today,” an OpenAI spokesman said. Recently, OpenAI, valued at $500 billion, signed a string of new computing deals with cloud and chip giants. Sam Altman said on X that the deals put OpenAI on the hook for up to $1.4 trillion in commitments over the next eight years. Anthropic, valued at $183 billion, is focused on increasing sales among corporate customers, which account for about 80% of its revenue. It avoids OpenAI’s costly forays into image and video generation, which require significantly more computing power. Anthropic’s AI models have also taken off among coders. Founded four years ago by Dario Amodei (pictured, right), a former Google researcher who left OpenAI, Anthropic is centered on selling its Claude chatbot to businesses. Microsoft is OpenAI’s largest cloud provider, while Amazon and Google fill that role for Anthropic.
IBM last week released, under the Apache 2.0 license, four new Granite 4.0 Nano models, designed to be highly accessible and well suited for developers building applications on consumer hardware, without relying on cloud computing. With these models, IBM enters a crowded and rapidly evolving market of small language models (SLMs), competing with offerings like Qwen3, Google’s Gemma, LiquidAI’s LFM2, and Mistral’s dense models in the sub-2B-parameter space. With this release, IBM is positioning Granite as a platform for building the next generation of lightweight, trustworthy AI systems. The 350M variants can run comfortably on a modern laptop CPU with 8–16GB of RAM, while the 1.5B models typically require a GPU with at least 6–8GB of VRAM for smooth performance. This is a fraction of the size of their server-bound counterparts from companies like OpenAI, Anthropic, and Google. The Granite 4.0 Nano family includes four open-source models, now available on Hugging Face:
• Granite-4.0-H-1B (~1.5B parameters): Hybrid-SSM architecture
• Granite-4.0-H-350M (~350M parameters): Hybrid-SSM architecture
• Granite-4.0-1B: transformer-based variant, with a parameter count closer to 2B
• Granite-4.0-350M: transformer-based variant
Overall, Granite-4.0-1B achieved a leading average benchmark score of 68.3% across general knowledge, math, code, and safety domains. For developers and researchers seeking performance without overhead, the Nano release means they don’t need 70 billion parameters to build something powerful.
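The hardware figures above follow from simple arithmetic: the memory needed just to hold a model's weights is the parameter count times the bytes per parameter. The sketch below illustrates that estimate for the Nano sizes mentioned; the function name and the precision assumptions (2 bytes/parameter for fp16, 0.5 for 4-bit quantization) are ours for illustration, and real-world usage adds activation and KV-cache overhead on top.

```python
# Back-of-envelope weight-memory estimate for small language models.
# Assumptions (not from IBM's docs): fp16/bf16 weights take ~2 bytes per
# parameter; 4-bit quantized weights take ~0.5 bytes per parameter.

def weight_memory_gb(num_params: float, bytes_per_param: float) -> float:
    """Approximate memory needed to hold the model weights, in GiB."""
    return num_params * bytes_per_param / 1024**3

for name, params in [("Granite-4.0-350M", 350e6), ("Granite-4.0-H-1B", 1.5e9)]:
    fp16 = weight_memory_gb(params, 2.0)  # half precision
    q4 = weight_memory_gb(params, 0.5)    # 4-bit quantized
    print(f"{name}: ~{fp16:.2f} GiB (fp16), ~{q4:.2f} GiB (4-bit)")
```

Under these assumptions a 350M model needs well under 1 GiB for weights alone, which is why it fits on a laptop with 8–16GB of RAM, while a 1.5B model at fp16 wants roughly 3 GiB plus overhead, consistent with the 6–8GB VRAM guidance above.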
Elon Musk last week unveiled Grokipedia.com, his own version of Wikipedia, the crowdsourced online encyclopedia. The new project’s entries will be edited by xAI, Musk’s artificial intelligence company. The website featured over 885,000 entries within three days of launch, a number that took Wikipedia over half a decade to reach. Wikipedia, which debuted almost 25 years ago and now includes eight million human-written entries, has faced increasing criticism from conservatives in recent months. Musk and his political allies have argued that the online encyclopedia is too “woke” and excludes conservative media outlets from its approved citations. “Grokipedia will be a massive improvement over Wikipedia,” said Musk. “It will purge out the propaganda flooding Wikipedia.” “Wikipedia has achieved a dominant position. I hope Grokipedia challenges it and is able to fix that,” said David Sacks, the A.I. czar of the Trump administration and an investor in several of Musk’s companies, on an episode of his podcast this month. “But the easier path might just be for Wikipedia to stop blackballing and censoring conservative publications, rather than having to rebuild that whole thing from scratch.” Jimmy Wales, a co-founder of Wikipedia, said he is leading an internal working group focused on promoting neutral points of view and developing guidelines to encourage academic research. He added that he did not think AI could match the site’s accuracy. Visits to Wikipedia have declined by 8% this year, while visits from the automated scrapers AI companies use to harvest data have increased. AI-generated summaries from search engines and chatbots are also deterring users from visiting the site.

We are building Grokipedia @xAI. Will be a massive improvement over Wikipedia. Frankly, it is a necessary step towards the xAI goal of understanding the Universe. https://t.co/xvSeWkpALy — Elon Musk (@elonmusk) September 30, 2025
"In higher education, we face today skepticism and scrutiny, and current developments are testing us," said Dr. John O’Brien, President and CEO of Educause, during the opening talk of the annual conference, which took place this week in Nashville, Tennessee. "The work we do, together, has never been more important." O’Brien also introduced the winners of the 2025 Award Program:
• Leadership Award: Elias G. Eldayrie, Senior VP and CIO at the University of Florida, and Helen Norris, former CIO and Vice President for Information Technology at Chapman University
• Organizational Culture Award: Liv Gjestvang, Vice President and CIO at Denison University
• Rising Star Award: Michael McGarry, Academic Technology Lead and LMS Administrator at California State University, Channel Islands
• Community Leadership Award: David Sherry, former CIO at Princeton University
The next day, during the presentation of the 2026 Top 10 report, O’Brien advocated for clarity, resilience, and connection among institutions as they navigate uncertainty. Also on Wednesday, in conversation with reporters (including IBL News), O’Brien (pictured) said, "We are at an inflection point and educators want to come together."
n8n, a German startup that helps businesses deploy artificial intelligence agents, has raised $180 million at a $2.5 billion valuation. Venture firm Accel led the funding, and Nvidia Corp.’s NVentures participated in the round, n8n said in a statement this month. The company said it will use the new funding to hire additional staff and enhance its platform, expanding its collection of integrations with third-party applications. Founded in 2019, n8n offers an open-source platform for creating automation workflows. It generates revenue through several paid tiers of that platform, which add a cloud-hosted version and additional features. For non-technical users, n8n provides a visual interface for building workflows with drag-and-drop controls. Coders can write n8n workflows in TypeScript, an enhanced version of JavaScript developed by Microsoft Corp. whose added type syntax makes it easier to avoid certain classes of software errors. The platform supports LangChain, an open-source toolkit, and ships with several hundred prepackaged automation workflows. Cybersecurity teams use the software to automatically enrich breach alerts with threat intelligence from external sources.

AI Workflow Builder Is Now Available in Beta to Enterprise Cloud Users. AI Workflow Builder, previously available to Starter and Pro users, is now in beta for Enterprise Cloud users, with 1,000 monthly credits included. We are excited to expand access to our Enterprise Cloud… pic.twitter.com/UFl8kroyw3 — n8n.io (@n8n_io) October 27, 2025

n8n Livestream: AI Guardrails, Pinecone & Community Highlights https://t.co/A8ju4yFTUu — n8n.io (@n8n_io) October 30, 2025