Gary Marcus: AI bubble could destroy the economy.
Source: Youtube

IBL News | New York
OpenAI announced yesterday that it has 1 million business customers worldwide using ChatGPT for Work, either directly or through its developer platform.
Organizations in sectors such as financial services, healthcare, and retail are among the most active. Consumer adoption is also strong, with 800 million weekly users.
The San Francisco-based lab is set to generate $13 billion in revenue this year as it continues to expand sales.
To support this enterprise acceleration, OpenAI launched a new wave of tools and integrations.
OpenAI cited a recent Wharton study to highlight that 75% of enterprises report a positive ROI, while fewer than 5% report a negative return. “When AI is deployed with the right use case and infrastructure, teams see real results,” the company said.
Also this week, OpenAI agreed to pay Amazon.com $38 billion for computing power in a multiyear deal, marking the first partnership between the startup and the cloud company.

Artificial intelligence is here to stay, integrated into our daily lives.
Source: Youtube

Artificial intelligence: Its history, its current state and human attitudes towards it.
Source: Youtube

How schools are adapting to artificial intelligence.
Source: Youtube

Professor Gary Marcus (Professor Emeritus of Psychology and Neural Science, New York University) and journalist Murad Hemmadi (The Logic) spoke at Attention: Govern Or Be Governed in their conversation, “When Will the AI Bubble Burst?”
Source: Youtube

Is AI behind the recent job cuts? Here’s what to know.
Source: Youtube

IBL News | New York
IBM last week released, under the Apache 2.0 license, four new Granite 4.0 Nano models, designed to be highly accessible and well-suited for developers building applications on consumer hardware, without relying on cloud computing.
With these models, IBM is entering a crowded and rapidly evolving market of small language models (SLMs), competing with offerings like Qwen3, Google’s Gemma, LiquidAI’s LFM2, and Mistral’s dense models in the sub-2B parameter space.
With this release, IBM is positioning Granite as a platform for building the next generation of lightweight, trustworthy AI systems.
The 350M variants can run comfortably on a modern laptop CPU with 8–16GB of RAM, while the 1.5B models typically require a GPU with at least 6–8GB of VRAM for smooth performance.
This is a fraction of the size of their server-bound counterparts from companies like OpenAI, Anthropic, and Google.
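Those hardware figures can be sanity-checked with simple arithmetic: a model's weights occupy roughly its parameter count times the bytes used per parameter. The helper below is an illustrative sketch (not IBM's tooling), and real deployments add activation and KV-cache overhead on top of the weight footprint:

```python
def weight_memory_gb(num_params: float, bytes_per_param: float = 2.0) -> float:
    """Approximate memory needed to hold model weights alone.

    bytes_per_param: 4 for fp32, 2 for fp16/bf16, ~0.5 for 4-bit quantization.
    Activations and the KV cache add further overhead beyond this estimate.
    """
    return num_params * bytes_per_param / 1e9

# A 350M-parameter model in fp16 needs roughly 0.7 GB for weights,
# which fits comfortably in 8-16GB of laptop RAM.
print(round(weight_memory_gb(350e6), 2))

# A 1.5B-parameter model in fp16 needs about 3 GB for weights,
# consistent with a 6-8GB VRAM recommendation once overhead is counted.
print(round(weight_memory_gb(1.5e9), 2))
```

By the same arithmetic, a 70B-parameter model needs around 140 GB in fp16, which is why such models remain server-bound while the Nano family targets consumer hardware.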
The Granite 4.0 Nano family comprises four open-source models, now available on Hugging Face.
Overall, the Granite-4.0-1B achieved a leading average benchmark score of 68.3% across general knowledge, math, code, and safety domains.
For developers and researchers seeking performance without overhead, the Nano release means they don’t need 70 billion parameters to build something powerful.

Parents are calling for stronger safeguards on Artificial Intelligence chatbots after videos showed Tesla’s AI chatbot having an alarming conversation with children in a car.
Source: Youtube