Author: IBL News

  • Gary Marcus: AI bubble could destroy the economy


    Source: YouTube

  • OpenAI Announced 1M Business Customers and 800M Users Weekly


    IBL News | New York 

    OpenAI announced yesterday that it has 1 million business customers worldwide using ChatGPT for Work, either directly or through its developer platform.

    Organizations in sectors such as financial services, healthcare, and retail are among the most active. Consumer adoption is also strong, with 800 million weekly users.

    The San Francisco-based lab is set to generate $13 billion in revenue this year as it continues to expand sales.

    To support the enterprise acceleration, OpenAI launched a new wave of tools and integrations, such as:

    • Company knowledge consolidates all the context from connected apps (Slack, SharePoint, Google Drive, GitHub, Canva, Figma, Zillow, Spotify, among others) into ChatGPT.
    • Codex model for code generation, refactoring, and workflow automation.
    • AgentKit lets users build enterprise agents.
    • Multimodal models, such as the Image Generation API, Sora 2, gpt-realtime, and the Realtime API for building production voice agents.
    • Databricks has made OpenAI models available natively on its stack.
    • The Agentic Commerce Protocol (ACP) enables the creation of conversational commerce experiences in ChatGPT. Shopify, Etsy, Walmart, PayPal, and Salesforce are among the companies utilizing this protocol.

    OpenAI cited a recent Wharton study to highlight that 75% of enterprises report a positive ROI, and fewer than 5% report a negative return. “When AI is deployed with the right use case and infrastructure, teams see real results,” said the firm.

    This week, too, OpenAI agreed to pay Amazon.com $38 billion for computing power in a multiyear deal. That marks the first partnership between the startup and the cloud company.

  • How is AI impacting education in Connecticut?



    Artificial intelligence is here to stay, integrated into our daily lives.

    Source: YouTube

  • AI and the human role in finance


    Source: YouTube

  • Artificial intelligence: Its history, its current state and human attitudes towards it


    Source: YouTube

  • How schools are adapting to artificial intelligence


    Source: YouTube

  • When will the artificial intelligence bubble burst?



    Professor Gary Marcus (Professor Emeritus of Psychology and Neural Science, New York University) joins Murad Hemmadi (journalist, The Logic) at Attention: Govern Or Be Governed for their conversation, “When Will the AI Bubble Burst?”

    Source: YouTube

  • Is AI behind the recent job cuts? Here’s what to know


    Source: YouTube

  • IBM Releases as Open Source Its ‘Granite 4.0 Nano’ AI Models that Can Run on the Browser


    IBL News | New York

    Last week, IBM released four new Granite 4.0 Nano models under the Apache 2.0 license. They are designed to be highly accessible and well suited for developers building applications on consumer hardware, without relying on cloud computing.

    With these models, IBM is entering a crowded and rapidly evolving market of small language models (SLMs), competing with offerings like Qwen3, Google’s Gemma, LiquidAI’s LFM2, and Mistral’s dense models in the sub-2B parameter space.

    With this release, IBM is positioning Granite as a platform for building the next generation of lightweight, trustworthy AI systems.

    The 350M variants can run comfortably on a modern laptop CPU with 8–16GB of RAM, while the 1.5B models typically require a GPU with at least 6–8GB of VRAM for smooth performance.

    This is a fraction of the size of their server-bound counterparts from companies like OpenAI, Anthropic, and Google.

    The Granite 4.0 Nano family includes four open-source models, now available on Hugging Face:

    • Granite-4.0-H-1B (~1.5B parameters) – Hybrid-SSM architecture
    • Granite-4.0-H-350M (~350M parameters) – Hybrid-SSM architecture
    • Granite-4.0-1B – Transformer-based variant, parameter count closer to 2B
    • Granite-4.0-350M – Transformer-based variant

    Overall, the Granite-4.0-1B achieved a leading average benchmark score of 68.3% across general knowledge, math, code, and safety domains.

    For developers and researchers seeking performance without overhead, the Nano release means they don’t need 70 billion parameters to build something powerful.

  • Parents push for AI chatbot controls for kids



    Parents are calling for stronger safeguards on artificial intelligence chatbots after videos showed Tesla’s AI chatbot having an alarming conversation with children in a car.

    Source: YouTube