Category: Top News

  • Virtual Reality Biology Learning Experience at ASU Shows Powerful Outcomes

    M. Amigot, IBL News | San Diego

    Arizona State University (ASU) released new findings at the ASU+GSV 2025 Summit this week, showing the powerful impact of Dreamscape Learn’s immersive storytelling on student outcomes, based on two years of research.

    The primary outcome was that students in the virtual reality lab group were 1.7 times more likely to score between 90 percent and 100 percent on their lab assignments than students in the conventional lab group.

    ASU researchers studied more than 4,000 on-campus students from two biology courses over four terms — from fall 2022 through spring 2024.

    According to the institution’s data, “The intense engagement that Dreamscape Learn virtual reality biology experiences create is leading to higher grades and more persistence for biology majors.”

    ASU’s Dreamscape Learn Biology 181 course debuted in the spring 2022 semester, and now the experience is offered in Biology 181 and 182. These courses are intended for students in STEM majors and are required for graduation.

    These biology courses include 15-minute virtual-reality scenarios every week. Students travel through space to an intergalactic wildlife sanctuary, where they encounter intriguing scenarios they must solve through science.

    After the weekly Dreamscape Learn experience, students attend a three-hour lab, where the storyline continues. They solve the unique problems using careful reasoning.

    The experience is straight out of Hollywood. It was created in collaboration with Dreamscape Immersive, a company co-founded by Walter Parkes, a writer and producer of films including “WarGames,” “Gladiator,” and “Twister.”

    Currently, ASU is working with community colleges in California and K–12 schools in Arizona to offer the technology, including the Pendergast Elementary School District in the West Valley.

  • Microsoft Introduced Two New AI Agents: Researcher and Analyst

    IBL News | New York

    Microsoft announced this week that it will start rolling out two AI agents in April to increase productivity at work: Researcher and Analyst.

    According to the company, these agents analyze vast amounts of information and have secure, compliant access to users’ work data—emails, meetings, files, chats, and more—and the web.

    Researcher helps tackle complex, multi-step research by combining OpenAI’s deep research model with Microsoft 365 Copilot’s advanced orchestration and deep search capabilities. It can be used to build a go-to-market strategy based on internal, emerging, and other web data to identify opportunities for new products and provide insights and reports. It integrates data from external sources like Salesforce, ServiceNow, and Confluence.

    Analyst has been trained to think like “a skilled data scientist, so you can go from raw data to insights in minutes,” according to Microsoft.

    Optimized for advanced data analysis at work, this agent is built on OpenAI’s o3-mini reasoning model, working through problems iteratively and taking steps to provide answers that mirror human analytical thinking.

    It can run Python to tackle complex data queries, and users can watch the code run in real time and verify its results. For example, Analyst can turn raw data scattered across multiple spreadsheets into a demand forecast for a new product, a visualization of customer purchasing patterns, or a revenue projection.
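
    As a rough illustration of the kind of Python such an agent might run behind the scenes, the sketch below combines several sales spreadsheets and produces a naive demand forecast. The folder, file names, and column names ("month", "units_sold") are hypothetical, and the code is a minimal stand-in, not Microsoft's actual Analyst output.

        # Minimal sketch: merge monthly sales spreadsheets and project demand.
        # All paths and column names are hypothetical.
        from pathlib import Path

        import pandas as pd

        # Load and concatenate every regional sales spreadsheet in one folder.
        frames = [pd.read_excel(path) for path in Path("sales_data").glob("*.xlsx")]
        sales = pd.concat(frames, ignore_index=True)

        # Aggregate units sold per month across all regions.
        monthly = (
            sales.assign(month=pd.to_datetime(sales["month"]))
                 .groupby(pd.Grouper(key="month", freq="MS"))["units_sold"]
                 .sum()
        )

        # Naive forecast: carry the trailing three-month average forward six months.
        trailing_avg = monthly.tail(3).mean()
        future_index = pd.date_range(
            monthly.index[-1] + pd.offsets.MonthBegin(), periods=6, freq="MS"
        )
        forecast = pd.Series(trailing_avg, index=future_index, name="forecast_units")
        print(forecast)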

    Microsoft also announced deep reasoning and agent flows in Microsoft Copilot Studio, a platform for creating, managing, and deploying agents.

    Microsoft also presented Sales Agent, designed to turn contacts into qualified leads. The agent draws on CRM data, company information like price sheets, the web, and Microsoft 365 data such as emails and meetings to personalize every response.

    In addition, Sales Chat helps accelerate the sales cycle, giving reps actionable takeaways from CRM data, pitch decks, meetings, emails, and the web.

  • Linda McMahon at ASU+GSV 2025: “I’m Very Interested in New Learning Technologies”

    Miguel Amigot, IBL News | San Diego

    “I’m very interested in new technologies that stimulate kids and not in bureaucracy in Washington, D.C.; in fact, our goal is to eliminate bureaucracy,” said Secretary of Education Linda McMahon yesterday in San Diego, during the ASU+GSV 2025 ed-tech conference.

    Addressing a packed auditorium at the event, Linda McMahon acknowledged, “I don’t have the blueprint for the best technology.” 

    “I know we will fail if we don’t have the best educated workforce in the world.”

    In a 30-minute interview conducted by education entrepreneur Phyllis Lockett, McMahon defended the Trump administration’s large-scale cuts to federal education staff, including the intended elimination of the Department of Education, as steps to remedy a system that fails students.

    “We’ve just gotten to a point that we just can’t keep going along doing what we’re doing,” she said. “Let’s shake it up. Let’s do something different.”

    This shakeup involves shifting the distribution of financial aid to the states.

    The National Center for Education Statistics estimates that 90 percent of public school budgets already come from state and local sources.

    The U.S. Secretary of Education promised attendees at ASU+GSV that her department would consider ways to revamp the Institute of Education Sciences (IES) following a 90 percent reduction in staff last month, dropping from 170 to 20.

    McMahon deflected anti-DEI questions, saying, “There shouldn’t be any discrimination.”

    She offered few specifics on the Trump Administration’s threatened funding cuts for institutions that do not eliminate diversity programs.

    The 16th annual ASU+GSV Summit was held from April 6 to 9 in San Diego. This education technology summit brought together global leaders, educators, and entrepreneurs to explore trends, encourage collaboration, and address the biggest challenges in education.

    Around 8,000 people attended in person, with another 10,000 following online.

    All Videos of the ASU+GSV 2025 Conference


  • The Paradigm Shift of Vibe Marketing: Specialists with AI Agents Accelerate Development Cycles

    IBL News | New York

    Marketers using agents and the right AI stack are dramatically accelerating workflows and development cycles following a new practice called Vibe Marketing.

    The traditional model of marketing teams, with copywriters, designers, analysts, and media buyers working in silos, is being challenged by a new world in which a single marketer tests multiple angles in real time using dozens of AI agents.

    Experts say that these Vibe Marketing practices result in a 20x acceleration and substantial financial savings.

    They take advantage of autonomous tools and applications used in another recent trend, Vibe Coding, with various solutions such as Replit, Vercel, Bolt, and Lovable. (Read at iblnews.org The New Trend of ‘Vibecoding’: Non-Programmers Creating Software Tools with AI.)

    This phenomenon could represent a complete paradigm shift in marketing, transforming the $250 billion industry.

    Marketing expert Greg Isenberg commented, “The cool thing is that a single marketer with the right stack can now outperform entire agencies or internal teams; there is an opportunity now to be 10x more efficient than the competition.”

    Some of the software applications and functionalities of these AI-driven agents are:

    • CRMs that browse the web and autonomously find prospects, analyze content, extract data, and craft personalized messages.

    • A tool that captures competitor ads, analyzes them, and auto-generates variations for your brand (free competitive intelligence).

    • AI-driven maps showing customer segments and conversion flows using census data.

    • Platforms generating digital product launches, sales pages, email sequences, and ads.

    Some of the most common tools include:

    • Workflow Builders: Make, n8n, Zapier

    • Agent Platforms: Taskade, Manus, Relay, Lindy

    • Software (lead magnets, microsites, etc): Replit, Bolt, Lovable, Vercel

    • Marketing AI: Phantom Buster, mosaic, Meshr, Icon, Jasper

    • Creative tools: Flora, Kling, Leonardo, Manus


    Forbes: VCs Wake Up To Vibe Marketing: AI Reshaping The $250 Billion Industry

  • Stability Launches ‘Stable Virtual Camera’, a Model that Transforms Photos Into 3D Scenes

    IBL News | New York

    This week, Stability AI released Stable Virtual Camera in research preview, a model that transforms 2D images into immersive 3D scenes with realistic depth and perspective.

    With this tool, Stability is adding generative AI to virtual cameras, which are often used in digital filmmaking and 3D animation to capture and navigate scenes in real time.

    The model is available for research use under a noncommercial license. It can be downloaded on Hugging Face, and the code is accessible on GitHub. The full research paper is here.
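
    For researchers who want to experiment with it, the sketch below shows one way to fetch the model files with the huggingface_hub library. The repository id is an assumption based on the announcement, an access token may be required if the weights are gated, and actual usage follows the instructions in Stability's GitHub repository.

        # Minimal sketch: download the Stable Virtual Camera weights from Hugging Face.
        # The repository id is assumed; check Stability AI's Hugging Face page for the
        # exact name and the noncommercial license terms.
        from huggingface_hub import snapshot_download

        local_dir = snapshot_download(
            repo_id="stabilityai/stable-virtual-camera",  # assumed repository id
            local_dir="stable-virtual-camera",
        )
        print(f"Model files downloaded to {local_dir}")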

    “We invite the research community to explore its capabilities and contribute to its development,” said the company.

    The model can generate videos that travel along “dynamic” camera paths or presets, including “Spiral,” “Dolly Zoom,” “Move,” and “Pan.”

    In its initial version, Stable Virtual Camera may produce lower-quality results in certain scenarios, admitted Stability AI.

    The current version generates videos in square (1:1), portrait (9:16), and landscape (16:9) aspect ratios up to 1,000 frames in length.

    “Input images featuring humans, animals, or dynamic textures like water often lead to degraded outputs.”

    “Additionally, highly ambiguous scenes, complex camera paths that intersect objects or surfaces, and irregularly shaped objects can cause flickering artifacts, especially when target viewpoints differ significantly from the input images.”


    Stability, the firm behind the popular image-generation model Stable Diffusion, raised new cash last year as investors, including Eric Schmidt and Napster founder Sean Parker, sought to turn the business around.

    TechCrunch states, “Emad Mostaque, Stability’s co-founder and ex-CEO, reportedly mismanaged Stability into financial ruin, leading staff to resign, a partnership with Canva to fall through, and investors to grow concerned about the company’s prospects.”

    “In the last few months, Stability has hired a new CEO, appointed “Titanic” director James Cameron to its board of directors, and released several new image-generation models. In March, the company partnered with chipmaker Arm to bring an AI model to generate audio, including sound effects, to mobile devices running Arm chips.”

  • Runway Issues Its Model ‘Gen-4’, Which Allows Users to Generate Consistent Characters

    IBL News | New York

    Runway released Gen-4, its most advanced video-generation model, for paid and enterprise customers last week.

    The AI video tools startup claimed it can generate consistent characters, locations, and objects across scenes, maintain coherent world environments, and regenerate elements from different perspectives and positions within scenes “without the need for fine-tuning or additional training.”

    To craft a scene, users can provide images of subjects and describe the composition of the shot they want to generate.

    Separately, Runway AI Inc. announced that it raised $308 million in a new round of funding, which more than doubled the company’s valuation. The deal pushes Runway’s value to just over $3 billion. Private equity firm General Atlantic led the round, which closed late last year. Other investors included Nvidia Corp. and SoftBank Group Corp.’s Vision Fund 2.

    The company has been able to differentiate itself, inking a deal with a major Hollywood studio and earmarking millions of dollars to fund films using AI-generated video.

    Runway says that Gen-4 allows users to generate consistent characters across lighting conditions using a reference image of those characters.

    “Runway Gen-4 [also] represents a significant milestone in the ability of visual generative models to simulate real-world physics,” said the company.

    Like all video-generating models, Gen-4 was trained on many video examples to learn the patterns and generate synthetic footage.

    Runway refused to say where the training data came from, out of fear of sacrificing competitive advantage and also to avoid IP-related lawsuits.


  • Anthropic Launches ‘Claude for Education’ Program to Compete with OpenAI

    IBL News | New York

    Anthropic launched a specialized version of Claude tailored for higher education institutions this week, in answer to OpenAI’s ChatGPT Edu plan.

    The Claude for Education initiative seeks to equip universities with AI-enabled approaches to teaching, learning, and administration.

    With this program, Anthropic is trying to boost its revenue in the university space, where it competes with OpenAI. The company already reportedly brings in $115 million a month.

    Claude for Education includes what Anthropic calls a Learning mode. This mode is based on guiding students’ reasoning process rather than providing answers, helping them develop critical thinking skills. This feature works within Projects and saved conversations, where students can organize their work around specific assignments or topics.
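
    Anthropic has not published the mechanics behind Learning mode, but its Socratic behavior can be loosely approximated with the public Anthropic Messages API, as in the sketch below. The system prompt and model name are assumptions for illustration, not the product's actual configuration.

        # Rough approximation of a "Learning mode"-style tutor using Anthropic's
        # public Messages API. The system prompt and model name are assumptions;
        # the real Claude for Education feature is a product setting, not this code.
        import anthropic

        client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

        socratic_system_prompt = (
            "You are a tutor. Do not give final answers. Ask guiding questions, "
            "point to relevant concepts, and let the student reason through each "
            "step on their own."
        )

        response = client.messages.create(
            model="claude-3-7-sonnet-latest",  # assumed model name
            max_tokens=500,
            system=socratic_system_prompt,
            messages=[
                {"role": "user",
                 "content": "How do I find the derivative of x**2 * sin(x)?"}
            ],
        )
        print(response.content[0].text)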

    The solution will be embedded into the Canvas LMS and extended with a program called Claude Campus Ambassadors, which will offer API credits to students who build projects.

    Anthropic said it collaborates with Northeastern University, the London School of Economics and Political Science (LSE), and Champlain College.

    The AI start-up summarized its offer in these terms:

    • “Students can draft literature reviews with proper citations, work through calculus problems with step-by-step guidance, and get feedback on thesis statements before final submission.
    • Faculty can create rubrics aligned to specific learning outcomes, provide individualized feedback on student essays efficiently, and generate chemistry equations with varying difficulty levels.
    • Administrative staff can analyze enrollment trends across departments, automate repetitive email responses to common inquiries, and convert dense policy documents into accessible FAQ formats—all from a familiar chat interface with enterprise-grade security and privacy controls.” 
  • OpenAI Launches Its Academy as a Resource Hub

    IBL News | New York

    OpenAI launched its academy this week to help people acquire AI literacy and unlock economic opportunity. The academy encourages learners to use the company’s tools and resources.

    It’s a free online platform with a community-first approach, designed for students working toward a meaningful career, teachers reimagining their classrooms, and professionals exploring their next move.

    “It’s a central hub for learning how to use AI. From ChatGPT on Campus and ChatGPT at Work to Sora tutorials, Build Hours, and monthly webinars—plus resources from global partners including leading universities—this is practical, real-world learning in one place,”  explained Siya Raj Purohit, manager at OpenAI, in a post on LinkedIn.

    She added, “OpenAI Academy is our answer to the question, ‘How can I use AI to get ahead?’”

    OpenAI said the educational initiative will offer a mix of online and in-person events, including workshops, discussions, and other digital content ranging from AI basics to advanced integration for engineers and developers.

    The Academy began as a series of in-person programs focused on developers and technical users.

    This next phase will broaden it to educators, students, job seekers, nonprofit leaders, and small business owners.

    It will start with educational materials created by Common Sense Media. Later, it will include in-person AI literacy workshops hosted in collaboration with institutions like Georgia Tech and Miami Dade College; workforce organizations like CareerVillage, Goodwill, and Talent Ready Utah; and mission-driven nonprofits such as Common Sense Media, OATS from AARP, and the Fund for the City of New York.

    Goodwill Keystone in Pennsylvania—an early adopter of AI in the nonprofit sector—will co-develop a hands-on AI literacy workshop, training employment specialists to use ChatGPT to support job-seekers with resume feedback, mock interviews, and career guidance.

  • Salesforce Launched ‘Agentforce 2dx’, Letting AI Agents Run Autonomously Across Systems

    IBL News | New York

    Salesforce will release Agentforce 2dx, a significant update to its digital labor platform, in April. The platform uses autonomous AI agents to work across enterprise data systems and user interfaces without constant human supervision.

    Agentforce 2dx has a new set of low-code and pro-code tools for Salesforce developers, paired with advanced analytics to help teams monitor, debug, and optimize agent performance with real-time data and guidance.

    To experiment with these tools, Salesforce now offers the Agentforce Developer Edition, a free environment where developers can prototype agents using Agentforce and also explore the capabilities of Data Cloud, Salesforce’s data engine.

    Salesforce also announced AgentExchange, a marketplace and community built into Salesforce, with 200 initial partners.

    Salesforce said customers like The Adecco Group, Engine, OpenTable, Oregon Humane Society, Precina, and Vivint have already adopted Agentforce.

    AgentExchange includes a library of ready-to-use templates and actions, and it also enables partners to list their components for sale.

    “Unlike traditional AI chatbots, which require manual prompts or rigid programming, agentic AI dynamically responds to live data and evolving business needs by embedding AI seamlessly into apps, workflows, and processes,” said Adam Evans, EVP and GM of Salesforce’s AI Platform.

    For example, with agentic reasoning, customers can trigger an agent when an ERP order is created or kick off a loan application process automation.

    Dion Hinchcliffe, VP and Practice Lead, CIO Insights, Futurum Group said, “89% of CIOs identify AI and automation as critical to their digital strategy in 2025, yet 60% of AI projects fail to deliver clear ROI. Agentforce bridges this gap with pre-built workflows that deliver measurable impact fast. While DIY agent-based AI projects can take up to a year to implement, Agentforce customers go live in just 4-6 weeks, realizing value 3-4 times faster. As AI agents become embedded across business operations, from engineering to go-to-market, they will fundamentally reshape how teams engage with customers and partners. Instead of siloed interactions across CRM, support platforms, and marketing tools, AI agents can coordinate responses, escalate issues, and optimize workflows in real-time. Companies that harness this shift — especially in their partner ecosystems — will gain a competitive edge in both performance and market reach.”

  • Harvard’s $255.6M in Contracts and $8.7B in Multi-Year Grants Under the Federal Microscope

    IBL News | New York

    The Trump Administration announced yesterday it will undertake a “comprehensive review of federal contracts and grants” at Harvard University and its affiliates to combat anti-semitism and purge pro-Palestinian voices.

    Harvard is the latest Ivy League institution to be targeted by President Trump after Columbia University agreed to comply with demands.

    The Departments of Education (ED), Health and Human Services (HHS), and the U.S. General Services Administration (GSA) will review, through the Joint Task Force to Combat Anti-Semitism, the more than $255.6 million in contracts between Harvard University, its affiliates, and the Federal Government.

    The review will also include the more than $8.7 billion in multi-year grant commitments to Harvard University and its affiliates “to ensure the university’s compliance with federal regulations, including its civil rights responsibilities.”

    “Today’s actions by the Task Force follow a similar ongoing review of Columbia University,” said Secretary of Education Linda McMahon.

    That review led to Columbia’s agreement to comply with nine preconditions for further negotiations regarding the return of canceled federal funds.

    Ivy League universities, and Columbia in particular, were an epicenter of pro-Palestinian demonstrations in the U.S. after Israel launched a war against Gaza in October 2023.

    Similar protests followed around the country, along with student encampments on Columbia’s lawn in April and May 2024, as campus activists criticized the school’s ties to Israel and called for an end to the war in Gaza.