TOKYO — In his keynote address at Open Source Summit Japan on Monday, Linux Foundation Executive Director Jim Zemlin argued that “Artificial intelligence may not be in a full-blown bubble, but large language models [LLMs] just might be.”
Why? Zemlin started by pointing to the staggering investment numbers that have dominated headlines. Morgan Stanley, he noted, estimates that $3 trillion will be spent on AI data centers between now and 2028, with hyperscalers such as Amazon, Google, Meta and Microsoft accounting for about half of that total.
“That is more investment than the GDP of many small countries,” Zemlin told the crowd, emphasizing that most businesses, and even most nations, cannot meaningfully compete in such capital-intensive infrastructure buildouts.
More important, he said, is the energy demand tied to AI’s accelerating inference workloads. He cited Google’s 50-times year-over-year spike in inference volume, specifically AI usage across Google, which climbed from 9.7 trillion tokens in April 2024 to more than 480 trillion tokens in April 2025.
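The 50-times figure follows directly from the two token counts Zemlin cited; a quick back-of-the-envelope check:

```python
# Google inference-volume figures cited in the keynote
tokens_apr_2024 = 9.7e12   # 9.7 trillion tokens (April 2024)
tokens_apr_2025 = 480e12   # 480+ trillion tokens (April 2025)

growth = tokens_apr_2025 / tokens_apr_2024
print(f"Year-over-year growth: {growth:.1f}x")  # roughly 49.5x, i.e. ~50x
```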
He also echoed Amazon CEO Andy Jassy’s belief that the single biggest constraint on AI growth today is power. Zemlin argued that the AI boom is fundamentally a story about physical infrastructure (GPUs, power and data centers), not just algorithms, models and software.
Yet despite this hardware-heavy environment, Zemlin said the real leverage for open source lies elsewhere: in the model and software infrastructure layers.
In the past year alone, open-weight models emerging from China, such as DeepSeek, have closed the performance gap with commercial frontier models. Zemlin added, “We’re also seeing those open-weight models being used to distill smaller industry-specific models.” As examples, he pointed to TinyLlama for Llama 3 and DistilBERT for BERT.
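Distillation of the kind Zemlin describes typically follows the standard objective from Hinton et al.: a smaller student model is trained to match both the ground-truth labels and the teacher’s softened output distribution. As a sketch (the symbols are the conventional ones, not from the talk):

```latex
% y: hard labels; z_s, z_t: student and teacher logits; sigma: softmax;
% T: softmax temperature; alpha: mixing weight between the two losses
\mathcal{L} = \alpha \, \mathrm{CE}\!\left(y, \sigma(z_s)\right)
            + (1-\alpha) \, T^{2} \, \mathrm{KL}\!\left(\sigma(z_t/T) \,\big\|\, \sigma(z_s/T)\right)
```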
The Economics of AI
This combination of open-weight models and distillation techniques has changed the economics of the AI sector. According to Zemlin, “Open source has mostly caught up with the frontier models, the proprietary models, in the U.S. Open-weight models are generally three to six months behind.”
That’s more than good enough for economical AI work. Zemlin quoted the Linux Foundation’s chief economist, Frank Nagle, who recently quantified that mismatch. According to Zemlin, Nagle’s study shows that though open models are dramatically cheaper and almost as capable, closed models still capture 95% of revenue, leaving an estimated $24.8 billion in annual overspending on proprietary systems.
Hence, “I think we’re not in an AI bubble,” Zemlin said. “But we could be in an LLM bubble.”
As enterprises begin prioritizing efficient, affordable deployments, he predicted 2026 will mark “an era of performance and efficiency” dominated by open ecosystems.
Is PARK the New LAMP Stack?
Zemlin also highlighted the rise of what he calls the PARK stack: PyTorch, AI, Ray and Kubernetes. (Ray is an open source distributed computing framework for simplifying the scaling of AI and machine learning [ML] workloads.) He believes this is the AI stack that will define tomorrow’s platforms, just as the LAMP stack defined the early web era. Already, he claimed, PARK is fast becoming the default platform for AI deployment at scale.
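Ray’s core idea is turning ordinary Python functions into tasks that fan out across a cluster. As a rough standard-library-only analogy to that task model (this is not Ray’s API, just a local sketch of the parallel-map pattern Ray generalizes to many machines):

```python
from concurrent.futures import ThreadPoolExecutor

def score_batch(batch_id: int) -> int:
    # Stand-in for an ML workload, e.g. scoring one shard of a dataset
    return sum(i * i for i in range(batch_id * 10))

# Ray would ship these tasks across a cluster; here we just fan out locally
with ThreadPoolExecutor() as pool:
    results = list(pool.map(score_batch, range(4)))
print(results)  # [0, 285, 2470, 8555]
```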
He compared this moment to the evolution of the Linux kernel, where collective pressure from a global developer community repeatedly drove efficiency gains across diverse hardware. In AI, open source tools like vLLM and DeepSpeed are now squeezing more performance out of GPUs, cutting power usage and reducing cost per token.
“This is what open source is really good at,” Zemlin said. “Improving price-per-token and price-per-kilowatt.” It’s also where open source software helps trim the ever-growing power price tag of AI hardware infrastructure.
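The price-per-token metric Zemlin mentions is simple to compute: divide GPU cost per hour by tokens served per hour. A sketch with hypothetical numbers (the $2/hour rate and throughputs below are illustrative, not from the talk), showing how an inference optimization that raises throughput directly lowers the price:

```python
def price_per_million_tokens(gpu_cost_per_hour: float, tokens_per_second: float) -> float:
    """Serving cost in dollars per one million tokens."""
    tokens_per_hour = tokens_per_second * 3600
    return gpu_cost_per_hour / tokens_per_hour * 1_000_000

# Hypothetical: a $2/hour GPU serving 1,000 tokens/s ...
baseline = price_per_million_tokens(2.0, 1_000)
# ... versus the same GPU at 2,500 tokens/s after optimization
optimized = price_per_million_tokens(2.0, 2_500)
print(f"${baseline:.2f} -> ${optimized:.2f} per million tokens")
```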
Zemlin then turned to the emerging “agentic” layer of AI, that is, systems that plan, reason and act autonomously. He described a stack still in its adolescence but rapidly formalizing around open protocols, including early deployments of Model Context Protocol (MCP) and Agent2Agent (A2A) servers.
While only a minority of organizations are using MCP in production today, Zemlin suggested that 2026 will usher in a wave of real enterprise automation: multiagent workflows, learned orchestration, validation frameworks and new blends of deterministic and nondeterministic systems.
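MCP itself is built on JSON-RPC 2.0: a client asks a server which tools it exposes, then invokes one by name. A minimal illustrative request (the tool name and arguments here are hypothetical, not from any real server):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "query_orders",
    "arguments": { "customer_id": "C-1042" }
  }
}
```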
“Agentic AI doesn’t need to be determined by model size,” he stressed. “It’s about how you architect the solution.”
‘AI Hasn’t Changed All That Much Yet’
Zemlin closed his keynote by emphasizing that despite the hype, “AI hasn’t changed all that much yet.” What will change it, he argued, is open collaboration.
Open source, he said, prevents vendor lock-in, improves trust and transparency, and provides “universal connectors” for the coming era of interoperable AI systems. From training to inference to orchestration, he said, the Linux Foundation aims to serve as a central hub for that work alongside global research labs and industry partners.
“We’re really, really excited to be a small part of this world,” he said, promising major announcements still to come.