2026 Nvidia GTC: More Explosive Content, Colder Stock Price


Why is Jensen Huang's AI vision leaving the market cold?

Last night, the China-U.S. talks ran past 11 p.m., and when I woke up this morning, Jensen Huang's "AI Spring Festival Gala" (his GTC keynote) was again making waves.

Jingjing joked that A-shares, especially the optics and communications sectors, didn't fall as hard during the trade tensions as they do during Huang's keynotes. Hilarious…

A one-sentence summary of NVIDIA’s GTC 2026 — more explosive content, cooler stock price.

Remember, every year Huang’s GTC is fuel for the capital markets:

  • 2024 ignited optical modules, HBM, and liquid cooling.
  • 2025's Blackwell sparked a compute arms race.
  • 2026's 2.5-hour speech still packed a huge information load, but for the first time the market was noticeably calmer.

Trillions of dollars in demand, 350x inference acceleration, tiered token pricing, Agents ending SaaS… any one of those phrases could be a headline on its own. Yet NVIDIA's stock rose as much as 4.31% intraday and closed up only 1.65%.

Optical modules, CPO chains, copper cables — no movement. In an environment dominated by geopolitical risks and macro uncertainties, funds are holding back.

Huang is still pricing the entire AI ecosystem. But unlike previous years — geopolitical risks are weighing heavily on high-expectation assets.

Huang’s depiction of the AI business world can be summarized in five sentences:

  1. In the future, companies will give employees not just annual salaries but also token budgets. AI Agents will help write reports, run data, make decisions — the tokens consumed will be the new productivity cost.
  2. Traditional software subscriptions (SaaS) will be replaced by Agent-as-a-Service (AaaS). Instead of buying a tool and operating it yourself, you’ll directly hire an AI Agent to do the work.
  3. Tokens will be tiered in pricing, like airline tickets. From free economy class (basic inference) to $150 per million tokens for first class (ultra-low latency real-time response), pay according to speed and quality.
  4. AI is no longer just chatbots; it’s learning to think. Next-generation reasoning models will verify and self-correct answers repeatedly, using more computing power for more reliable results — meaning token consumption will grow exponentially.
  5. AI in the physical world is coming. From robots to autonomous driving, AI will start understanding 3D space and physical laws, no longer limited to text and images — NVIDIA calls this physical AI, which is the next explosive demand for computing power.
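The token-budget and tiered-pricing ideas in points 1 and 3 come down to simple arithmetic. Here is a minimal sketch; the tier names and the economy/business prices below are illustrative assumptions, and only the $150-per-million-token "first class" figure comes from the keynote summary above:

```python
# Back-of-the-envelope token economics sketch.
# Tier prices are illustrative assumptions, except the $150/M
# "first class" figure quoted from the keynote.
TIERS_USD_PER_MILLION_TOKENS = {
    "economy": 0.0,     # free basic inference (assumed)
    "business": 15.0,   # mid-latency tier (assumed)
    "first": 150.0,     # ultra-low-latency, per the keynote
}

def monthly_token_cost(tokens: int, tier: str) -> float:
    """Cost in USD for a monthly token budget at a given service tier."""
    return tokens / 1_000_000 * TIERS_USD_PER_MILLION_TOKENS[tier]

# A hypothetical employee given a 50M-token monthly budget:
budget = 50_000_000
for tier in TIERS_USD_PER_MILLION_TOKENS:
    print(f"{tier}: ${monthly_token_cost(budget, tier):,.2f}/month")
```

At these assumed rates, the same workload costs $0 in "economy" and $7,500 a month in "first class", which is the point of airline-style pricing: pay for speed and quality, not just volume.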

Putting these five points together, Huang’s story is clear: AI is moving from technical demos to full-scale commercialization, and computing power is the common denominator across all stages. The narrative itself isn’t the problem; the market has already largely priced in this story.

Before the conference, the market's most anticipated theme was the CPO and optical-interconnect roadmap, which carried high expectations. But Huang gave a very "politically correct" answer: we need more copper cable capacity, more optical chip capacity, more CPO capacity, all of it. This can be read two ways: the technology is still early and he doesn't want to commit fully, or suppliers and competitors are watching closely and he has to stay diplomatic. For trading purposes, though, "all of it" means no explosive growth in the short term. The Feynman architecture still supporting both copper and CPO as expansion options further confirms this. The catalyst for optical modules is, once again, pushed out.

Three signals worth noting about NVIDIA’s growth:

  1. Huang said FY2027 growth is "very likely to exceed 40%," while Wall Street expects 30%. This is worth more than the trillion-dollar talk: it points directly to upward revisions in profit expectations. Valuation has been compressed to a relatively low range; if 40% growth is confirmed in subsequent earnings reports, current prices leave an expectation gap.
  2. Groq's integrated solution is more mature than expected, with the Dynamo software layer enabling disaggregated inference (separating the prefill and decode stages). NVIDIA is turning inference from a slogan into a deliverable, layered platform.
  3. The trend of Agent ending SaaS points to a multi-year corporate IT overhaul, with token consumption growth far beyond chatbot scenarios.

Before the conference, the biggest consensus among institutions was that short-term demand isn't worth debating; the real question is whether the AI capital-expenditure cycle can extend into 2027-2028. Huang provided a roadmap (Vera Rubin → Rubin Ultra → Feynman), a token-economics framework, and a mass-production timeline for Groq (Q3 shipments), but on firm customer order commitments and quantified infrastructure ROI, the evidence remains mostly qualitative. The overall reaction was "within expectations": the technical route is clear, but the commercial granularity isn't enough to fully reassure the market.

Huang remains the strongest storyteller in the AI ecosystem, but in the current environment, with geopolitical risk elevated and valuations demanding hard data, the muted market response to a polished 2.5-hour speech tells us a few things:

Over the past two years, trading has grown accustomed to betting on a single path, so optical communications and CPO carried high expectations going into the speech. Huang's statements actually paint a more rational picture: the "long-term vision" (which isn't really far away, it just doesn't fit a weekly or monthly trading horizon) is now clear, and AI demand plainly exists. The constraints sit on the supply side: memory, cooling, power, networking, raw material costs, hyperscaler capex…

An investor said something super insightful: “In my view, everywhere in the global supply chain are bottlenecks, but also opportunities.”

Translated: huge demand is coming, but building a new city means first renovating the old one. During renovation, old buildings, holdout "nail houses"… can slow progress. Anything that resolves these bottlenecks is an opportunity.

I believe today's pricing mostly reflects short-term geopolitical risk and positioning, which is understandable: just as in the initial AI craze, the market values rapid realization over grand narratives.

But from a long-term allocation perspective, the "brave new world" is moving from narrative to tangible ground: the romance is turning into a marriage.
