Nvidia Is Not Inflating a Bubble: It's Pricing the New Digital Work
The debate around a potential "bubble" in AI spending often stems from an age-old intuition: if too many companies buy the same promise at once, a correction comes swiftly. Nvidia’s results for the fourth quarter of its 2026 fiscal year, however, force us to refine that intuition, not out of technological enthusiasm but through simple arithmetic.
Nvidia reported record quarterly revenues of $68.1 billion (period ending January 25, 2026), a 20% increase from the previous quarter and 73% year-on-year, surpassing the consensus of $66.21 billion. More importantly, its guidance for the first quarter of fiscal year 2027 was $78.0 billion ±2%, also above expectations ($72.6 billion). In post-earnings remarks, CEO Jensen Huang stated that markets are “mistaken” in fearing that AI will displace established software firms like ServiceNow, presenting agents as a layer that enhances business workflows rather than erases them.
This distinction is not merely semantic; it’s a map of power: who captures value when productivity shifts from being solely dependent on people to relying on computational capacity, data, and tools that execute work on behalf of someone.
A Quarter That Doesn't Fit the Cooling Script
If AI spending were entering a saturation phase, one would expect to see typical signs: a slowdown in the core segment, margin compression, or cautious guidance to temper expectations. Instead, Nvidia exhibited the opposite.
The engine was overwhelmingly Data Center, with $62.3 billion in the quarter, a 75% increase year-on-year. Additionally, the company reported a GAAP gross margin of 75.0%, up 1.6 points quarter-on-quarter and 2.0 points year-on-year. This detail challenges the narrative of swift "commoditization": in a market that becomes a commodity, margins tend to shrink, not expand.
On a GAAP basis, Nvidia reported diluted EPS of $1.76 and net income of approximately $43 billion, 35% higher than the previous quarter and 94% year-on-year. For the full fiscal year 2026, revenues reached $215.938 billion, 65% above fiscal year 2025, with Data Center closing the year at $197.3 billion, up from $115.2 billion the previous year.
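Since the article's claim rests on "simple arithmetic," the reported growth rates can be sanity-checked directly. The sketch below backs out the prior-period values implied by each figure; the derived numbers are illustrative arithmetic, not reported results.

```python
# Back-of-the-envelope check of the growth rates reported for Q4 FY2026.
# Dollar figures ($B) come from the results cited above; the implied
# prior-period values are derived here, not reported by the company.

def implied_prior(current: float, growth_pct: float) -> float:
    """Back out the prior-period value implied by a reported growth rate."""
    return current / (1 + growth_pct / 100)

q4_revenue = 68.1                                    # quarter ending Jan 25, 2026
prev_quarter = implied_prior(q4_revenue, 20)         # implied Q3 FY2026, ~56.8
year_ago_quarter = implied_prior(q4_revenue, 73)     # implied Q4 FY2025, ~39.4

fy2026_revenue = 215.938
implied_fy2025 = implied_prior(fy2026_revenue, 65)   # implied FY2025, ~130.9

print(f"Implied Q3 FY2026 revenue: ${prev_quarter:.1f}B")
print(f"Implied Q4 FY2025 revenue: ${year_ago_quarter:.1f}B")
print(f"Implied FY2025 revenue:    ${implied_fy2025:.1f}B")
```

The implied figures are internally consistent with the reported 20% sequential and 73% annual growth, which is the article's point: the quarter is arithmetic, not narrative.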
When a company reaches this scale and still accelerates, the point is no longer just “strong demand.” The point is the type of demand: it’s not exploratory purchasing for pilots; it’s capacity acquired for operation. The market may correct valuations, but here’s a structural fact: AI infrastructure is moving from experimentation to production line.
It’s also worth noting the “smaller” segments because they reveal diffusion: Gaming posted $3.7 billion (up 47% year-on-year, though down 13% sequentially), with a record year of $16.0 billion; and Professional Visualization rose to $1.3 billion, up 159% year-on-year. This indicates that demand is not limited to training models at hyperscalers; the inference, visualization, and workflow layers are also beginning to absorb budget.
“AI Doesn’t Replace ServiceNow”: The Value Shift is in Workflow, Not Chips
Huang's remark to CNBC, cited by InvestingLive, is a strategic intervention: “markets are wrong” to fear that AI will destroy incumbent enterprise software firms like ServiceNow. His thesis is that agents “finish the work” using tools, then return information “in a way we can understand.” That act of returning work in an understandable form is, in fact, the core of corporate value.
An organization does not pay for AI to simply “generate text.” It pays to reduce cycles: tickets that are solved, approvals that advance, incidents that get closed, reports that are consolidated, compliance that is verified. Within this framework, software like ServiceNow is not a dinosaur; it is the nervous system where work is recorded, audited, and governed. AI, when adopted judiciously, becomes muscle.
Here’s the power shift many underestimate: AI does not automatically eliminate platforms; it redefines the price of digital work within them. If an agent can execute a chain of tasks (querying, sorting, drafting, logging, escalating), then “work” becomes a computable unit. And when work is computable, budget discussions move from “licenses per user” to “capacity per result,” with performance metrics and traceability.
From its position, Nvidia captures the value of this transition because the current bottleneck is infrastructure: GPUs, memory, interconnects, and a stack that can meet that demand. That’s why the market may be discussing a “bubble,” but Nvidia operates as though it's collecting tolls on a newly opened highway.
The risk for companies is not that AI will replace their software; it’s that they use agents as a shortcut to automate broken processes. Automation without discernment only accelerates error. And when an error travels at computing speed, the reputational and operational cost multiplies.
The “75% Margin” is a Sign of Functional Monopoly, but Not Eternal
A GAAP gross margin of 75% in a hardware company of this scale suggests pricing power and demand that finds no immediate substitutes. That is a functional monopoly: not necessarily legal or permanent, but very real in everyday purchasing practice.
However, the briefing itself acknowledges rising competitive pressures: custom silicon in hyperscalers like AWS, Google Cloud, and Microsoft Azure. This pressure does not need to knock Nvidia down tomorrow; it can do something subtler: push the market toward segmentation. At one extreme, “premium” infrastructure for critical workloads and frontier models. At the other, “good enough” infrastructure for inference and less critical agents, where the buyer optimizes total cost.
The guidance of $78 billion for the next quarter suggests that, for now, the premium remains intact. But for management teams, the useful message is not to bet on the premium lasting forever. It is to design a financial and operational architecture that does not depend on a single vendor or a single pricing curve.
An additional point: Nvidia returned $41.1 billion to shareholders in fiscal year 2026. That figure, in the middle of a market-wide capital expenditure expansion cycle, reveals confidence in its cash generation and, at the same time, capital discipline. For CFOs, this is a signal: the “boom” is not forcing Nvidia to sacrifice returns to sustain growth. When that is the case, the provider becomes even more influential in the value chain.
Concurrently, the mentioned product lines (DLSS 4.5, RTX PRO 5000 72GB Blackwell, expansion of DGX Spark) confirm that the company is pushing AI into more usage contexts. It’s not merely about selling more units; it’s about expanding the perimeter of dependency on the stack.
The C-Level Opportunity: Moving from Blind Automation to Operable Augmented Intelligence
An executive who looks at these results and only concludes “we need to buy more AI” is reading the news as a gadget, not as business infrastructure. The strategic reading is different: AI is redefining how value is produced, and that demands governance.
First, it’s advisable to separate two purchases that many companies confuse: buying “capabilities” and buying “results.” A capability is computing power, models, integrations. A result is reduced cycle time, improved quality, fewer incidents, greater compliance. Nvidia captures the capability; workflow platforms capture the result; and the user company only captures value if it translates both into operation.
Second, agents make a conversation about traceability inevitable. If an agent “finishes the work,” it can also finish it poorly. Hence, the real value is not in the agent acting, but in leaving a trace: what tool it used, what data it touched, what policy it applied, what escalation it made. That traceability is the bridge between productivity and risk.
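What such a trace could look like can be sketched concretely. The record below is a minimal illustration assuming a hypothetical in-house agent runtime; every field name is invented for the example, not taken from any vendor's schema.

```python
# Minimal sketch of an agent-action audit record: what tool was used,
# what data was touched, what policy applied, and who it escalated to.
# All names and fields are illustrative assumptions, not a real product API.
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
from typing import Optional

@dataclass
class AgentActionRecord:
    agent_id: str
    tool: str                         # which tool the agent invoked
    data_touched: list                # datasets or records read/written
    policy_applied: str               # governance rule in force for the action
    escalated_to: Optional[str] = None  # human owner, when escalation occurred
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

def audit_trail(records):
    """Flatten records into dicts for export to a workflow system of record."""
    return [asdict(r) for r in records]

trail = audit_trail([
    AgentActionRecord("triage-bot", "ticket_search", ["INC-4821"], "pii-masking"),
    AgentActionRecord("triage-bot", "escalate", ["INC-4821"], "sev2-routing",
                      escalated_to="on-call"),
])
print(len(trail), trail[0]["tool"], trail[1]["escalated_to"])
```

The design choice worth noticing is that the trace is structured data, not free text: that is what lets a workflow platform audit, govern, and price agent activity.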
Third, this market is entering a phase where the marginal cost of digital work tends to decrease, but not uniformly. For some time, there will be abundance for those who can afford premium infrastructure and scarcity for those who cannot. The job of leadership is to prevent that gap from turning into internal inequality: “augmented” teams that advance and “analog” teams that get trapped in operational debt.
Finally, Huang's assertion about ServiceNow has an implication for portfolios: software incumbents with access to workflows and transactional data have a natural advantage to “wrap” agents with control. This reduces the risk of total disintermediation but increases the pressure to redesign business models. The price will no longer be per seat; it will be per execution.
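The seat-to-execution shift can be made tangible with a toy comparison. The numbers below are invented for illustration, not any vendor's actual rates.

```python
# Illustrative arithmetic only: per-seat licensing vs. per-execution pricing
# under assumed numbers. No figure here reflects a real price list.
seats, seat_price = 500, 100.0          # assumed seats and $/seat/month
executions, exec_price = 40_000, 0.90   # assumed agent runs/month and $/run

per_seat_cost = seats * seat_price               # monthly cost, seat model
per_exec_cost = executions * exec_price          # monthly cost, execution model
breakeven_runs = per_seat_cost / exec_price      # runs where both models match

print(f"Per-seat:      ${per_seat_cost:,.0f}/month")
print(f"Per-execution: ${per_exec_cost:,.0f}/month")
print(f"Break-even at  {breakeven_runs:,.0f} runs/month")
```

The interesting managerial consequence is the break-even line: below it, execution pricing undercuts seats; above it, the buyer starts paying for volume, which is exactly why metering and traceability become contractual issues.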
The Market Direction Is Already Reflected in the Numbers
Nvidia’s results do not deny that there is exuberance around AI. They deny that we are facing superficial adoption. When Data Center reaches $62.3 billion quarterly and the company guides for $78 billion for the next quarter, the phenomenon appears less like a speculative peak and more like an infrastructure change comparable to cloud standardization.
In terms of exponential dynamics, this market has already passed the phase where technology “seems like a toy” and enters an industrial rollout stage: the cost per unit of digital work begins to compress, hardware becomes a productive lever, and workflow software becomes the place where that power is governed.
The dominant phase today is Disruption advancing toward Demonetization of repetitive work, with an inevitable side effect of Democratization as access to agents and computing expands beyond large buyers. Technology must be structured to empower human judgment and broaden access to productive capacity, not to automate errors at scale.