
At CES 2026, Jensen Huang made one thing unmistakably clear: AI is no longer just about models or software. It is becoming a full-stack systems story.
Opening NVIDIA’s keynote in Las Vegas, Huang described a shift of massive scale, arguing that a decade of general-purpose computing is now being rebuilt around accelerated computing and artificial intelligence. His message was not simply about faster chips. It was about an entirely new way of building intelligence, from infrastructure and models to robots, vehicles, and personal AI agents.
The centerpiece of that vision was Rubin, NVIDIA’s first extreme co-designed six-chip AI platform, now in full production. Huang positioned Rubin as the company’s next major leap after Blackwell, built from the data center outward with tightly integrated GPUs, CPUs, networking, DPUs, and AI-native storage. The goal is not just raw performance but removing the bottlenecks that slow training and inference at scale. Huang said Rubin could cut the cost of generating AI tokens to roughly one-tenth that of the previous platform, a shift that could make large-scale deployment far more practical.
But infrastructure was only one part of the story. Huang also emphasized NVIDIA’s expanding portfolio of open models, trained on its own supercomputers and released across key domains. These include Clara for healthcare, Earth-2 for climate science, Nemotron for reasoning and multimodal AI, Cosmos for robotics and simulation, GR00T for embodied intelligence, and Alpamayo for autonomous driving. His argument was that NVIDIA is not only building the hardware layer of AI, but also helping create an open global intelligence ecosystem that developers, enterprises, and even countries can build on.
That vision stretched well beyond the data center. Huang showed how AI is becoming personal, demonstrating a local AI agent running on the DGX Spark desktop supercomputer and embodied through a small robot using Hugging Face models. What once felt futuristic was presented as something increasingly normal: AI agents that can run locally, respond in real time, and interact with the physical world.
Physical AI was one of the keynote’s strongest themes. Huang highlighted Cosmos world foundation models for robotics and simulation, capable of generating realistic video, modeling edge cases, and running interactive closed-loop simulations. He then introduced Alpamayo, an open portfolio of reasoning vision-language-action models, datasets, and simulation blueprints aimed at Level 4 autonomy. That roadmap is already heading into the real world through the new Mercedes-Benz CLA, which will bring AI-defined driving to U.S. roads this year on NVIDIA DRIVE.
Robotics, too, took center stage, with Huang pointing to Isaac Sim, Isaac Lab, and a growing partner ecosystem that includes Siemens, Boston Dynamics, Synopsys, and Cadence. His larger point was simple: future factories, vehicles, robots, and intelligent systems will not be built from isolated components. They will be built as integrated systems.
That is the real blueprint Huang laid out at CES 2026. AI is moving beyond software. It is becoming infrastructure, autonomy, simulation, industry, and everyday interaction—all at once.

Trifleck is a premier app development company and technology consulting firm. We deliver custom app development, enterprise software solutions, and digital strategies that generate real ROI for B2B clients.
