Nvidia Brings Blackwell To Jetson Edge Computers For GenAI-Powered Robots
‘The most important thing that we’re trying to do as an industry for a humanoid brain is to create a general-purpose brain and create sufficient reasoning capabilities,’ says Nvidia executive Deepu Talla of the opportunities the company sees with its new Jetson Thor modules.
Nvidia said Monday that its newly launched Blackwell-based Jetson AGX Thor edge compute modules can provide up to 7.5 times greater AI computing performance and as much as 3.5 times greater energy efficiency than its last-generation Jetson Orin products.
Designed to run generative and reasoning AI models for humanoid robots and other robot types, Jetson Thor comes with a Blackwell GPU and 128 GB of memory, allowing it to deliver up to 2,070 teraflops of 4-bit floating point (FP4) AI performance within a 130-watt power envelope, according to the AI infrastructure giant.
[Related: AI Chip Startups Seek An Edge By Enlisting The Channel]
The Santa Clara, Calif.-based company said early adopters include Amazon Robotics, Boston Dynamics, Caterpillar and Meta, among others. Other big names such as John Deere and OpenAI are evaluating Jetson Thor, it added.
“We’ve built Jetson Thor for the millions of developers working on robotic systems that interact with and increasingly shape the physical world,” said Jensen Huang, founder and CEO of Nvidia, in a statement. “With unmatched performance and energy efficiency, and the ability to run multiple generative AI models at the edge, Jetson Thor is the ultimate supercomputer to drive the age of physical AI and general robotics.”
To promote development, Nvidia is releasing the $3,499 Jetson AGX Thor developer kit, which is available now and powered by the top-performing Jetson T5000 compute module.
The company said the stand-alone Jetson T5000 module is also available now from distributors globally for $2,999 in volume purchases. Embedded partners are expected to release their own systems and carrier boards using the new modules.
In addition, Nvidia revealed the lower-performance Jetson T4000, a $1,999 module that delivers up to 1,200 teraflops of floating point performance in a 70-watt power envelope, though it won’t be available until the fourth quarter.
By bringing the Blackwell GPU architecture to its Jetson family of compute modules, Nvidia is introducing its Transformer Engine to the edge computing lineup for the first time. Designed to accelerate transformer models, including generative AI models, the Transformer Engine debuted in 2022 as part of Nvidia’s Hopper GPU architecture, which was never extended to the Jetson line.
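To illustrate what the Transformer Engine does in practice, below is a minimal sketch using Nvidia’s Transformer Engine Python package as it exists for data center GPUs: supported layers run their matrix math in 8-bit floating point inside an autocast context. Whether this exact package and workflow ship for Jetson Thor, and whether the FP4 format cited above is exposed the same way, are assumptions here rather than anything Nvidia confirmed in this announcement.

```python
# Minimal sketch of Nvidia's Transformer Engine Python API (data center
# version). Availability of this package on Jetson Thor is an assumption.
import torch
import transformer_engine.pytorch as te
from transformer_engine.common import recipe

# Delayed-scaling recipe using the E4M3 8-bit floating point format.
fp8_recipe = recipe.DelayedScaling(fp8_format=recipe.Format.E4M3)

# A single Transformer Engine linear layer; dimensions are multiples of 16,
# as FP8 matrix multiplies require.
layer = te.Linear(768, 768, bias=True).cuda()
x = torch.randn(16, 768, device="cuda")

# Inside the context manager, supported ops execute their GEMMs in FP8.
with te.fp8_autocast(enabled=True, fp8_recipe=fp8_recipe):
    y = layer(x)

print(y.shape)  # torch.Size([16, 768])
```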
Deepu Talla, vice president of robotics and edge AI, said this is the first time Nvidia is using the same GPU architecture for all of its product lines spanning servers, PCs and edge computers. The Jetson Orin products relied on Nvidia’s Ampere GPU architecture that debuted in 2020.
He told journalists in a briefing Friday that the new Jetson products are aimed at companies developing humanoid robots, among other kinds of robots.
“The most important thing that we’re trying to do as an industry for a humanoid brain is to create a general-purpose brain and create sufficient reasoning capabilities,” he said. “If you look at most of the humanoid companies, they’re spending up to 1,000 watts for the compute because we are on the quest to solve general-purpose autonomy.”
The new Jetson modules come with a 14-core Arm Neoverse CPU, meaning the CPU was built from a licensed core design in Arm’s Neoverse portfolio, which targets cloud, edge and 5G network applications. The new CPU is up to 3.1 times faster than the one in the Jetson Orin products, according to Nvidia.
The Jetson Thor modules also support four Ethernet ports that can each transmit data at 25 Gbps, or 100 Gbps in aggregate. That gives them as much as 10 times the I/O throughput of the last-generation Jetson Orin lineup.
While the new Jetson modules can draw up to 130 watts, they can be configured to run at as little as 40 watts, paving the way for lower-power products in the future.
“We’ll definitely look forward to address more segments,” Talla said.
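For readers curious how that configurable power envelope is typically managed, previous Jetson generations expose preset power modes through the nvpmodel utility on the device. The sketch below assumes Jetson Thor keeps that tool and its existing flags; the specific mode ID used is a placeholder, not a documented Thor setting.

```python
# Minimal sketch, assuming Jetson Thor retains the nvpmodel utility that
# earlier Jetson boards use to switch between preset power modes.
# Run on the device; nvpmodel requires root privileges.
import subprocess

# Query the currently active power mode.
result = subprocess.run(["sudo", "nvpmodel", "-q"], capture_output=True, text=True)
print(result.stdout)

# Switch to a lower-power preset. Mode ID 1 is a placeholder here; the
# actual IDs and the wattage each one maps to are defined per board in
# /etc/nvpmodel.conf.
subprocess.run(["sudo", "nvpmodel", "-m", "1"], check=True)
```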