NVIDIA Jetson Thor FAQs Everyone Must Know
- Marco Madrigal
- Aug 11
- 3 min read
Updated: Aug 12

What is NVIDIA Jetson Thor?
Jetson Thor is NVIDIA’s new supercharged embedded AI platform, delivering up to 2070 FP4 TFLOPS using the Blackwell GPU. It's designed for robotics, autonomous machines, and medical devices that need real-time and generative AI capabilities at the edge.
How does NVIDIA Jetson Thor compare to Jetson AGX Orin?
7.5× AI compute
3.5× energy efficiency
2.6× CPU performance
Up to 130 W of power (compared with 60 W for Jetson AGX Orin)
Hardware support for large language models (LLMs)
Jetson Thor brings massive leaps over Orin, with a newer GPU architecture, MIG support, and up to 128 GB of memory.
What’s new in Thor’s architecture?
Blackwell GPU with Transformer Engine
14-core Arm Neoverse-V3AE CPU
MIG (Multi-Instance GPU) for parallel workloads
4× 25 GbE for sensor fusion
JetPack 7 optimized stack with Linux 6.8 and CUDA 13
128GB of memory to enable LLM inference and complex models
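For a quick sanity check of this hardware from Python, a minimal sketch like the one below (assuming a CUDA-enabled PyTorch build installed on top of JetPack 7) reports the GPU name, memory, and compute capability:

```python
# Minimal sketch: query the Jetson Thor GPU from Python.
# Assumes a CUDA-enabled PyTorch wheel is installed on top of JetPack 7.
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"GPU:                {props.name}")
    print(f"Total memory:       {props.total_memory / 1024**3:.1f} GiB")
    print(f"Compute capability: {props.major}.{props.minor}")
    print(f"Multiprocessors:    {props.multi_processor_count}")
else:
    print("No CUDA device visible -- check drivers and the JetPack install.")
```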
How much does the Jetson Thor cost?
The NVIDIA Jetson AGX Thor costs $3,499.
Where can I buy the NVIDIA Jetson Thor?
You can purchase the NVIDIA Jetson Thor from Arrow.
What use cases is Thor built for?
Humanoid robotics
Autonomous mobile machines
Surgical robots and medical instruments
Vision AI agents for industrial inspection
Generative AI at the edge
Many other applications that require high compute capabilities at the edge!
Why is Jetson Thor ideal for humanoid robotics?
Thor supports real-time sensor fusion, large vision-language models (VLMs), LLMs, and robot control logic (VLAs). Its performance allows multi-agent cognition and perception onboard. Additionally, with the addition of the Holoscan Sensor Bridge, you get very low-latency video capture directly into memory.
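To give a rough feel for what "onboard cognition" looks like in code, the skeleton below sketches a single perception-to-action cycle. The helper functions are hypothetical placeholders for your sensor, VLM/VLA, and control stacks, not a specific NVIDIA API:

```python
# Illustrative skeleton of an onboard perception-to-action loop.
# capture_frames(), vlm_describe_scene(), vla_policy() and send_commands()
# are hypothetical placeholders for your sensor, VLM/VLA and control stacks.
import time

def capture_frames():
    """Grab the latest synchronized camera/IMU data (placeholder)."""
    return {"rgb": None, "imu": None}

def vlm_describe_scene(frames):
    """Run a vision-language model over the frames (placeholder)."""
    return "person handing over a red cup"

def vla_policy(scene, frames):
    """Map scene understanding plus raw observations to joint targets (placeholder)."""
    return [0.0] * 7

def send_commands(joint_targets):
    """Forward targets to the robot's low-level controller (placeholder)."""
    pass

while True:
    frames = capture_frames()
    scene = vlm_describe_scene(frames)
    action = vla_policy(scene, frames)
    send_commands(action)
    time.sleep(1 / 30)  # e.g. a 30 Hz control tick
```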
How does Thor help in automotive applications?
NVIDIA Jetson Thor enables:
Real-time sensor processing
Transformer-based perception
Multi-functional cockpit AI
Consolidation of ADAS, DMS, and IVI on one system
Thor integrates seamlessly with LLMs to operate as part of your automotive infotainment system.
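As a toy example of the infotainment angle, the sketch below runs a small instruction-tuned model locally through the Hugging Face transformers text-generation pipeline. The model name is only an illustration of the kind of compact LLM that fits an on-device latency and memory budget:

```python
# Minimal sketch: local in-cabin assistant prompt via Hugging Face transformers.
# The model name below is illustrative; pick any compact instruction-tuned LLM
# that fits the memory/latency budget of your cockpit application.
from transformers import pipeline

assistant = pipeline(
    "text-generation",
    model="Qwen/Qwen2.5-1.5B-Instruct",  # assumption: example model choice
    device_map="auto",
)

prompt = "The driver says: 'Find a charging station within 10 km and start navigation.'"
reply = assistant(prompt, max_new_tokens=64)[0]["generated_text"]
print(reply)
```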
How is NVIDIA Jetson Thor used in healthcare with Holoscan?
Low-latency ingestion with Holoscan Sensor Bridge
11 ms latency for 4× 4K video streams
Replaces multi-Orin setups with a single Thor module
Ideal for surgical robots, imaging, and medical edge AI
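For a feel of the programming model, here is a minimal sketch that follows the Holoscan SDK's Python video-replayer pattern. The data directory is a placeholder, and in a real deployment a live Holoscan Sensor Bridge source operator would replace the replayer:

```python
# Minimal sketch of a Holoscan Python application (replayer -> visualizer).
# Follows the SDK's video_replayer example pattern; the data path is a placeholder,
# and a live Holoscan Sensor Bridge source would replace VideoStreamReplayerOp.
from holoscan.core import Application
from holoscan.operators import HolovizOp, VideoStreamReplayerOp


class VideoApp(Application):
    def compose(self):
        source = VideoStreamReplayerOp(
            self,
            name="replayer",
            directory="/path/to/video_data",  # placeholder dataset location
            basename="surgical_feed",
            realtime=True,
        )
        viz = HolovizOp(self, name="holoviz")
        # Stream frames from the replayer into the on-screen visualizer.
        self.add_flow(source, viz, {("output", "receivers")})


if __name__ == "__main__":
    VideoApp().run()
```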
What does Thor enable for Vision AI systems?
4× video stream throughput
Up to 6× improvement in Vision Foundation Model inference
AI-powered alerting and video summarization with VSS + VLM
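As a simple sketch of VLM-driven alerting, the snippet below captions a frame with an image-to-text model via Hugging Face transformers and applies a trivial keyword rule. The model name and the keyword check are illustrative stand-ins for a production VSS + VLM pipeline:

```python
# Minimal sketch: describe a camera frame with a captioning / vision-language model
# via the Hugging Face "image-to-text" pipeline. The model name is illustrative.
from transformers import pipeline

captioner = pipeline(
    "image-to-text",
    model="Salesforce/blip-image-captioning-base",  # assumption: example model
)

caption = captioner("factory_frame.jpg")[0]["generated_text"]
print(caption)

# A simple keyword rule stands in for real alerting logic.
if any(word in caption.lower() for word in ("smoke", "spill", "person")):
    print("ALERT: operator attention required")
```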
How does it compare to NVIDIA DGX Spark?
Use DGX Spark for training and prototyping large models (up to 1–30 PFLOPS), and Jetson Thor for inference at the edge. They complement each other across the AI lifecycle.
What is JetPack 7 and why does it matter?
JetPack 7 is the official software stack for Jetson Thor. It features:
PREEMPT_RT real-time kernel
CUDA 13, TensorRT, Triton
Support for GenAI frameworks (vLLM, SGLang, Hugging Face)
Secure boot, memory encryption, and MIG management
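As a quick sketch of serving a model through one of those frameworks, vLLM's offline inference API looks roughly like this (the model name is illustrative, and a Jetson-compatible vLLM build is assumed):

```python
# Minimal sketch: offline LLM inference with vLLM on JetPack 7.
# Assumes a Jetson-compatible vLLM build; the model name is illustrative.
from vllm import LLM, SamplingParams

llm = LLM(model="meta-llama/Llama-3.1-8B-Instruct")  # assumption: example model
params = SamplingParams(temperature=0.7, max_tokens=128)

outputs = llm.generate(["Summarize the last 30 seconds of robot telemetry."], params)
for out in outputs:
    print(out.outputs[0].text)
```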
Where can I download JetPack 7?
Once the NVIDIA Jetson Thor is released, you will be able to download JetPack 7 from NVIDIA's Jetson Download Center.
What types of AI models can run on Thor?
LLM: LLaMA, Qwen, DeepSeek
VLM: Gemini, NVILA
VLA: GR00T, PI0
Vision Transformers, CNNs, and more
Where can I find the NVIDIA Jetson Thor Datasheet?
You can find the NVIDIA Jetson Thor datasheet here!
What’s the expected power envelope?
Jetson Thor runs at up to 130 W
Energy efficiency is 3.5× higher than Orin, thanks to FP4 compute
Can I attach cameras to Thor?
Yes. With the NVIDIA Dev Kit you can use the Holoscan Sensor Bridge for low-latency capture, and development kits from other manufacturers support MIPI cameras. Please note that the NVIDIA Dev Kit itself does not include support for MIPI cameras.
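For third-party carrier boards with MIPI sensors, capture typically goes through the Argus camera stack. The sketch below uses GStreamer's Python bindings with nvarguscamerasrc; element availability and sensor IDs depend on the board's BSP, so treat it as a starting point:

```python
# Minimal sketch: MIPI camera capture on a Jetson carrier board via GStreamer.
# Assumes the board's BSP ships the nvarguscamerasrc element (Argus camera stack).
import gi

gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

Gst.init(None)

pipeline = Gst.parse_launch(
    "nvarguscamerasrc sensor-id=0 ! "
    "video/x-raw(memory:NVMM),width=1920,height=1080,framerate=30/1 ! "
    "nvvidconv ! video/x-raw,format=I420 ! fakesink sync=false"
)
pipeline.set_state(Gst.State.PLAYING)

try:
    GLib.MainLoop().run()  # stream until interrupted (Ctrl+C)
except KeyboardInterrupt:
    pass
finally:
    pipeline.set_state(Gst.State.NULL)
```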
Is NVIDIA Jetson Thor form factor compatible with Orin?
No. Thor introduces new electrical and mechanical specs. Custom carrier boards and thermal solutions are required.
Can RidgeRun.ai help me adopt Thor?
Absolutely. RidgeRun.ai provides:
Model training, quantization, fine-tuning
Inference optimization (TensorRT, FP8, sparsity, MIG config)
VLM/VLA/LLM deployment for vision, robotics, medical, and industrial
Can RidgeRun Embedded assist with hardware/software integration?
Yes, RidgeRun Embedded brings:
Driver development for sensors (camera, LIDAR, CAN, etc.)
BSP customization for custom boards
GStreamer pipelines for video ingest
Secure boot, OTA, safety features
Integration with Isaac, Holoscan, Metropolis SDKs
Where can I get help evaluating or deploying NVIDIA Jetson Thor?
Contact RidgeRun.ai for AI strategy, model adaptation, and Jetson acceleration
Contact RidgeRun Embedded for integration, customization, and deployment
We’ll help you translate Jetson Thor’s power into a competitive product.
Do you want to know more about the NVIDIA Jetson Thor?
Getting started with the NVIDIA Jetson Thor? Read more at:
https://developer.ridgerun.com/wiki/index.php/NVIDIA_Jetson_Thor:_Powering_the_Future_of_Physical_AI
Also, you can find all the NVIDIA Jetson Thor documentation in NVIDIA's Jetson Download Center.
And stay tuned for our comprehensive user guides!
