NVIDIA Robotics

Computer Hardware Manufacturing

Santa Clara, California 451,525 followers

Inspiring visionaries and developers to create the next gen of AI-driven robots and explore the world of physical AI.

About us

The NVIDIA Robotics platform accelerates the development of AI-driven robots, streamlining processes from design and simulation to deployment. It enables key functions like navigation, mobility, grasping, and vision, supporting robotics across industries such as manufacturing, agriculture, logistics, and healthcare.

Website
https://www.nvidia.com/en-us/industries/robotics/
Industry
Computer Hardware Manufacturing
Company size
10,001+ employees
Headquarters
Santa Clara, California

Updates

  • NVIDIA Robotics reposted this

    Earlier this year, I shared how physical AI is moving from concept to reality. At #NVIDIAGTC 2026, we’ll explore some of the key forces behind that shift: digital twins and real-time simulation. High-fidelity simulation is transforming how products are developed and manufactured, enabling faster iteration, more efficient production, and safer deployment of autonomous systems. But it doesn’t stop there! Digital twins are advancing critical fields like smart cities, climate science, and space exploration, helping us solve problems at planetary scale. In my special address, I’ll share how #OpenUSD and NVIDIA’s accelerated computing platform are powering this transformation, and why simulation is becoming central to innovation across industries: https://nvda.ws/4kSNUqp I’m also looking forward to joining leaders from ABB, Agile Robots SE, SK hynix, and Siemens for a panel discussion on how AI, robotics, and simulation are reshaping semiconductor, industrial, and advanced manufacturing systems in real-world deployment: https://nvda.ws/4cC4mcv Join me in person or tune in virtually!

  • Congrats to AWS on launching Strands Labs. 🥳 By leveraging the NVIDIA Isaac GR00T open VLA model and Jetson, Strands Robots provides a seamless sim-to-real path for developers to accelerate agentic physical AI.

    View profile for Matt Garman

    Today we’re launching Strands Labs, a new GitHub organization built to help developers experiment with state-of-the-art approaches to agentic AI. Since releasing the Strands SDK as open source last year, we’ve seen strong feedback from the community and more than 14M downloads. Builders are pushing the boundaries of what agents can do — and we want to support that momentum. Strands Labs creates a dedicated space for experimentation. By separating early-stage projects from the production SDK, we can move faster, share ideas openly, and give developers direct access to emerging techniques without waiting on a formal release cycle. At launch, projects explore physical AI systems, robotics simulation, and AI Functions designed to narrow the trust gap in LLM-generated code. For builders, this means earlier access to new approaches, clearer examples, and a chance to shape the direction of agentic development in the open. Excited to see what the community creates next. Get started today: https://lnkd.in/eBX7h5sB

  • What open models run best on Jetson at the edge? ⚙️ Hear from our experts Chen Su, Suhas Sheshadri, and Chitoku Yato as they walk through which open models are optimized to run on Jetson for real-world AI applications. Watch the replay 📹 https://nvda.ws/4rGxjIY Download OSS models and review the benchmarks 👉 https://nvda.ws/4u4QCgY

  • View organization page for NVIDIA Robotics

    When issues happen on the factory floor, data shows what — but not always why. 🏭 Join Tulip Interfaces and Terex Corporation at #NVIDIAGTC to see how they combine multi-camera video with machine and robotics data to give full visual context — helping teams follow SOPs, find root causes faster, and fix process issues automatically. Add to schedule 🔗 https://nvda.ws/4kYJ9Md 📅 Wednesday, March 18 | 11:00 a.m. PT Speakers 🎤: Rony Kubat, Ken McIntosh

  • Amazing to see NVIDIA Jetson Thor running Qwen3-Coder-Next-NVFP4 with Isaac ROS workflows. 🙌 A great showcase of what’s possible for open-source models on Jetson Thor. Check it out 👇

    My Jetson Thor just began programming itself! 🤖 I'm running Qwen3-Coder-Next-NVFP4 on NVIDIA Jetson Thor with a 64k context window, and still have room to spare to run #IsaacROS Manipulator workflow examples at the same time. Now I can ask Thor to explain the Isaac ROS 4.1 structure, navigate the codebase, have it modify and tweak workflow components, and help me iterate on robotics pipelines. Thanks to NVFP4 quantization, I can fit this 80B-total-parameter model (3B activated) with large context windows (tested up to 128k). Running something this capable fully locally on an embedded robotics platform is pretty incredible. Paired with Cline in VS Code, it's doing real, practical development work. If you want to try Qwen3-Coder-Next-NVFP4 on Thor, I documented the setup here: https://lnkd.in/d29MFt-i Developing with Jetson Thor is getting more exciting by the day.

  • Serve Robotics is redefining last-mile delivery on city sidewalks. 🤖 By simulating in NVIDIA Isaac Sim and running on NVIDIA Jetson Orin, Serve is deploying one of the largest fleets of physical AI-powered delivery robots, completing 100,000+ autonomous deliveries with a 99.8% completion rate across five major U.S. cities. See how physical AI and advanced simulation are reshaping urban delivery. 🔗 https://nvda.ws/4qTZRNZ

  • Excited to see Vention’s GRIIP bring their own proprietary models and NVIDIA robotics foundation models to production-ready physical AI in manufacturing—can’t wait to see it in action at #NVIDIAGTC. 🙌

    View organization page for Vention

    Introducing GRIIP™: a generalized Physical AI pipeline for manufacturing automation. Built on state-of-the-art foundation models from NVIDIA Robotics, GRIIP enables deployment of autonomous robot cells in highly unstructured manufacturing environments. The end-to-end pipeline covers scene digitalization, object segmentation, pose estimation, grasp selection, and collision-free motion planning. GRIIP delivers ultra-high generalization across part shapes, surface textures, colors, lighting conditions, and manufacturing environments. No training data required. Adaptive, robust performance out of the box. Read the full announcement: https://hubs.la/Q042tbF30 #Manufacturing #Automation #PhysicalAI #Robotics

    • GRIIP: Generalized Physical AI for Manufacturing
  • Reasoning in physical AI is rapidly accelerating GenAI adoption at the edge. Open-source models give developers the control and flexibility needed to build production-grade, use-case-specific systems. NVIDIA’s open models, including Nemotron, Cosmos, and Isaac GR00T, are not only truly open and customizable, they also deliver state-of-the-art accuracy and lead many industry benchmarks. In this livestream, we will walk through the portfolio of NVIDIA open models optimized for Jetson, highlight real-world success stories, and share key trends we’re seeing in GenAI deployments at the edge. During this session, we’ll cover:

    • NVIDIA open-source models tailored for edge and on-device workloads
    • Performance benchmarks for these models
    • Key Jetson resources and tutorials from NVIDIA
    • A sneak peek at the demo we’ll be showcasing at GTC 2026

    Reasoning at the Edge: NVIDIA Nemotron, Cosmos & Isaac GR00T on Jetson

