Need unique liquid splatter effects? Simulate them in real-time with LiquiGen & generate variations in seconds, including render passes, 3D mesh and particle exports. This particular setup takes about 5 seconds per variant on an RTX 4090 (so hundreds could be created in less than an hour).
This blows my mind 🤯 Three years ago AI could only make terrible 512x512 pixelated images 😆, and now I can do all of this on my own PC with a single RTX 6000 Pro GPU:
- Train a Wan 2.1 LoRA of a car, in this case the 2025 BMW M5, based on just 10 images.
- Generate a 10s consistent video of the car with Wan 2.2
- Upscale and enhance the video (Creative or regular upscales)
- Create a custom ComfyUI workflow that tracks and replaces a part of the video
In the video example below you can see the original generated video, the masked areas, and the license plate being replaced with a reference image.
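The actual track-and-replace step lives in a node-based ComfyUI workflow, but the core idea can be sketched in plain Python. Everything here is illustrative: frames, masks, and the reference patch are simple 2D lists of pixel values, and `replace_region` is a hypothetical helper name, not part of ComfyUI.

```python
# Toy sketch of the per-frame mask-and-replace step (assumed logic; the
# real workflow tracks the plate and composites a reference image into
# the masked region of every frame).

def replace_region(frame, mask, reference):
    """Composite `reference` over `frame` wherever `mask` is 1."""
    return [
        [ref if m else px for px, m, ref in zip(frow, mrow, rrow)]
        for frow, mrow, rrow in zip(frame, mask, reference)
    ]

# One tiny frame: the mask marks where the tracked plate sits.
frame = [[1, 2],
         [3, 4]]
mask = [[0, 1],
        [1, 0]]
reference = [[9, 9],
             [9, 9]]

composited = replace_region(frame, mask, reference)
```

In the real pipeline a tracker updates `mask` per frame so the replacement follows the plate through the shot.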
NVIDIA Jetson Orin NX with CTI Boson carrier board setup
This guide details setting up an NVIDIA Jetson Orin NX module on a CTI Boson carrier board (NGX007) using JetPack 6.2.1 (L4T R36.4.4) and the CTI BSP. It walks step by step through downloading the necessary software, flashing the board, and configuring it to enable two FRAMOS IMX464 cameras, so users can get their system operational and verify camera functionality.
Read more here: https://lnkd.in/ded9srqT
Using NVIDIA’s Cosmos-Transfer1-DiffusionRenderer to relight live-action footage with HDRI.
It produces an impressive albedo pass, with shadows accurately cast from the HDRI light source.
I customized it to process HDRI sequences, allowing calculation with arbitrary motion and position.
However, the resolution is limited to 1280x720 and only up to 57 frames can be processed at once.
I used the officially released overlapping-window code to fake longer sequences.
The image quality still degrades, so it isn't suitable for direct use, but I tried blending it slightly into the opening of the first cut this time.
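The overlapping trick for getting past the 57-frame cap can be sketched as follows. This is an assumed reconstruction, not the official code: the window size of 57 comes from the post, the overlap of 8 frames is a made-up illustration value, and frames are plain floats standing in for image arrays.

```python
# Sketch of overlapping-window processing for a model capped at 57
# frames per pass. Chunks are processed independently, then linearly
# crossfaded back together across each overlap to hide the seams.

def chunk_indices(n_frames, window=57, overlap=8):
    """Yield (start, end) windows covering n_frames with the given overlap."""
    step = window - overlap
    start = 0
    while start < n_frames:
        end = min(start + window, n_frames)
        yield start, end
        if end == n_frames:
            break
        start += step

def blend_chunks(chunks, n_frames, window=57, overlap=8):
    """Crossfade processed chunks into one sequence of n_frames values."""
    out = [0.0] * n_frames
    weight = [0.0] * n_frames
    for (start, end), chunk in zip(chunk_indices(n_frames, window, overlap), chunks):
        for i, frame in enumerate(chunk):
            t = start + i
            # ramp the weight up across the leading overlap so adjacent
            # chunks fade into each other instead of cutting hard
            w = min(1.0, (i + 1) / overlap) if start > 0 else 1.0
            out[t] += frame * w
            weight[t] += w
    return [v / w for v, w in zip(out, weight)]

# "Process" 100 frames in overlapping windows (identity model here).
frames = 100
chunks = [[1.0] * (e - s) for s, e in chunk_indices(frames)]
result = blend_chunks(chunks, frames)
```

With an identity "model" the blended output reproduces the input exactly; with a real video model the crossfade only softens seams, which is consistent with the residual degradation mentioned above.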
News drop 📣
👤 NVIDIA ACE now supports Qwen3-8B for on-device deployment of real-time, dynamic NPCs. Available now as an SDK plugin.
💬 Magpie Flow text-to-speech received multilingual support for testing within end-to-end character pipelines.
🖼️ We just released a new tech demo for the RTX Branch of Unreal Engine 5 showing off RTX Mega Geometry, ReSTIR PT, and DLSS 4.
Details: https://nvda.ws/42OHvox
🚦 What if we could test AV safety faster, without driving endless real-world miles?
In our latest Safety in the Loop livestream, NVIDIA researchers shared how simulation can:
🚗 Accelerate safety validation for end-to-end AV stacks
📊 Reduce costly real-world tests
🧠 Improve confidence in performance through correlation modeling
🎥 Catch the replay → https://bit.ly/478TCPX