ManiPylator focusing its laser pointer at a page.

Simulation And Motion Planning For 6DOF Robotic Arm

[Leo Goldstien] recently got in touch to let us know about a fascinating update he posted on the Hackaday.io page for ManiPylator — his 3D printed six degrees of freedom (6DOF) robotic arm.

This latest installment gives us a glimpse at what’s involved in command and control of such a device, as well as what goes into simulation and testing. Much of the requisite mathematics is introduced, along with a long list of links to further reading. The whole solution is built entirely on free and open source software (FOSS): a giant stack of it, in fact, with planning and simulation tools sitting on top of glue like MQTT message queues.
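For the unfamiliar, the MQTT part of that stack is just a lightweight publish/subscribe broker passing messages between the pieces. As a rough illustration only — the topic name, payload shape, and broker address below are ours, not taken from ManiPylator — publishing a joint command from Python with the paho-mqtt library might look something like this:

```python
# Hypothetical sketch of "MQTT as glue": publish a joint-angle command as JSON.
# Topic and payload are invented for illustration, not ManiPylator's own.
import json
import paho.mqtt.client as mqtt  # pip install paho-mqtt

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)  # paho-mqtt 2.x; drop the argument on 1.x
client.connect("localhost", 1883)   # assumes a broker running on the same machine
client.loop_start()

command = {"joints_deg": [0, -45, 90, 0, 45, 0], "speed": 0.2}
client.publish("manny/arm/command", json.dumps(command), qos=1)

client.loop_stop()
client.disconnect()
```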

The practical exercise for this installment was to have the arm trace out the shape of a heart, given as a mathematical equation expressed in Python code, and it fared quite well. Measurements were taken! Science was done!
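The exact equation [Leo] used isn’t spelled out here, but a commonly used parametric heart curve sampled into waypoints gives the general flavor of the exercise:

```python
# One well-known parametric heart curve (an assumption, not necessarily the
# equation the project used), sampled as 2D waypoints an arm could trace.
import numpy as np

t = np.linspace(0.0, 2.0 * np.pi, 200)
x = 16 * np.sin(t) ** 3
y = 13 * np.cos(t) - 5 * np.cos(2 * t) - 2 * np.cos(3 * t) - np.cos(4 * t)

# Scale to something arm-sized (roughly millimetres) and pair up as waypoints.
waypoints = np.column_stack((x, y)) * 2.0
print(waypoints[:5])
```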

We last brought you word about this project in October of 2024. Since then, the project name has changed from “ManiPilator” to “ManiPylator”. Originally the name was a reference to the Raspberry Pi, but now the focus is on the Python programming language. But all the bot’s best friends just call him “Manny”.

If you want to get started with your own 6DOF robotic arm, [Leo] has traced out a path for you to follow. We’d love to hear about what you come up with!

Continue reading “Simulation And Motion Planning For 6DOF Robotic Arm”

2025 Pet Hacks Contest: Keep The Prey At Bay With The Cat Valve

Some cats are what you might call indoor cats, happy to stretch out in the lap of indoor luxury and never bother themselves with the inclement outdoors again. Others however are fully in touch with their Inner Cat, and venture forth frequently in search of whatever prey they can find.

[Rkramer] has a cat of this nature, sadly one with a propensity for returning with live prey. To avoid this problem a solution is called for, and it comes in the shape of the Cat Valve, an automated cat door which enforces a buffer zone in their cellar to prevent unwanted gifts.

It’s a simple enough idea: when an IR sensor connected to a Raspberry Pi 4 detects the cat heading out into the world through the exterior cat flap, the computer fires up a motor connected to a lead screw, which closes the flap between the buffer zone and the house. The cat then has the safety of the buffer zone, but can’t bring its prey fully inside.
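In rough Python terms, the control loop boils down to something like the sketch below. To be clear, this is our own hedged version rather than [Rkramer]’s code, and the GPIO pins, gpiozero classes, and timings are all invented:

```python
# A hedged sketch of the Cat Valve logic: IR sensor trips, lead screw closes
# the inner flap, then reopens later. Pins and timings are assumptions.
from gpiozero import DigitalInputDevice, Motor
from time import sleep

ir_sensor = DigitalInputDevice(17)           # IR sensor watching the exterior flap
flap_motor = Motor(forward=23, backward=24)  # H-bridge driving the lead screw

TRAVEL_TIME = 4.0  # seconds for the lead screw to fully close or open the flap

def close_inner_flap():
    flap_motor.forward()
    sleep(TRAVEL_TIME)
    flap_motor.stop()

def open_inner_flap():
    flap_motor.backward()
    sleep(TRAVEL_TIME)
    flap_motor.stop()

while True:
    ir_sensor.wait_for_active()   # cat trips the sensor heading outside
    close_inner_flap()            # seal off the house, leaving only the buffer zone
    sleep(30 * 60)                # hold for a while before letting the cat back in
    open_inner_flap()
```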

If you’re a cat lover you’ll forgive them anything, but we have to admit to being on [Rkramer]’s side with this one. A useful way to keep the prey at bay is something we could have used a few times in the past, too. This project is part of the 2025 Pet Hacks contest. Done something similar for your cat? Why not make it an entry!

Fytó pet plant

2025 Pet Hacks Contest: Fytó – Turn Your Plant Into A Pet

This entry into the 2025 Pet Hacks Contest is about bringing some fun feedback to normally silent plants. Fytó integrates sensors and displays into a 3D printed planter. The sensors read the various environmental and soil conditions that the plant is experiencing, and give you feedback about them via a series of playful expressive faces that are displayed on the screen embedded in the planter.

At the core of the Fytó is a Raspberry Pi Zero 2 W, which has plenty of power to display the animations while being small enough to fit inside the planter without making it much bigger than a normal one. The sensors include a capacitive soil moisture sensor, a temperature sensor, and a light-dependent resistor. All three provide analog outputs, so an ADS1115 analog-to-digital converter board was included as well, since the Raspberry Pi doesn’t have the analog pins required to read them.
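If you wanted to roll the sensor side yourself, reading those channels through an ADS1115 with Adafruit’s CircuitPython libraries and then picking a face only takes a few lines of Python. This is our own rough sketch rather than the project’s code, and the channel assignments, thresholds, and face names are guesses:

```python
# Read three analog sensors through an ADS1115 on a Pi, then pick an expression.
# Channel wiring, thresholds, and the temperature formula are assumptions.
import board
import busio
import adafruit_ads1x15.ads1115 as ADS          # pip install adafruit-circuitpython-ads1x15
from adafruit_ads1x15.analog_in import AnalogIn

i2c = busio.I2C(board.SCL, board.SDA)
ads = ADS.ADS1115(i2c)

soil = AnalogIn(ads, ADS.P0)    # capacitive soil moisture probe
temp = AnalogIn(ads, ADS.P1)    # analog temperature sensor
light = AnalogIn(ads, ADS.P2)   # light-dependent resistor divider

def pick_face(soil_v, temp_c, light_v):
    """Hypothetical threshold mapping onto a handful of expressions."""
    if soil_v > 2.5:            # drier soil reads higher on many capacitive probes
        return "thirsty"
    if temp_c < 5:
        return "freezing"
    if light_v < 0.3:
        return "sleepy"
    return "happy"

# Crude conversion for an LM35-style sensor (10 mV per degree C); swap in the
# right formula for whatever temperature sensor is actually fitted.
temp_c = temp.voltage * 100.0
print(pick_face(soil.voltage, temp_c, light.voltage))
```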

The fun animated faces are shown on a 2-inch LCD display embedded in the planter, with a small acrylic cover placed in front of it to ease the transition from the printed planter to the internally mounted screen. The temperature and light sensors are placed in openings around the planter to ensure they get good environmental readings. There are six expressions the Fytó can display based on its sensor readings, ranging from happy when all the readings are in a good zone, to thirsty when it needs water, or freezing when it’s too cold. Be sure to check out the other entries in the 2025 Pet Hacks Contest.

Continue reading “2025 Pet Hacks Contest: Fytó – Turn Your Plant Into A Pet”

Work, Eat, Sleep, Repeat: Become A Human Tamagotchi

When [Terence Grover] set out to build a Tamagotchi-inspired simulator, he didn’t just add a few modern tweaks. He ditched the entire concept and rebuilt it from the ground up. Forget cute wide-eyed blobby animals and pixel-poop. This Raspberry Pi-powered project trades nostalgia for brutal realism: inflation, burnout, capitalism, and the occasional existential crisis. Think Sims meets cyberpunk, rendered charmingly in Python on a low-res RGB LED matrix.

Instead of hunger and poop meters, this dystopian pet juggles Maslow’s hierarchy: hunger, rest, safety, social life, esteem, and money. Players make real-life-inspired decisions like working, socialising, and going into education – each affecting the stats in logical (and often unfair) ways. No free lunch here: food requires money, money requires mind-numbing labour, and labour tanks your rest. You can even die of overwork à la Amazon warehouse. The UI and animation logic are all hand-coded, and there’s a working buzzer, pixel-perfect sprite movement, and even mini-games to simulate job repetition.
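Under the hood, that kind of stat juggling doesn’t need much code. Here’s a toy Python version with invented stats, actions, and numbers, meant only to illustrate the idea rather than mirror [Terence Grover]’s actual balance:

```python
# Toy model of the stat system: actions nudge stats, nothing comes for free.
# Names and deltas are made up for illustration.
stats = {"hunger": 50, "rest": 50, "safety": 50,
         "social": 50, "esteem": 50, "money": 20}

ACTIONS = {
    "work":      {"money": +15, "rest": -20, "esteem": -5},
    "eat":       {"hunger": +30, "money": -10},
    "socialise": {"social": +20, "rest": -10, "money": -5},
    "study":     {"esteem": +10, "money": -15, "rest": -10},
}

def apply(action):
    for stat, delta in ACTIONS[action].items():
        stats[stat] = max(0, min(100, stats[stat] + delta))
    if stats["rest"] == 0:
        raise SystemExit("Died of overwork.")

apply("work")
apply("eat")
print(stats)
```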

It’s equal parts social commentary and pixel art fever dream. We’ve covered Tamagotchi recreations before, but this one makes you the needy survivor. Want your own dystopia in 64×32? Head over to [Terence Grover]’s GitHub and fork the full open source code. We’ll be watching. The Tamagotchi certainly is.

Continue reading “Work, Eat, Sleep, Repeat: Become A Human Tamagotchi”

Kaleidoscopico Shows Off Pi Pico’s Capabilities

In the early days of computing, and well into the era when home computers were common but not particularly powerful, programming these machines was a delicate balance of managing hardware and getting the most out of the software. Memory had to be monitored closely, clock cycles taken into account, and even video output had to be careful not to overwhelm the processor. That can seem foreign in a modern world where double-digit gigabytes of memory are not only common but expected, yet if you want to hone your programming skills there’s no better way to do it than with the limitations imposed by something like a retro computer or a Raspberry Pi Pico.

This project, called Kaleidoscopico, was built by [Linus Åkesson], aka [lft], and goes deep into the hardware of the Pi Pico to squeeze as much as possible out of the small, inexpensive platform. The demo is written in 17,000 lines of assembly using the RISC-V instruction set. The microcontroller has two cores, with one acting as the computer’s chipset and the other as the CPU, rendering the effects. The platform has no dedicated audio or video hardware, so everything here is done in software, with the setup behaving much as a PC from the 80s might. In this case, [lft] took inspiration from the Amiga platform, his favorite of that era.
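The demo itself is hand-written assembly, of course, but if you just want a feel for the same one-core-per-job split on a Pico, a MicroPython sketch using the _thread module (whose thread runs on the second core) is enough to get started. This is only a loose illustration of the idea, not anything resembling the demo’s code:

```python
# MicroPython on a Pi Pico: one core ticks out "frames" like a chipset would,
# the other core does the per-frame work. Purely an illustrative sketch.
import _thread
import time

frame = 0  # shared frame counter, written only by the "chipset" core

def chipset_core():
    # Stand-in for the chipset: advance the frame counter ~60 times a second.
    global frame
    while True:
        frame += 1
        time.sleep_ms(16)

_thread.start_new_thread(chipset_core, ())  # runs on the Pico's second core

last = 0
while True:  # "CPU" core: do the effect rendering per frame here
    if frame != last:
        last = frame
        if last % 60 == 0:   # just log once a second in place of real rendering
            print("frame", last)
    time.sleep_ms(1)
```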

The only hardware involved in this project, apart from the Pi Pico itself, is a handful of resistors, an audio jack, and a VGA port, further demonstrating that the software is the workhorse in this build. It’s impressive not only for wringing as much as possible out of the platform, but for using the arguably weaker RISC-V cores instead of the ARM cores, as the Pi Pico includes both. [lft] goes into every detail on the project’s page as well, for those who are still captivated by the era of computer programming where every bit mattered. For more computing demos like this, take a look at this one which is based on [lft]’s retrocomputer of choice, the Amiga.

Continue reading “Kaleidoscopico Shows Off Pi Pico’s Capabilities”

A Pi-Based LiDAR Scanner

Although there are plenty of methods for effectively imaging a 3D space, LiDAR is widely regarded as one of the most effective. These systems use a rapid succession of laser pulses over a wide area to create an accurate 3D map. Early LiDAR systems were cumbersome and expensive, but as the march of time continues on, they have become much more accessible to the average person. So much so that you can quickly attach one to a Raspberry Pi and perform LiDAR imaging for a very reasonable cost.

This software suite is a custom serial driver and scanning system for the Raspberry Pi, designed to work with LDRobot LiDAR modules like the LD06, LD19, and STL27L. Although still in active development, it offers an impressive set of features: real-time 2D visualizations, vertex color extraction, generation of 360-degree panoramic maps using fisheye camera images, and export capabilities for integration with other tools. The hardware setup includes a stepper motor for quick full-area scanning, and power options that include either a USB battery bank or a pair of 18650 lithium cells—making the system portable and self-contained during scans.
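The geometry behind a setup like this is straightforward: each LiDAR reading is a distance at an angle within the scan plane, and the stepper adds a second rotation around the vertical axis. A rough Python sketch of that conversion (ours, not the project’s, with the angle conventions assumed) looks like:

```python
# Convert a 2D polar LiDAR reading plus the stepper's pan angle into a 3D point.
# Treats the LiDAR sweep as elevation and the stepper pan as azimuth.
import math

def to_xyz(distance_mm, lidar_angle_deg, pan_angle_deg):
    r = distance_mm / 1000.0                 # metres
    elev = math.radians(lidar_angle_deg)
    azim = math.radians(pan_angle_deg)
    # Point in the LiDAR's own vertical plane, then rotate about the z axis.
    x_plane = r * math.cos(elev)
    z = r * math.sin(elev)
    return (x_plane * math.cos(azim), x_plane * math.sin(azim), z)

print(to_xyz(1500, 30, 45))
```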

LiDAR systems are quickly becoming a dominant player for anything needing to map out or navigate a complex 3D space, from self-driving cars to small Arduino-powered robots. The capabilities a system like this brings are substantial for a reasonable cost, and we expect to see more LiDAR modules in other hardware as the technology matures further.

Thanks to [Dirk] for the tip!

GLaDOS Potato Assistant

This Potato Virtual Assistant Is Fully Baked

There are a number of reasons you might want to build your own smart speaker virtual assistant. Usually, getting your weather forecast from a snarky, malicious AI potato isn’t one of them, unless you’re a huge Portal fan like [Binh Pham].

[Binh Pham] built the potato incarnation of GLaDOS from the Portal 2 video game with the help of a ReSpeaker Light kit, an ESP32-based board designed for speech recognition and voice control, which acts as an interface to Home Assistant running on a Raspberry Pi.

He resisted the temptation to use a real potato as an enclosure and wisely opted instead to print one from a 3D model of the original GLaDOS potato he found on Thingiverse. Providing the assistant with the iconic synthetic voice of GLaDOS was a matter of repackaging an existing voice model for use with Home Assistant.

Of course, all of this attention to detail would be for naught if you had to refer to the assistant as “Google” or “Alexa” to get its attention. A bit of custom modelling and on-device wake word detection, and the cyborg tuber was ready to switch lights on and off with its signature sinister wit.

We’ve seen a number of projects that brought Portal objects to life for fans of the franchise to enjoy, even an assistant based on another incarnation of the GLaDOS character. This one adds a dimension of absurdity to the collection.

Continue reading “This Potato Virtual Assistant Is Fully Baked”