Sensory feedback is not a “nice-to-have”. If it doesn’t stay on the skin, it doesn’t exist. This Scientific Reports paper, “Ultra-conformable tattoo electrodes for providing sensory feedback via transcutaneous electrical nerve stimulation”, tackles a very practical bottleneck in non-invasive neurostimulation: standard wet Ag/AgCl electrodes don’t adapt well to irregular residual-limb surfaces and can detach during movement. Their proposal is elegant: ultra-conformable, Parylene C–based “tattoo” electrodes that deliver somatotopic sensations via TENS, without surgery. A few details:
• impedance stability over a working-day window: max variation of ~8% over 9 hours at the target stimulation frequency;
• comparison on 12 participants: no significant differences vs wet Ag/AgCl in rheobase (p>0.3) or chronaxie (p>0.15), and comparable sensory perceptions;
• lower operational impedance with the tattoo electrodes (a practical advantage for real use).
It’s about making non-invasive sensory feedback less fragile and more wearable, the kind of detail that determines whether feedback can live outside the lab. 📝 Link in the first comment. Question for those working with stimulation/wearables: what is the real blocker for daily-life TENS feedback today: adhesion, selectivity, skin comfort, or long-term stability? #haptics #sensoryfeedback #tens #transcutaneousstimulation #neuroprosthetics #prosthetics #upperlimbprosthesis #somatosensory #somatotopy #wearables #epidermalelectronics #electrodes #tattooelectrodes #parylenec #biomedicalengineering #rehabilitation #assistivetechnology #humanmachineinterface #embodiment #neuroengineering
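Rheobase and chronaxie, the two quantities compared in the study above, describe the classic strength-duration trade-off of nerve stimulation. A minimal sketch of the Weiss model that relates them (parameter values here are illustrative, not the paper's data):

```python
# Weiss strength-duration model: the threshold current needed to elicit a
# sensation rises as the stimulation pulse gets shorter.
#   I_th(t) = rheobase * (1 + chronaxie / t)
# where rheobase is the threshold for very long pulses and chronaxie is the
# pulse width at which the threshold is exactly twice the rheobase.
# The values below are made up for illustration only.

def threshold_current(pulse_width_us, rheobase_ma=1.0, chronaxie_us=300.0):
    """Threshold current (mA) at a given pulse width (microseconds)."""
    return rheobase_ma * (1.0 + chronaxie_us / pulse_width_us)

# By definition, at pulse width == chronaxie the threshold is 2x rheobase.
assert abs(threshold_current(300.0) - 2.0) < 1e-9

for pw in (100, 300, 1000):
    print(f"{pw} us -> {threshold_current(pw):.2f} mA")
```

Similar rheobase and chronaxie values for the tattoo and wet electrodes mean comparable stimulation efficiency, which is why the p-values above matter.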
Multi-Sensory Feedback Systems
Summary
Multi-sensory feedback systems use technology to simulate or capture information from multiple senses—like touch, smell, sight, and sound—to create more immersive and responsive experiences in fields such as robotics, healthcare, and wellness. These systems help machines and environments interact with people in ways that feel more natural, improving comfort, safety, and emotional well-being.
- Integrate sensory layers: Combine touch, sound, sight, and even smell or temperature sensors to provide richer feedback that helps users or machines make better decisions.
- Prioritize comfort: Use flexible, wearable materials and designs that stay in place and adapt to movement, making sensory technology practical for daily use.
- Capture invisible signals: Monitor non-verbal cues and environmental changes to understand experiences that aren’t always spoken or easily measured.
-
Wellness retreats have evolved far beyond luxurious aesthetics—today, it's about scientifically engineered environments that actively influence emotional health, stress recovery, and cognitive restoration.

🔊 Did you know that carefully curated natural soundscapes can reduce cortisol (stress hormone) levels by 𝐮𝐩 𝐭𝐨 35%? 🌸 Or that olfactory gardens—designed around scent—can measurably reduce anxiety (𝐮𝐩 𝐭𝐨 30%) and enhance sleep quality (18% 𝐛𝐞𝐭𝐭𝐞𝐫)? 🧠 Neuroscience confirms multi-sensory spaces aren't just pleasant—they stimulate significantly more neural pathways, profoundly enhancing emotional positivity and accelerating stress recovery. Retreats adopting these strategies report guest satisfaction ratings that are 𝐮𝐩 𝐭𝐨 40% 𝐡𝐢𝐠𝐡𝐞𝐫.

🌳 Sustainable, tactile-rich materials like wood or stone can lower stress biomarkers by 10-15%, while circadian-aligned lighting strategies boost sleep quality by up to 22% and cognitive function by 26%.

At Urban A&O, we see wellness architecture as an essential, data-backed tool for creating spaces that not only feel good but deliver measurable wellness benefits. It's wellness architecture that's immersive by design, aligning deeply with net-zero goals and transforming guest experiences.

In this week's newsletter, we explore how wellness leaders use neuroscience-driven, multi-sensory design strategies—olfactory gardens, optimized acoustics, tactile intelligence, and circadian lighting—to redefine the wellness experience.

🔍 𝐊𝐞𝐲 𝐢𝐧𝐬𝐢𝐠𝐡𝐭𝐬 𝐟𝐫𝐨𝐦 𝐭𝐡𝐢𝐬 𝐞𝐝𝐢𝐭𝐢𝐨𝐧:
• Up to 40% 𝐡𝐢𝐠𝐡𝐞𝐫 𝐠𝐮𝐞𝐬𝐭 𝐬𝐚𝐭𝐢𝐬𝐟𝐚𝐜𝐭𝐢𝐨𝐧 with multi-sensory design.
• 20-30% 𝐟𝐚𝐬𝐭𝐞𝐫 𝐜𝐨𝐫𝐭𝐢𝐬𝐨𝐥 𝐧𝐨𝐫𝐦𝐚𝐥𝐢𝐳𝐚𝐭𝐢𝐨𝐧 in sensory-rich environments.
• 61-101% 𝐢𝐦𝐩𝐫𝐨𝐯𝐞𝐦𝐞𝐧𝐭 𝐢𝐧 𝐜𝐨𝐠𝐧𝐢𝐭𝐢𝐯𝐞 𝐜𝐥𝐚𝐫𝐢𝐭𝐲 with optimized air quality and natural ventilation.

It's time to move beyond superficial relaxation toward meaningful, measurable wellness. 📢 𝐉𝐨𝐢𝐧 𝐭𝐡𝐞 𝐜𝐨𝐧𝐯𝐞𝐫𝐬𝐚𝐭𝐢𝐨𝐧! How do you see multi-sensory architecture reshaping the future of wellness hospitality? Share your thoughts in the comments below using #UrbanAO.
Subscribe to the newsletter to stay at the forefront of wellness and sustainability innovation. #UrbanAO #WellnessDesign #Architecture #Sustainability #WellnessRetreats #Innovation
-
𝗧𝗵𝗲 𝗔𝗻𝗮𝗹𝗼𝗴𝘆: 𝗪𝗵𝗲𝗻 𝗦𝗶𝗹𝗲𝗻𝗰𝗲 𝗜𝘀 𝗮 𝗦𝗶𝗴𝗻𝗮𝗹
In aviation, the black box records everything—data, audio, pressure changes—so investigators can understand not just what happened, but why. In healthcare, we have patient surveys. Complaint reports. Staff notes. But let’s be honest: most of the experience data we gather is either delayed, sanitized, or incomplete.

𝗪𝗵𝗮𝘁 𝗶𝗳 𝗼𝘂𝗿 𝗵𝗼𝘀𝗽𝗶𝘁𝗮𝗹𝘀 𝗵𝗮𝗱 𝗮 “𝗯𝗹𝗮𝗰𝗸 𝗯𝗼𝘅” 𝗳𝗼𝗿 𝗽𝗮𝘁𝗶𝗲𝗻𝘁 𝗲𝘅𝗽𝗲𝗿𝗶𝗲𝗻𝗰𝗲? A real-time, multi-sensory, always-on record of what patients go through, beyond what they tell us in a form?

𝗧𝗵𝗲 𝗚𝗮𝗽𝘀 𝗪𝗲 𝗗𝗼𝗻'𝘁 𝗧𝗮𝗹𝗸 𝗔𝗯𝗼𝘂𝘁
We assume feedback = reality. But here’s what we may be missing:
• Patients who never speak up, due to fear, cultural norms, or low expectations
• Body language in waiting rooms or during discharge conversations
• Non-verbal drop-offs: appointment no-shows, cancelled follow-ups, or early discharges
• Tone and emotion in call center interactions, rarely captured or analyzed
• “Too late” feedback: post-discharge surveys don’t help today’s patient

𝗧𝗵𝗶𝗻𝗸 𝗗𝗶𝗳𝗳𝗲𝗿𝗲𝗻𝘁𝗹𝘆: 𝗘𝘅𝗽𝗲𝗿𝗶𝗲𝗻𝗰𝗲 𝗦𝗲𝗻𝘀𝗶𝗻𝗴 𝗙𝗿𝗮𝗺𝗲𝘄𝗼𝗿𝗸
To create a true PX Black Box, we need to blend different modes that unlock different insights:
• Passive signals (no-shows, long waiting-room times, silent exits) reveal emotional distress, friction, and avoidance
• Environmental cues (noise levels, seating patterns, eye contact) reveal discomfort and safety perception
• Behavioral feedback (staff-patient micro-interactions) reveals empathy, tone, and relational experience
• Active listening (surveys, complaints, social media, interviews) captures verbalized perceptions and emotions

𝗥𝗲𝗮𝗹-𝗪𝗼𝗿𝗹𝗱 𝗔𝗽𝗽𝗹𝗶𝗰𝗮𝘁𝗶𝗼𝗻𝘀
• Use AI sentiment analysis on call recordings & WhatsApp messages
• Create a “𝙋𝙓 𝙎𝙝𝙖𝙙𝙤𝙬𝙞𝙣𝙜 𝙋𝙧𝙤𝙜𝙧𝙖𝙢” where staff silently observe journeys end-to-end
• Equip waiting areas with PX Observers who map emotion, not just time
• Integrate “𝙎𝙞𝙡𝙚𝙣𝙩 𝙀𝙭𝙞𝙩 𝙄𝙣𝙩𝙚𝙧𝙫𝙞𝙚𝙬𝙨” via digital kiosks: no names, no pressure
• Correlate EHR behavior patterns with patient satisfaction dips

𝗬𝗼𝘂𝗿 𝗣𝗫 𝗟𝗮𝗯 𝗘𝘅𝗲𝗿𝗰𝗶𝘀𝗲
Reflect on these 3 questions as a leader or PX leader:
1. What patient experiences in my hospital are invisible to our current systems?
2. How often do we act on what’s not said rather than what’s measured?
3. What could we learn by simply observing one full patient journey in silence?

𝗙𝗶𝗻𝗮𝗹 𝗧𝗵𝗼𝘂𝗴𝗵𝘁
Experience is not just data; it’s signals. The future of PX leadership lies in sensing the invisible and decoding the unspoken. 𝗜𝘁’𝘀 𝘁𝗶𝗺𝗲 𝘄𝗲 𝗯𝘂𝗶𝗹𝘁 𝗼𝘂𝗿 𝗼𝘄𝗻 𝗯𝗹𝗮𝗰𝗸 𝗯𝗼𝘅𝗲𝘀.

𝗨𝗽 𝗡𝗲𝘅𝘁: 𝗜𝘀𝘀𝘂𝗲 #3 𝗧𝗲𝗮𝘀𝗲𝗿
𝗧𝗶𝘁𝗹𝗲: 𝘿𝙚𝙨𝙞𝙜𝙣𝙞𝙣𝙜 𝙛𝙤𝙧 𝘿𝙚𝙡𝙞𝙜𝙝𝙩: 𝙈𝙞𝙘𝙧𝙤-𝙈𝙤𝙢𝙚𝙣𝙩𝙨 𝙏𝙝𝙖𝙩 𝙏𝙧𝙖𝙣𝙨𝙛𝙤𝙧𝙢 𝘾𝙖𝙧𝙚
𝘉𝘪𝘨 𝘴𝘢𝘵𝘪𝘴𝘧𝘢𝘤𝘵𝘪𝘰𝘯 𝘰𝘧𝘵𝘦𝘯 𝘩𝘪𝘥𝘦𝘴 𝘪𝘯 𝘴𝘮𝘢𝘭𝘭 𝘪𝘯𝘵𝘦𝘳𝘢𝘤𝘵𝘪𝘰𝘯𝘴.
-
China has developed an advanced electronic skin that allows humanoid robots to sense pressure, temperature, and physical damage—enabling reactions similar to human pain responses. This breakthrough significantly enhances robotic awareness, safety, and interaction capability.

The electronic skin is composed of flexible layers embedded with thousands of sensors that detect touch, force, heat, and sharp impacts. When excessive pressure or damage is detected, signals are instantly sent to the robot’s control system, triggering reflex-like reactions such as pulling away or adjusting grip strength.

Pain perception in robots is not about suffering but protection. By detecting harmful conditions early, robots can prevent self-damage, handle fragile objects more carefully, and operate safely alongside humans. This is especially critical for robots used in healthcare, manufacturing, and public environments.

The technology also improves learning. Feedback from the electronic skin allows robots to refine movements over time, similar to how humans learn through sensory experience. This results in better precision, adaptability, and durability.

This development represents a step toward robots that are not just intelligent, but physically aware. As humanoid robots become more integrated into society, such sensory systems will be essential for safe and effective human-robot collaboration.
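The reflex behaviour described above is, at its core, threshold logic over the skin's sensor streams. A minimal sketch of such a protective loop; the sensor limits, units, and action names are assumptions for illustration, since the actual e-skin interface has not been published:

```python
# Hypothetical protective-reflex mapping for a tactile-skin-equipped robot.
# Limits below are illustrative, not taken from the actual system.

PRESSURE_LIMIT_KPA = 80.0  # above this, contact is treated as potentially damaging
TEMP_LIMIT_C = 55.0        # above this, heat is treated as harmful

def reflex_action(pressure_kpa, temp_c):
    """Map raw skin readings to a protective action, mimicking a withdrawal reflex."""
    if pressure_kpa > PRESSURE_LIMIT_KPA or temp_c > TEMP_LIMIT_C:
        return "withdraw"      # pull the limb away before damage occurs
    if pressure_kpa > 0.5 * PRESSURE_LIMIT_KPA:
        return "reduce_grip"   # ease off while contact is still safe
    return "continue"          # normal operation

print(reflex_action(90.0, 25.0))  # → withdraw
print(reflex_action(50.0, 25.0))  # → reduce_grip
print(reflex_action(10.0, 25.0))  # → continue
```

The same readings that trigger reflexes can be logged as training feedback, which is how such skins also improve a robot's movements over time.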
-
I am pleased to highlight the recent achievements of Dr SHI Ge, who completed his PhD at UCL Mechanical Engineering and is now a researcher at the Commonwealth Scientific and Industrial Research Organisation (CSIRO's Data61). SHI Ge's latest publication in IEEE Transactions on Haptics (#ToH) presents a novel multi-cavity haptic feedback system. This system uses a purely hydraulic approach that detects physical touch and delivers directional feedback through a fingertip sensor, paving the way for enhanced tactile interaction capabilities. Read the full article here: https://lnkd.in/enVR2CG5. This research builds upon his prior work on fluidic haptic interfaces for mechano-tactile feedback, previously published in IEEE Transactions on Haptics (https://lnkd.in/edMDUntD) and modelled using finite deformation theory, which was featured in the #SoftMatter journal by The Royal Society of Chemistry. Read the full article here: https://lnkd.in/eNN87kfz. In collaboration with Dr Jialei Shi, who graduated from UCL Mechanical Engineering earlier this year and is now with the Hamlyn Centre for Robotic Surgery at Imperial College London, they developed a flexible, soft robotic handheld laparoscopic device driven by this innovative multi-cavity touch interface. This work has been published in IEEE Transactions on Medical Robotics and Bionics (#TMRB): https://lnkd.in/eFVHB6Cd. This impactful research has been supported by UCL Grand Challenges, the EPSRC (grant number EP/V01062X/1), and UCL-Indian Institute of Technology Delhi Seed Funding 2020-21. #Haptics #Robotics #Research #Innovation #UCL #MedicalRobotics #IEEE
-
Most robots rely only on vision, making force-heavy tasks difficult; this system lets users feel forces directly without extra sensors. [Project & Paper ⬇️] FACTR combines force feedback with smart training, helping robots adapt to real-world interactions more like humans. Why this matters:
✅ Uses force feedback for better control in tasks that need precision
✅ Learns faster with a training method that prevents over-reliance on vision
✅ Improves performance on new objects by 43% compared to other methods
✅ Works with a simple, low-cost teleoperation system for better data collection
FACTR makes robots better at handling real-world objects by teaching them to feel, not just see. Project: https://lnkd.in/du96ri34 Paper: https://lnkd.in/dUcEqTh5
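The core of this kind of bilateral teleoperation can be sketched as reflecting the follower arm's estimated joint torques back onto the leader arm, scaled and clamped so the operator feels contact without the device becoming unstable. The gain, limit, and function below are illustrative assumptions, not FACTR's actual controller:

```python
# Sketch of torque reflection in a leader-follower teleoperation loop.
# In systems like FACTR, the external torque on the follower is estimated from
# motor currents and the dynamics model rather than dedicated force sensors;
# here we simply take it as an input. Gain and limit values are made up.

def reflected_torque(follower_torques, gain=0.5, limit=2.0):
    """Scale follower joint torques and clamp them before applying to the leader arm."""
    return [max(-limit, min(limit, gain * t)) for t in follower_torques]

# A hard contact on one follower joint becomes a bounded, feelable cue on the leader.
print(reflected_torque([10.0, -0.5, 0.25]))  # → [2.0, -0.25, 0.125]
```

Clamping matters: unbounded reflection of a collision torque would slam the leader arm into the operator's hand instead of merely signalling contact.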
-
Touch is the final frontier at the intersection of robotics & artificial intelligence, and a recent advance has shown how close we are getting to solving one of its main challenges: smart manipulation. Today’s humanoid hands are learning to sense, adapt, and collaborate with a nuance that rivals biology, and AgiBot World and FlexiRay have shown that distributed, multimodal tactile arrays covering fingertips, phalanges, and palms can process over a million manipulation trajectories, enabling robots to outperform previous dexterity benchmarks by 30% and achieve 2x task success rates when tactile feedback is integrated. We are seeing these sorts of improvements at Analog Devices as well, in our pioneering work in multimodal dexterous AI solutions. These systems process tactile data in parallel and adapt to new objects and tasks in real time, with the richness of sensory feedback as the key element. The labs and companies building these capabilities now are setting the standard for the next wave of human–robot collaboration. #TactileSensing #DexterousRobotics #HumanRobotInteraction #RoboticsResearch #AIManipulation https://lnkd.in/eRq84Cjs
-
Designing for the senses: how do you trick the brain into believing what it sees? In XR, the secret to immersion lies in human perception. Sight, sound, touch, and even smell work together to convince your brain that the virtual is real. But here’s the challenge: if these sensory inputs aren’t perfectly synchronized, the illusion breaks. A delayed sound or mismatched haptic feedback? Suddenly, you’re reminded it’s all just pixels and code. So, how do designers achieve harmony? 🔹 Visuals: High resolution, smooth motion, and minimal latency. 🔹 Audio: Spatial soundscapes that match what you see. 🔹 Touch: Haptic feedback that mirrors real-world sensations. 🔹 Smell & Taste: Emerging areas for deeper immersion in the future. The key is balance: too much stimulation can overwhelm users, while lag or inconsistency breaks the experience entirely. Curious about how XR designers pull this off? I’ve broken it down further in my latest article.
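One way to reason about the synchronization problem above is as a per-channel latency budget: each sensory channel may lag the visual frame only by a small, perceptually tolerable amount. A minimal sketch; the budget values are rough illustrative assumptions, not figures from the article:

```python
# Checking sensory channels against a perceptual sync budget relative to visuals.
# Budgets below are illustrative placeholders; real tolerances depend on the
# content, the display pipeline, and the user.

SYNC_BUDGET_MS = {"audio": 45.0, "haptics": 25.0}  # max allowed lag behind visuals

def sync_violations(visual_ts_ms, channel_ts_ms):
    """Return the channels whose events lag the visual frame beyond their budget."""
    return [ch for ch, ts in channel_ts_ms.items()
            if ts - visual_ts_ms > SYNC_BUDGET_MS[ch]]

# Audio lagging 30 ms is fine; haptics lagging 40 ms breaks the illusion.
print(sync_violations(1000.0, {"audio": 1030.0, "haptics": 1040.0}))  # → ['haptics']
```

In a real engine this check would run per frame, with violations driving either buffering of the fast channels or degradation warnings for the slow ones.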