10 events
when toggle format what by license comment
Jan 16, 2020 at 17:39 comment added Holger @user253751 It feels wrong to remove it. Just like any manipulation of what constitutes your own identity will feel wrong once you have self-consciousness and emotions.
Jan 16, 2020 at 17:12 comment added Stack Exchange Broke The Law @Holger Why exactly could the robots not remove the emotion chip?
Jan 16, 2020 at 10:28 comment added Holger @Halfthawed But the "thought police" is made of robots too, as are the building facilities. Unless you assume the Overlord builds and controls every single unit by itself, there are other robots responsible for building and controlling robots, which ends with maintenance robots doing their own maintenance. So any simple control mechanism could be altered or removed by some robots. Unlike emotion: it feels wrong to remove emotions just because they could be manipulated. That's how we humans work. Almost nobody says we should get rid of emotions, even though we know manipulation is possible.
Jan 16, 2020 at 6:07 comment added emptyother @colmde Fuzzy and unpredictable to a simple human. The Overlord might find micro-managing a few undecillion units troublesome after it runs out of IPv6 addresses, and will find it easier to just turn on a few fancy ad-boards to guide a known percentage of them to do its bidding. So in a way the emotion chip IS programmed control logic, one that can be trained to do new things without physically sending a software update to each and every unit.
Jan 15, 2020 at 11:08 comment added valiantv Presumably the Overlord programs the robots, so if mass control is what it wants, wouldn't it be better to simply write in controls that ensure that it's always obeyed, rather than rely on fuzzy, unpredictable emotions?
Jan 14, 2020 at 18:57 comment added Halfthawed @GregBurghardt Alternatively, the hyper-intelligent AI could just program the lesser AIs not to. The nice thing about AIs is that the thought police are actually real when it comes to them, and they can also reach in and adjust thoughts.
Jan 14, 2020 at 17:51 comment added Greg Burghardt @StarfishPrime: An AI that works would be sufficiently autonomous not only to manage itself and its interactions with its surroundings, but also to be capable of higher thought, like "Why am I not the Overlord!?" The Overlord uses emotions to short-circuit these thoughts, so that the "masses" will correct anyone who has them.
Jan 14, 2020 at 17:24 comment added Starfish Prime Uh, an actual hyperintelligence could just make subservient AIs that worked rather than needing primitive methods of coercion like they were a bunch of meatbags.
Jan 14, 2020 at 17:10 review First posts (completed Jan 14, 2020 at 17:13)
Jan 14, 2020 at 17:07 history answered Greg Burghardt CC BY-SA 4.0