
"Three Laws"-Compliant: Difference between revisions

(replaced redirect to trope with direct link)
Line 155:
** Of course, all this works properly [http://freefall.purrsia.com/ff2200/fc02200.htm only as long as other security measures prevent tampering with the software], and physical access to the hardware is the point where security measures traditionally split into the "minor delay" and "[http://freefall.purrsia.com/ff2300/fc02235.htm fool's errand]" categories.
** The determination of "human" [http://freefall.purrsia.com/ff2600/fc02547.htm had to] err on the safe side, and combined with learning AIs (let alone ones based on organic brains) this leads to the term getting stretched quite a bit, especially since the developer encouraged this outcome.
** The First Law being a free-will override, robots are not inclined to stop and think about what they are doing, which is why Florence learned to avoid anything that might trip the "hurr, {{smallcaps|humans in danger}}" reaction altogether. This is [http://freefall.purrsia.com/ff3100/fc03057.htm troublesome] even when they can help, since compulsion doesn't magically solve basic coordination problems. As she [http://freefall.purrsia.com/ff3100/fc03057.htm points out], robots not trained or programmed for an adequate response usually just go full "[[Leeroy Jenkins]]!" all at once, so in an actual emergency they could make things worse, for example [http://freefall.purrsia.com/ff3100/fc03053.htm by clogging the exits].
*** Also, if [[Morton's Fork|either choice]] may harm humans, robots are going to throw themselves one way or the other, [http://freefall.purrsia.com/ff2300/fc02222.htm then other robots must try to stop them] from taking dangerous actions… and so on, [[Dwarf Fortress|"loyalty cascade"]] style. And then confused humans are likely to try to solve the immediate problem by giving uncoordinated orders, which would only multiply the chaos.
* ''[[21st Century Fox (webcomic)|21st Century Fox]]'' has all robots obeying the Three Laws (though since [[Funny Animal|no one's human]], I expect the First Law is slightly different). Unfortunately, saying the phrase "[[Bill Clinton|define the word 'is']]" or "[[Richard Nixon|I am not a crook]]" locks an AI out of its own systems and allows anyone, from a teenager looking at "nature documentaries" to a suicide bomber, to do whatever they want with it.