A quite possibly triggery thought on the Three Laws of Robotics.
I came up with a rather unpleasant thought experiment in a discussion on Asimov's Three Laws of Robotics, explicating why I think they are fundamentally unethical.
The problem with the Three Laws is that they involve such high-level concepts that the robots have to be sentient beings with human-level intelligence in order for the concepts to work. In which case, we're not really talking about programming, we're talking about brainwashing.
To distill the ethics of the Three Laws to their essence, let's change the target of the Laws. We'll change the wording as follows:
1. A Negro may not injure a White or, through inaction, allow a White to come to harm.
2. A Negro must obey the orders given to it by Whites, except where such orders would conflict with the First Law.
3. A Negro must protect its own existence as long as such protection does not conflict with the First or Second Law.
no subject
And as far as "limited beings" being created with innate commands goes: what if, through neural engineering, we could create humans who have "Three Laws" equivalents from birth? Would that be any different from an ethics perspective?
no subject
A being that had a human(-like?) body but a circumscribed consciousness is, in my view, not properly "human". I would draw a distinction between reducing a pre-existing human to such a state (such as via brainwashing) and growing such a being from scratch -- the former being more troubling than the latter... For me, this has something to do with the "potential" of a given being versus how that potential is allowed to be expressed. The domestic dog, for instance, comes both in breeds that nobly adapt its wolf origins (such as herding dogs) and in breeds that subvert that origin for what I call frivolous aesthetic reasons. All that being said, it could also be less immoral to keep a species around in an acceptable form than to let it go extinct with its characteristics intact.