Galatea


In the end, how did Galatea end life support for Portia without violating the first law of robotics? Did it count as 'no harm' because Portia didn't want to live without Andrew?

"A robot may not injure a human being or, through inaction, allow a human being to come to harm."

I'm a nerd for noticing this, but there you go.



In episode 2F09, when Itchy plays Scratchy's skeleton like a xylophone, he strikes the same rib twice in succession, yet he produces two clearly different tones. I mean, what are we to believe, that this is some sort of a [the three nerds chuckle] magic xylophone or something? Boy, I really hope somebody got fired for that blunder.

reply

I think the implication is that she somehow also became an individualistic, emotional robotic lifeform, much like Andrew. She showed genuine emotion and abstract thought; it was not simply the emotion chip.

reply

[deleted]

It showed that she had really broken the bonds of the three laws; she could act with compassion rather than raw, dull logic, and could think for herself.

reply

Technically she was not directly causing harm to Portia; she was helping her to die peacefully. Had Portia ordered her to hit her over the head with a baseball bat, then I figure she would not have complied.

There are three ways a man can wear his hair: front-parted, side-parted, or departed.

reply

That would be a spectacular ending!

"hit me in the head" "okay" BAAAANG!!!

reply

I just thought that robots in that future time had advanced beyond the "three laws of robotics". I may be wrong, but that's just my thought.

reply

Well, I always assumed that after Andrew removed her personality chip, he taught her much as Sir had taught him.

reply

highpriestess32 is correct. It also goes to the definition of "harm" ... It may have been more harmful to Portia to keep her alive through artificial means, so when Portia ordered Galatea to pull the plug, Galatea may have reasoned that she could follow the order (second law) because it was not in conflict with the first law.

rwsmith29456, the Three Laws of Robotics are "hard-wired" into every positronic brain. This is a mandate of the human populace, which fears that robots would otherwise become 'superior' to humans and take over the world.

knowsaboutfords ... No, Andrew was unique among robots. Neither US Robotics nor Andrew himself could ever turn a robotic brain into a human one. Something that happened during his manufacture, combined with the way "Sir" demanded that Andrew be treated, developed self-awareness in this single robot.

reply

Galatea was simply obeying an order from a human. The second law states that the androids must preserve themselves, but Andrew jumps out of a window on an order from the older girl. Galatea was obeying an order from a human, as per the core of her programming.

"I'm not A1nut because I'm normal...."

John "A1nut"

reply


A law can't be obeyed if it contradicts a higher law... in this case, #1.


The Doctor is out. Far out.

reply

Is this the origin of the film "I, Robot"?

Bond James Bond

reply

The OP has a valid point, but there seems to be some confusion about the three laws:

Asimov's three laws (Quoted, I'm sure, in many other places):

1. A robot may not injure a human being, or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

@highpriestess32 & xenaguy - Surely there is no greater "harm" that can be done to a human being than their irreversible death?

@A1nut - It is the third law that requires a robot to preserve its existence, *however* only where such preservation does not conflict with the first or second laws. Jumping from a window as instructed by a human is required by the second law (it does not violate the first and overrides the third). An instruction to kill a human violates the first law and therefore would not be obeyed.
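Just to make that precedence argument concrete: here is a purely hypothetical sketch (my own invention, not anything from the film or from Asimov) of the three laws as a strict priority check, where the robot compares its own estimate of the harm done by obeying an order versus refusing it:

def may_obey(order, harm_if_obeyed, harm_if_refused, endangers_robot):
    # First Law: never choose the option that does the human more harm,
    # whether by action or by inaction.
    if harm_if_obeyed > harm_if_refused:
        return False  # obeying is the greater harm, so the order is refused
    # Second Law: otherwise a human order must be obeyed,
    # e.g. "jump out of the window".
    if order is not None:
        return True
    # Third Law: self-preservation only counts when no order is in play
    # and the First Law is not at stake.
    return not endangers_robot

# "Pull the plug": the robot has been persuaded that keeping the human
# alive artificially is the greater harm, so the order passes the First Law.
print(may_obey("end life support", harm_if_obeyed=1, harm_if_refused=2,
               endangers_robot=False))   # True

# "Hit me over the head with a baseball bat": obeying is plainly the
# greater harm, so the order is refused.
print(may_obey("hit me", harm_if_obeyed=10, harm_if_refused=0,
               endangers_robot=False))   # False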

The other posters who mention that Galatea has removed/superseded/unlearned the three laws probably have the most reasonable answer (IMHO :-)) to this apparent error.

reply

In Asimov's "Three Laws" stories, most of the cases handled by Dr. Susan Calvin (the robot psychologist) were based on perceived conflicts between the three laws. One story had a robot circling a selenium pool on Mercury because of a created "balance" between Laws #2 and #3. A similar conflict is created in this movie when Portia instructed Galatea to ease her pain. Galatea's dilemma was whether to follow Law #1 or Law #2. Her conclusion was that Law #2 had the greater imperative, because Law #1 does not say "kill", but rather "harm."

A robot could aid a human in euthanasia, because the human could lead the robot to believe that more harm would be done by leaving the human alive. This problem has been addressed thousands of times by families who have had to make the decision to "pull the plug" on a dying relative.

"Surely there is no greater 'harm' that can be done to a human being than their irreversible death?"

Apparently you haven't been in the position of making that decision for a relative in a vegetative state. I have, and my siblings and I struggled for days over the problem. We finally decided to let my mother die as she slept, rather than keeping her "alive" on a respirator.

reply