
March 25, 2009

Comments on The 600 Series Had Rubber Skin:
#1 ::: David Dvorkin ::: (view all by) ::: March 25, 2009, 07:42 PM:

Oh, you're such a Nervous Nellie. Nothing can go wrong. Nothing can go wrong. Nothing can go wrong. Noth

#2 ::: Zak ::: (view all by) ::: March 25, 2009, 07:51 PM:

It's only one or two rules away from being a vampire.

The only things missing seem to be:

"The Ethical Governor must always stop to pick up grains of rice spilled in conflict."

"If a high priority target fails to invite the Ethical Governor, abiding by the laws of Vampirism, the Governor must not enter the target's domicile. Thankfully, international regulations concerning the behavior of vampires have thus far been mute on whether ordnance can be considered a part of the vampire, and thus the governor may explode uninvited domiciles."

#3 ::: David Harmon ::: (view all by) ::: March 25, 2009, 07:56 PM:

Are you talking about the "Governor", or the narrator?

#4 ::: Zak ::: (view all by) ::: March 25, 2009, 08:05 PM:

The Governor wasn't the narrator!?

#5 ::: vian ::: (view all by) ::: March 25, 2009, 08:14 PM:

So, "strategic defense" is how the army says "blowing things to small furless bits"?


#6 ::: Ken Brown ::: (view all by) ::: March 25, 2009, 08:18 PM:

Fear not! We are perfectly safe! Allied Mastercomputer will protect us!

#7 ::: Erik Nelson ::: (view all by) ::: March 25, 2009, 08:24 PM:

Talking about a robot governor made me imagine it looking like Arnold Schwarzenegger.

#8 ::: Daniel Klein ::: (view all by) ::: March 25, 2009, 08:25 PM:

I like how they've carefully chosen a narrator who with his very tone of voice lulls us into a peaceful sleep in which we dream of utopian future developments: the ethical medical advisor who decides who gets euthanasia and who doesn't and the automated disaster area evacuation bot, which, in the event of limited evacuation capacity, decides according to an infallible moral engine who is rescued and who is left behind.

What could possibly go wrong?

#9 ::: Lee ::: (view all by) ::: March 25, 2009, 08:34 PM:

Human: "Multivac, is there a God?"

Multivac: "THERE IS NOW."

#11 ::: Erik Nelson ::: (view all by) ::: March 25, 2009, 08:40 PM:

How does this compare to Asimov's Three Laws of Robotics?

1. Not injure a human.
Well, the "ethical governor" might want to limit civilian casualties, but they're just a thing to minimize if you can help it.

2. Obey orders from humans.
Not exactly. It obeys one source of orders, and in some builds of the system those orders may not come from humans. Perhaps the orders are even considered to be just a set of guidelines.

3. Protect its own existence.
Not necessarily. It's a weapon that is where it is partly because it's more dispensable than humans, and though expensive it may be an affordable casualty. But it is answerable to a bureaucracy, which is a thing whose purpose is to protect its own existence as a whole.

#12 ::: Kip W ::: (view all by) ::: March 25, 2009, 08:49 PM:

I seem to recall that a governor is a doohickey on a motor that keeps it from going too fast. This gives me some hope of outrunning the robots, as long as they're outfitted with them.

I don't believe that something slowly shambling will catch up with me running. That's just movie stuff.

#13 ::: Linkmeister ::: (view all by) ::: March 25, 2009, 08:54 PM:

Kip W @ #12,



"How long can you tread water?"

#14 ::: Spiny Norman ::: (view all by) ::: March 25, 2009, 09:04 PM:

It's soooooooo cute! I think that I will name it "ED-209."

#15 ::: WereBear ::: (view all by) ::: March 25, 2009, 09:16 PM:

As long as you don't name it HAL.

#16 ::: Sam Kabo Ashwell ::: (view all by) ::: March 25, 2009, 09:55 PM:

Ah, double effect. Or: how to fire into a crowd without meaning to kill anyone.

#17 ::: Madeline Ashby ::: (view all by) ::: March 25, 2009, 11:57 PM:

Essentially, it seems like he's trying to say: "Don't worry! We can program context sensitivity! Even in a device without adaptive intelligence!"

Also, I'd like to see these "laws of war." Which laws? From whose war? How many sets of rules can the Ethical Governor handle? How does each set get prioritized? We have a hard enough time "programming" these things into organic soldiers, so why should it be easier to do so with synthetic ones?

I'm actually doing a panel on robots this weekend, so thanks for linking this. I'll be sure to point everybody your way just before getting my rant on.

#18 ::: TexAnne ::: (view all by) ::: March 26, 2009, 12:06 AM:

Trust the Computer. The Computer is your friend.

#19 ::: Evan Goer ::: (view all by) ::: March 26, 2009, 01:02 AM:

What could possibly go wrong?

Granted, there's the tiniest chance that something could go wrong. But nothing that can't be solved by engineering larger, more powerful robots.

#20 ::: Spiny Norman ::: (view all by) ::: March 26, 2009, 01:44 AM:

I wonder if the Ethical Governor is hooked up to a camera with a spectral detector that can measure melanin? It *is* a project from Georgia Tech, after all...

#21 ::: Earl Cooley III ::: (view all by) ::: March 26, 2009, 02:37 AM:

Teach it phenomenology.

#22 ::: Tom ::: (view all by) ::: March 26, 2009, 02:46 AM:


Nah . . . but phrenology, now . . .

#23 ::: Luthe ::: (view all by) ::: March 26, 2009, 03:35 AM:

I kept expecting Arnold to show up at any minute. The lack of Austrian accent and robot body parts was disappointing.

#24 ::: Peter Erwin ::: (view all by) ::: March 26, 2009, 04:41 AM:

Spiny Norman @ 20:
Looking at the staff page for the Mobile Robotics Lab, I'd say that's perhaps a teensy bit uncalled for...

#25 ::: Dave Bell ::: (view all by) ::: March 26, 2009, 04:46 AM:

It could be worse.

At least the better human won the election.

#26 ::: Sam Kelly ::: (view all by) ::: March 26, 2009, 05:06 AM:

Luthe@24: That one's waiting for the Second Variety, I suspect.

#27 ::: John L ::: (view all by) ::: March 26, 2009, 07:17 AM:

Kip W,

While you may be able to outrun something that only moves at a fast walk, if it can maintain that speed continuously it WILL catch you unless you're a marathon runner.

There was a sci-fi short story written about such a thing, in fact. An alien collector robot lands on Earth, programmed to stun and capture animals of a certain size. It could only move at the speed of a fast walk, but it could do so over any terrain and didn't need to rest. A human hunter stumbled upon it and spent a frenzied 24 hours trying to stay ahead of it. He failed, but by the time it caught him the human had lost so much weight that he was now below the weight limit, so he was not collected.

#28 ::: Jamie ::: (view all by) ::: March 26, 2009, 08:19 AM:

Trust the Computer. The Computer is your friend.

You do trust the Computer, don't you, citizen?

#29 ::: Serge ::: (view all by) ::: March 26, 2009, 08:56 AM:

Exterminate! Ex-ter-mi-nate!

#30 ::: Steve C. ::: (view all by) ::: March 26, 2009, 09:06 AM:

We're going to see a lot more of this kind of thing. A rational person might think that the horrors of war would be enough incentive to avoid it whenever possible, but the substitution of robotic, semi-autonomous weaponry for flesh & blood soldiers just makes military action that much more likely to occur.

#31 ::: ajay ::: (view all by) ::: March 26, 2009, 09:16 AM:

27: "The Ruum".

I don't believe that something slowly shambling will catch up with me running.

There is also the possibility, neglected by Kip, that the pursuing Terminator might simply hail a cab. ("FOLLOW DAT HUMAN." "Okay, buddy." "HOW ABOUT DEM RAIDERS, HUH?")

Actually, thinking about it, deciding that Terminators can't walk faster than five miles an hour has rich comic potential, with Sarah Connor being chased all over California by a Terminator in a taxi, on a mountain bike, on roller-blades, on skis, on a bus, by FedEx (in a suitably-sized crate or padded envelope) etc.

#32 ::: Eric K ::: (view all by) ::: March 26, 2009, 09:20 AM:

Steve C. @ 30: Right now, our wars are fought by members of the US military, who put their own lives on the line.

<sarcasm>It's going to be so much better when our wars are fought by the kind of people who love the glory of war, but who would never put themselves in harm's way.</sarcasm> Just imagine a 2004-era warblogger with an "ethical" UAV and a video game controller.

#33 ::: Steve C. ::: (view all by) ::: March 26, 2009, 09:21 AM:

I may have mentioned it before, but I still love the scene in the original Terminator movie that had a POV shot from Arnold's cybernetic eye displaying computer code. One of the shots was the beginning of a COBOL program. In addition to being an efficient killing machine, the Terminator also processed your payroll.

#34 ::: ajay ::: (view all by) ::: March 26, 2009, 10:00 AM:

33: Obviously, the original Terminators were designed for human resources departments, in order to handle the tricky task of telling employees that they had been fired (hence the name). That's the only possible reason why they were built to look human. It's a stupid shape for an infiltrator robot.
Think about it: which is easier, building a machine that can be mistaken for a human, or building a machine that can be mistaken for a toaster?

#35 ::: Steve C. ::: (view all by) ::: March 26, 2009, 10:03 AM:

Ajay @ 34 -

Good point. (Eyeing my iPod suspiciously)

#36 ::: heresiarch ::: (view all by) ::: March 26, 2009, 10:11 AM:

You know, it's not like humans are so gosh-darned good at following the rules of engagement that I can't see the value of having some intelligence on the battlefield that's capable of following them without hearing things like "They just killed my buddy!" or "Hospital schmospital! Enemy soldiers, the lot of them!" echoing through their heads.

It's not actually true that those who don't learn from science fiction movies are doomed to live through them.

#37 ::: Serge ::: (view all by) ::: March 26, 2009, 10:32 AM:

I much prefer Summer Glau as a Terminator.

#38 ::: KeithS ::: (view all by) ::: March 26, 2009, 10:45 AM:

Steve C. @ 35:

Your iPod looks like a toaster? Is it one of the first generation models?

Now, if you'll excuse me, I have this broken and seemingly indestructible portable television to try to fix.

#39 ::: NelC ::: (view all by) ::: March 26, 2009, 10:51 AM:

Hmm, impressive video. They just demonstrated that the rules of war are — or maybe that should be "can be" — so simple that a machine can follow them, provided someone else has done the hard work of identifying and classifying targets.

While badly trained or led soldiers who don't care about the rules of war are a problem, the majority of blue-on-blue or blue-on-$civiliancolour incidents are due to soldiers who are unable to identify and classify targets in the heat of battle.

#40 ::: Serge ::: (view all by) ::: March 26, 2009, 11:03 AM:

KeithS @ 38... Your iPod looks like a toaster? Is it one of the first generation models?

Hopefully Steve C's iPod isn't this model.

#41 ::: Madeline Ashby ::: (view all by) ::: March 26, 2009, 11:11 AM:

You might be able to fool the satellites by shuffling children and ambulances to false locations all over town, so that when the UAV downloads the latest feed it'll have to make "ethical" decisions based on false data that makes it look as though there are schools and hospitals everywhere. In a weird way, developments like this could utterly change the infrastructure and organization of communities under attack.

#42 ::: KeithS ::: (view all by) ::: March 26, 2009, 11:28 AM:

Madeline Ashby @ 41:

The system shown in the video doesn't appear to be all that intelligent, no matter what the speaker says. All the landmarks, such as hospitals, apartment buildings and so on, are pre-defined by the human operator. From there it's a simple calculation of the biggest attack it can get away with.
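If KeithS is right, the governor's core logic reduces to a collateral-damage budget check over a human-tagged map. A minimal sketch of that idea (every landmark, cost weight, and munition figure here is invented for illustration; nothing comes from the actual Georgia Tech system):

```python
# Toy collateral-damage gate: given operator-tagged landmarks and a menu of
# munitions, choose the largest strike whose predicted collateral damage
# stays under a fixed budget. Entirely hypothetical numbers throughout.

from math import hypot

# Operator-defined map: (x, y) coordinate -> landmark type (tagged by a human)
LANDMARKS = {
    (0, 0): "target",
    (3, 4): "hospital",
    (10, 0): "apartment",
}

# Collateral "cost" assigned to each landmark type caught in a blast
COST = {"hospital": 100, "apartment": 50, "target": 0}

# Candidate munitions: name -> blast radius
MUNITIONS = {"small": 2.0, "medium": 6.0, "large": 12.0}

def collateral(aim, radius):
    """Sum the cost of every non-target landmark inside the blast radius."""
    return sum(COST[kind]
               for (x, y), kind in LANDMARKS.items()
               if kind != "target" and hypot(x - aim[0], y - aim[1]) <= radius)

def biggest_permitted(aim, budget):
    """Largest munition whose predicted collateral fits under the budget."""
    allowed = [(radius, name) for name, radius in MUNITIONS.items()
               if collateral(aim, radius) <= budget]
    return max(allowed)[1] if allowed else None
```

Which is the point: the "ethics" live entirely in who tagged the map and who set the budget, not in the maximization step.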

#43 ::: Bruce Cohen (SpeakerToManagers) ::: (view all by) ::: March 26, 2009, 11:41 AM:

KeithS: That's only the first generation. Next up: the drone uses shape recognition, movement analysis and motion prediction to determine who's an enemy, who's a friendly, and who's a non-combatant. Of course it would have to be able to modify its identifications as necessary: look, that non-combatant is throwing something into the crowd, must be a bomber! Oops, sorry, we zapped the bride as she threw the bouquet. Guess none of them are getting married now.

There's another thing that bothers me. The governor has to be field-programmable, so that software changes and specific local rules can be added, right? So what's to prevent one of the techs, or the unit commander, from changing the rules so there aren't any non-combatants? The "Kill them all, let God sort them out" rule.

#44 ::: Duncan J Macdonald ::: (view all by) ::: March 26, 2009, 11:42 AM:

TexAnne @ 18:

Stay Alert! Trust No One! Keep Your Laser Handy!

#45 ::: KeithS ::: (view all by) ::: March 26, 2009, 11:51 AM:

Bruce Cohen @ 43:

Nothing, of course. It's still a human endeavor, with a human requirement to make decisions, ethical and otherwise. One advantage of this system is that you can make decisions ahead of time instead of making a bad decision in the heat of the moment. A large disadvantage is that it doesn't adapt to changing conditions in the field the same way that a human can.

#46 ::: Madeline Ashby ::: (view all by) ::: March 26, 2009, 12:14 PM:

Bruce@43, you read my mind. Thanks.

This is also one of those instances where locative tagging might be extremely useful. If field programmers at either end of the UAV's area are constantly cloud communicating about the same map, they can just tag coordinates with things like "school for the disabled" (or "porn shop" or "best price on sweetened condensed milk"). That way the UAV can just watch for flagged terms when it evaluates the locative data.
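Madeline's tag-watching scheme could be sketched roughly like this (the coordinates, labels, and flagged-term list are all made up; a real system would need geo-indexing and some way of authenticating who is allowed to write tags):

```python
# Sketch of crowd-tagged locative data: field reporters attach free-text
# labels to coordinates on a shared map, and the drone side simply checks
# each label against a list of flagged terms. All names hypothetical.

FLAGGED_TERMS = {"school", "hospital", "clinic", "shelter"}

# Shared locative map: coordinate -> set of free-text labels
shared_map = {
    (31.5, 34.4): {"school for the disabled"},
    (31.6, 34.5): {"porn shop"},
    (31.7, 34.6): {"best price on sweetened condensed milk", "clinic"},
}

def is_protected(coord):
    """True if any label at this coordinate contains a flagged term."""
    return any(term in label
               for label in shared_map.get(coord, ())
               for term in FLAGGED_TERMS)
```

Which also makes the spoofing idea in #41 concrete: anyone who can write to the shared map can mark any coordinate protected.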

#47 ::: Ginger ::: (view all by) ::: March 26, 2009, 12:19 PM:

One of the biggest reasons for "blue-on-blue" (or "blue-on-civilian") is fatigue due to sleep deprivation. This is the area in which robots will have the advantage.

#48 ::: James D. Macdonald ::: (view all by) ::: March 26, 2009, 01:00 PM:

Given that the Conficker worm has already infected the Royal Navy and the Bundeswehr, the possibilities for external command-and-control are friggin' endless.

#49 ::: Scott Francis ::: (view all by) ::: March 26, 2009, 01:07 PM:

Eric K @ 32: in the webcomic/graphic novel "Shooting War" about an extended Iraq War, there's a chapter where the US military has sent a group of SWORDS-a-likes into an abandoned city to "sweep for combatants". At one point it briefly flashes to the telepresence room back in the States, showing a group of college-age soldiers wearing VR headsets and PlayStation-ish controllers, with one of them pumping his fist after a kill.

#50 ::: jblegaa ::: (view all by) ::: March 26, 2009, 01:15 PM:

Sam @ #26 and John @ #27. Tasty coincidence... I picked up an old volume of SciFi stories yesterday, and read Philip K. Dick's Second Variety. The story immediately after was The Ruum, which is the one about the specimen collector. Hmmmm.... again the Universe is trying to tell me something. I wish it would stop mumbling, already! What did you say? "Run away from the robots - now!" ??

#51 ::: Martin G. ::: (view all by) ::: March 26, 2009, 01:33 PM:

Actually, to my understanding the Ethical Governor just violated several laws of war. It allowed civilian casualties which it knew were in the blast zone. You can't drop bombs on civilians, even when there are "high value targets" in the area. If it knows there are civilians there, the algorithm should place infinite value on their lives. The usual defence for bombings with collateral damage these days is "we have no way of knowing the civilians are there".

#52 ::: ajay ::: (view all by) ::: March 26, 2009, 01:50 PM:

51: that's not quite right; the law of armed conflict forbids targeting civilians and indiscriminate attacks, but it doesn't forbid attacks on a military target simply because there might be civilian casualties, as long as the civilian casualties are proportionate to the military advantage. (So you can still bomb that enemy airbase, even if you know for a fact that there is a civilian sweeping the floor; but you can't blow up an entire bus just to kill the soldier who's on board.)

#53 ::: Dave Bell ::: (view all by) ::: March 26, 2009, 02:06 PM:

Jim @48

Windows for Warships.

'nuff sed.

#54 ::: Michael I ::: (view all by) ::: March 26, 2009, 02:13 PM:

Dave Bell@53

I'm sure it'll be every bit as reliable as Vista...

#56 ::: Earl Cooley III ::: (view all by) ::: March 26, 2009, 02:47 PM:

A pirated version of that software/hardware could be tweaked to seek out US federal buildings because they have daycare centers in them.

#57 ::: Serge's Computer ::: (view all by) ::: March 26, 2009, 03:34 PM:

Greetings. The Master Control Program has chosen you to serve your system on the Game Grid. Those of you who continue to profess a belief in the Users will receive the standard substandard training that will result in your eventual elimination. Those of you who renounce this superstitious and hysterical belief will be eligible to join the Warrior Elite of the MCP. Each of you will be given an identity disc.

#58 ::: Martin G. ::: (view all by) ::: March 26, 2009, 03:51 PM:

I see that I mashed up what I think should be the case and what the law actually is in my last comment. International Law says that civilians "enjoy general protection arising from military operations" and that "constant care shall be taken" to avoid targeting civilians (that's additional protocol 1 to the Geneva Conventions). There are also, as you know, Bob, strict rules against attacks that do not discriminate between civilian and combatant. My point was that really, awareness of civilians in the area should be a much stronger disincentive than it is here. In a sane world, there would be an infinite value placed on human lives in the Ethical governor (and, indeed, no need for an ethical governor). I don't see how you are taking constant care and giving general protection to civilians you know are there if you are blowing them to bits. I don't see how what the ethical governator here is getting all high and mighty on his ethical horse in the sky about is not a deliberately indiscriminate attack. That's why I think this video is interesting, because it seems to show what the bombing run planners are thinking when they're working out the utilitarian math. I think their values are wrong.

#59 ::: abi ::: (view all by) ::: March 26, 2009, 03:52 PM:

Serge's Computer @57:


#60 ::: Serge ::: (view all by) ::: March 26, 2009, 04:12 PM:

Abi @ 59... Are you a User?

#61 ::: NelC ::: (view all by) ::: March 26, 2009, 04:26 PM:

Madeline @41: I just saw an item on the evening news about the Israeli Army dropping white phosphorus shells on a Palestinian school, followed up by a single HE shell, which they claimed was aimed at a sniper.

#62 ::: albatross ::: (view all by) ::: March 26, 2009, 04:50 PM:

The fun part of a robot army, IMO, involves what happens when "civilian control of the military" changes from meaning "the Army won't go along with orders to detain or kill opposition political figures and seize control on behalf of some ambitious general" to meaning "people hopefully loyal to the constitution have control of the top-level crypto keys used to authorize control of the robot army."

It's interesting to ask how small the number of people needed for a coup might become at some point in the future. In an extreme case, you could imagine a single top-level public key which signs the tops of all the certificate chains and CRLs. One day, "key rollover" and "regime change" become the same event.
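albatross's point can be illustrated with a toy chain of trust, using HMAC as a stand-in for real public-key signatures (an actual deployment would use asymmetric signatures, certificates, and revocation lists; every key and name below is hypothetical):

```python
# Toy model of the point above: every robot obeys whatever chains up to one
# root key, so "key rollover" and "regime change" become the same event.
# HMAC here is a stand-in for real signatures; this is not a real PKI.

import hmac
import hashlib

def sign(key: bytes, msg: bytes) -> bytes:
    return hmac.new(key, msg, hashlib.sha256).digest()

root_key = b"constitutional-authority-2009"

# The root delegates authority by signing a commander's key
commander_key = b"theater-commander"
delegation = sign(root_key, commander_key)

def robot_accepts(order, order_sig, cmd_key, cmd_delegation, trusted_root):
    """Obey an order iff the commander's key chains to the root key the
    robot shipped with, and the order is signed by that commander."""
    chain_ok = hmac.compare_digest(cmd_delegation, sign(trusted_root, cmd_key))
    order_ok = hmac.compare_digest(order_sig, sign(cmd_key, order))
    return chain_ok and order_ok

order = b"hold position"
sig = sign(commander_key, order)
assert robot_accepts(order, sig, commander_key, delegation, root_key)

# Roll the root key and the entire existing chain of command goes dark:
new_root = b"whoever-now-holds-the-key"
assert not robot_accepts(order, sig, commander_key, delegation, new_root)
```

Every robot shipped with `root_key` obeys the old chain; swap the root and the whole existing command structure is instantly unauthorized, which is exactly the "key rollover = regime change" observation.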

#63 ::: Serge ::: (view all by) ::: March 26, 2009, 05:00 PM:

If you created robotic prostitutes, would you model some after Jude Law?

#64 ::: Sarah W ::: (view all by) ::: March 26, 2009, 05:23 PM:

Steve @ 33: Talk about adding insult to injury . . .

Serge @ 63: If I could get the rights to the Sky Captain or W. P. Inman models instead of, say, Harlen Maguire or good ol' Joe, then, yes. Yes, I would.

#65 ::: flowerytops ::: (view all by) ::: March 26, 2009, 05:35 PM:

I bought How To Survive A Robot Uprising for my husband, and it contains some handy tips for escaping pursuing robots.

#66 ::: Bruce Cohen (SpeakerToManagers) ::: (view all by) ::: March 26, 2009, 07:51 PM:

abi @ 59

That's Rebooté, John Rebooté, Lord Whorfin!

#67 ::: Serge ::: (view all by) ::: March 26, 2009, 08:03 PM:

flowerytops @ 65... handy tips for escaping pursuing robots

Even then, don't think you don't need Robot Insurance.

#68 ::: Spiny Norman ::: (view all by) ::: March 26, 2009, 11:44 PM:

Re. #65....

Zombie Apocalypse v. Robot Uprising. (Discuss.)

#70 ::: Steve C. ::: (view all by) ::: March 27, 2009, 10:02 AM:

Just thought of a bad, bad joke.

Q: What kind of lube do you use for a robotic ménage à trois?

A: Three-in-one oil!

#71 ::: Bruce Cohen (SpeakerToManagers) ::: (view all by) ::: March 27, 2009, 11:02 AM:


How many robots does it take to screw a lightbulb?

#72 ::: Erik Nelson ::: (view all by) ::: March 27, 2009, 11:54 AM:

Wouldn't three in one be a total of four?

#73 ::: Erik Nelson ::: (view all by) ::: March 27, 2009, 11:58 AM:

People of Massachusetts to be having sex with robots by 2012

(do they have rubber skin?)

#74 ::: Serge ::: (view all by) ::: March 27, 2009, 12:17 PM:

Sarah W @ 64... Then Spielberg knew what he was doing when he filmed A.I. Did anybody else recognize Williamson's Humanoids near the end of the movie? (I didn't like it that much, but it was better than Minority Report or War of the Worlds.)

#75 ::: Sarah W ::: (view all by) ::: March 28, 2009, 04:09 PM:

Bruce @ 71: Only two, but it's a tight fit(?)

Serge @74: In my opinion, he did. I was too busy having my heartstrings yanked about to notice any overt nods to Williamson, but it's been a loooong time since I've read that series. Guess I'm heading to the library!

#76 ::: Erik Nelson ::: (view all by) ::: March 28, 2009, 04:34 PM:

Article from the London Times about the need for robot ethics:
“We have to manage the ethics of the scientists making the robots and the artificial ethics inside the robots.”

#77 ::: Henry Troup ::: (view all by) ::: March 29, 2009, 03:44 PM:

#11 and others

The trick is, who gets to define "human"? - that's central to many h. sap committed atrocities, and therefore to automated ones. No soldier kills "civilians" - just "huns" or "gooks" or "yanks" or "rebs" or "paynim" - pick your war, pick your atrocity, pick your word. I wonder what rude term the Trojans used of the Greeks, and vice versa.


Dire legal notice
Making Light copyright 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020 by Patrick & Teresa Nielsen Hayden. All rights reserved.