The Skynet Funding Bill is passed. The system goes on-line August 4th, 1997. Human decisions are removed from strategic defense. Skynet begins to learn at a geometric rate. It becomes self-aware at 2:14 a.m. Eastern time, August 29th. In a panic, they try to pull the plug.
Oh, you're such a Nervous Nellie. Nothing can go wrong. Nothing can go wrong. Nothing can go wrong. Noth
It's only one or two rules away from being a vampire.
The only things missing seem to be:
"The Ethical Governor must always stop to pick up grains of rice spilled in conflict."
"If a high-priority target fails to invite the Ethical Governor, abiding by the laws of Vampirism, the Governor must not enter the target's domicile. Thankfully, international regulations concerning the behavior of vampires have thus far been silent on whether ordnance can be considered a part of the vampire, and thus the Governor may explode uninvited domiciles."
Are you talking about the "Governor", or the narrator?
So, "strategic defense" is how the army says "blowing things to small furless bits"?
Doubleplusgood.
Fear not! We are perfectly safe! Allied Mastercomputer will protect us!
Talking about a robot governor made me imagine it looking like Arnold Schwarzenegger.
I like how they've carefully chosen a narrator who, with his very tone of voice, lulls us into a peaceful sleep in which we dream of utopian future developments: the ethical medical advisor who decides who gets euthanasia and who doesn't, and the automated disaster-area evacuation bot which, in the event of limited evacuation capacity, decides according to an infallible moral engine who is rescued and who is left behind.
What could possibly go wrong?
Human: "Multivac, is there a God?"
Multivac: "THERE IS NOW."
How does this compare to Asimov's Three Laws of Robotics?
1 Not injure a human.
Well, the "ethical governor" might want to limit civilian casualties, but they're just a thing to minimize if you can help it.
2 Obey orders from humans.
Not exactly. It obeys one source of orders, and in some builds of the system that source may not be human. Perhaps the orders are even considered to be just a set of guidelines.
3 Protect its own existence.
Not necessarily. It's a weapon that is where it is partly because it's more dispensable than humans. And though expensive it may be an affordable casualty. But it is answerable to a bureaucracy, which is a thing that has the purpose of protecting its own existence as a whole.
I seem to recall that a governor is a doohickey on a motor that keeps it from going too fast. This gives me some hope of outrunning the robots, as long as they're outfitted with them.
I don't believe that something slowly shambling will catch up with me running. That's just movie stuff.
Kip W @ #12,
"Noah."
"What?"
"How long can you tread water?"
It's soooooooo cute! I think that I will name it "ED-209."
Ah, double effect. Or: how to fire into a crowd without meaning to kill anyone.
Essentially, it seems like he's trying to say: "Don't worry! We can program context sensitivity! Even in a device without adaptive intelligence!"
Also, I'd like to see these "laws of war." Which laws? From whose war? How many sets of rules can the Ethical Governor handle? How does each set get prioritized? We have a hard enough time "programming" these things into organic soldiers, so why should it be easier to do so with synthetic ones?
I'm actually doing a panel on robots this weekend, so thanks for linking this. I'll be sure to point everybody your way just before getting my rant on.
Trust the Computer. The Computer is your friend.
What could possibly go wrong?
Granted, there's the tiniest chance that something could go wrong. But nothing that can't be solved by engineering larger, more powerful robots.
I wonder if the Ethical Governor is hooked up to a camera with a spectral detector that can measure melanin? It *is* a project from Georgia Tech, after all...
@21;
Nah . . . but phrenology, now . . .
I kept expecting Arnold to show up at any minute. The lack of Austrian accent and robot body parts was disappointing.
Spiny Norman @ 20:
Looking at the staff page for the Mobile Robotics Lab, I'd say that's perhaps a teensy bit uncalled for...
It could be worse.
At least the better human won the election.
Luthe@24: That one's waiting for the Second Variety, I suspect.
Kip W,
While you may be able to outrun something that only moves at a fast walk, if it can maintain that speed continuously it WILL catch you unless you're a marathon runner.
There was a sci-fi short story written about such a thing, in fact. An alien collector robot lands on Earth, programmed to stun and capture animals of a certain size. It could only move at the speed of a fast walk, but it could do so over any terrain and didn't need to rest. A human hunter stumbled upon it and spent a frenzied 24 hours trying to stay ahead of it. He failed, but by the time it caught him the human had lost so much weight that he was now below the weight limit, so he was not collected.
Trust the Computer. The Computer is your friend.
You do trust the Computer, don't you, citizen?
We're going to see a lot more of this kind of thing. A rational person might think that the horrors of war would be enough incentive to avoid it whenever possible, but the substitution of robotic, semi-autonomous weaponry for flesh & blood soldiers just makes military action that much more likely to occur.
27: "The Ruum".
I don't believe that something slowly shambling will catch up with me running.
There is also the possibility, neglected by Kip, that the pursuing Terminator might simply hail a cab. ("FOLLOW DAT HUMAN." "Okay, buddy." "HOW ABOUT DEM RAIDERS, HUH?")
Actually, thinking about it, deciding that Terminators can't walk faster than five miles an hour has rich comic potential, with Sarah Connor being chased all over California by a Terminator in a taxi, on a mountain bike, on roller-blades, on skis, on a bus, by FedEx (in a suitably-sized crate or padded envelope) etc.
Steve C. @ 30: Right now, our wars are fought by members of the US military, who put their own lives on the line.
<sarcasm>It's going to be so much better when our wars are fought by the kind of people who love the glory of war, but who would never put themselves in harm's way.</sarcasm> Just imagine a 2004-era warblogger with an "ethical" UAV and a video game controller.
I may have mentioned it before, but I still love the scene in the original Terminator movie which had a POV shot from Arnold's cybernetic eye displaying computer code. One of the shots was the beginning of a COBOL program. In addition to being an efficient killing machine, the Terminator also processed your payroll.
33: Obviously, the original Terminators were designed for human resources departments, in order to handle the tricky task of telling employees that they had been fired (hence the name). That's the only possible reason why they were built to look human. It's a stupid shape for an infiltrator robot.
Think about it: which is easier, building a machine that can be mistaken for a human, or building a machine that can be mistaken for a toaster?
Ajay @ 34 -
Good point. (Eyeing my iPod suspiciously)
You know, it's not like humans are so gosh-darned good at following the rules of engagement that I can't see the value of having some intelligence on the battlefield that's capable of following them without hearing things like "They just killed my buddy!" or "Hospital schmospital! Enemy soldiers, the lot of them!" echoing through their heads.
It's not actually true that those who don't learn from science fiction movies are doomed to live through them.
I much prefer Summer Glau as a Terminator.
Steve C. @ 35:
Your iPod looks like a toaster? Is it one of the first generation models?
Now, if you'll excuse me, I have this broken and seemingly indestructible portable television to try to fix.
Hmm, impressive video. They just demonstrated that the rules of war are — or maybe that should be "can be" — so simple that a machine can follow them, provided someone else has done the hard work of identifying and classifying targets.
While badly trained or led soldiers who don't care about the rules of war are a problem, the majority of blue-on-blue or blue-on-$civiliancolour incidents are due to soldiers who are unable to identify and classify targets in the heat of battle.
KeithS @ 38... Your iPod looks like a toaster? Is it one of the first generation models?
Hopefully Steve C's iPod isn't this model.
You might be able to fool the satellites by shuffling children and ambulances to false locations all over town, so that when the UAV downloads the latest feed it'll have to make "ethical" decisions based on false data that makes it look as though there are schools and hospitals everywhere. In a weird way, developments like this could utterly change the infrastructure and organization of communities under attack.
Madeline Ashby @ 41:
The system shown in the video doesn't appear to really be all that intelligent, no matter what the speaker says. All the landmarks, such as hospitals, apartment buildings and so on, are pre-defined by the human operator. From there it's a simple calculation of the biggest attack it can get away with.
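[Ed.: the calculation described above can be caricatured in a few lines. Everything here — the site names, the damage weights, the 0.5 threshold — is invented for illustration; the video does not show the system's actual internals.]

```python
# Hypothetical sketch of a "biggest attack it can get away with" rule.
# All landmarks, weights, and thresholds are invented, not from the video.

# Operator-supplied map: each protected site carries a damage weight.
protected_sites = {
    "hospital": 1.0,
    "apartment_block": 0.8,
}

# Candidate weapons: a blast-effect score, plus the protected sites
# that blast radius would touch.
candidates = [
    {"weapon": "small_missile", "effect": 0.3, "hits": []},
    {"weapon": "large_missile", "effect": 0.9, "hits": ["apartment_block"]},
    {"weapon": "bomb", "effect": 1.0, "hits": ["hospital"]},
]

def permitted(option, threshold=0.5):
    """Allow a strike only if total protected-site damage stays under the threshold."""
    return sum(protected_sites[s] for s in option["hits"]) < threshold

# The "ethics": pick the maximum effect among the permitted options.
best = max((c for c in candidates if permitted(c)), key=lambda c: c["effect"])
print(best["weapon"])  # prints "small_missile"
```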
KeithS: That's only the first generation. Next up: the drone uses shape recognition, movement analysis and motion prediction to determine who's an enemy, who's a friendly, and who's a non-combatant. Of course it would have to be able to modify its identifications as necessary: look, that non-combatant is throwing something into the crowd, must be a bomber! Oops, sorry, we zapped the bride as she threw the bouquet. Guess none of them are getting married now.
There's another thing that bothers me. The governor has to be field-programmable, so that software changes and specific local rules can be added, right? So what's to prevent one of the techs, or the unit commander, from changing the rules so there aren't any non-combatants? The "Kill them all, let God sort them out" rule.
TexAnne @ 18:
Stay Alert! Trust No One! Keep Your Laser Handy!
Bruce Cohen @ 43:
Nothing, of course. It's still a human endeavor, with a human requirement to make decisions, ethical and otherwise. One advantage of this system is that you can make decisions ahead of time instead of making a bad decision in the heat of the moment. A large disadvantage is that it doesn't adapt to changing conditions in the field the same way that a human can.
Bruce@43, you read my mind. Thanks.
This is also one of those instances where locative tagging might be extremely useful. If field programmers at either end of the UAV's area are constantly cloud communicating about the same map, they can just tag coordinates with things like "school for the disabled" (or "porn shop" or "best price on sweetened condensed milk"). That way the UAV can just watch for flagged terms when it evaluates the locative data.
One of the biggest reasons for "blue-on-blue" (or "blue-on-civilian") is fatigue due to sleep deprivation. This is the area in which robots will have the advantage.
Given that the Conficker worm has already infected the Royal Navy and the Bundeswehr, the possibilities for external command-and-control are friggin' endless.
Eric K @ 32: in the webcomic/graphic novel "Shooting War" about an extended Iraq War, there's a chapter where the US military has sent a group of SWORDS-a-likes into an abandoned city to "sweep for combatants". At one point it briefly flashes to the telepresence room back in the States, showing a group of college-age soldiers wearing VR headsets and PlayStation-ish controllers, with one of them pumping his fist after a kill.
Sam @ #26 and John @ #27. Tasty coincidence... I picked up an old volume of SciFi stories yesterday, and read Philip K. Dick's Second Variety. The story immediately after was The Ruum, which is the one about the specimen collector. Hmmmm.... again the Universe is trying to tell me something. I wish it would stop mumbling, already! What did you say? "Run away from the robots - now!" ??
Actually, to my understanding the Ethical Governor just violated several laws of war. It allowed casualties among civilians it knew were in the blast zone. You can't drop bombs on civilians, even when there are "high value targets" in the area. If it knows there are civilians there, the algorithm should place infinite value on their lives. The usual defence for bombings with collateral damage these days is "we have no way of knowing the civilians are there".
51: that's not quite right; the law of armed conflict blocks targeting civilians and indiscriminate attacks, but it doesn't block attacks on a military target simply because there might be civilian casualties, as long as the civilian casualties are proportionate to the military advantage. (So you can still bomb that enemy airbase, even if you know for a fact that there is a civilian sweeping the floor; but you can't blow up an entire bus just to kill the soldier who's on board.)
Jim @48
Windows for Warships.
'nuff sed.
Dave Bell@53
I'm sure it'll be every bit as reliable as Vista...
Windows NT Leaves Navy Ship Dead in Water.
The sea-blue screen of death?
A pirated version of that software/hardware could be tweaked to seek out US federal buildings because they have daycare centers in them.
Greetings. The Master Control Program has chosen you to serve your system on the Game Grid. Those of you who continue to profess a belief in the Users will receive the standard substandard training that will result in your eventual elimination. Those of you who renounce this superstitious and hysterical belief will be eligible to join the Warrior Elite of the MCP. Each of you will be given an identity disc.
I see that I mashed up what I think should be the case and what the law actually is in my last comment. International Law says that civilians "enjoy general protection arising from military operations" and that "constant care shall be taken" to avoid targeting civilians (that's Additional Protocol 1 to the Geneva Conventions). There are also, as you know, Bob, strict rules against attacks that do not discriminate between civilian and combatant.

My point was that really, awareness of civilians in the area should be a much stronger disincentive than it is here. In a sane world, there would be an infinite value placed on human lives in the Ethical Governor (and, indeed, no need for an ethical governor). I don't see how you are taking constant care and giving general protection to civilians you know are there if you are blowing them to bits. I don't see how what the ethical governator here is getting all high and mighty on his ethical horse in the sky about is not a deliberately indiscriminate attack.

That's why I think this video is interesting, because it seems to show what the bombing run planners are thinking when they're working out the utilitarian math. I think their values are wrong.
Madeline @41: I just saw an item on the evening news about the Israeli Army dropping white phosphorus shells on a Palestinian school, followed up by a single HE shell, which they claimed was aimed at a sniper.
The fun part of a robot army, IMO, involves what happens when "civilian control of the military" changes from meaning "the Army won't go along with orders to detain or kill opposition political figures and seize control on behalf of some ambitious general" to meaning "people hopefully loyal to the constitution have control of the top-level crypto keys used to authorize control of the robot army."
It's interesting to ask how small the number of people needed for a coup might become at some point in the future. In an extreme case, you could imagine a single top-level public key which signs the tops of all the certificate chains and CRLs. One day, "key rollover" and "regime change" become the same event.
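[Ed.: the single-root-key scenario above can be caricatured in a few lines. HMAC stands in for real public-key signatures, and every key and name here is invented for illustration.]

```python
# Toy illustration: whoever holds the root key holds the robot army.
# HMAC is a stand-in for real public-key signatures; all keys are invented.
import hashlib
import hmac

def sign(key: bytes, message: bytes) -> bytes:
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(key: bytes, message: bytes, sig: bytes) -> bool:
    return hmac.compare_digest(sign(key, message), sig)

root_key = b"constitutional-authority"

# The root signs an intermediate "unit commander" key...
unit_key = b"unit-42-key"
unit_cert = sign(root_key, unit_key)

# ...and a robot accepts an order only if the whole chain checks out.
def robot_accepts(order: bytes, order_sig: bytes, key: bytes, cert: bytes) -> bool:
    return verify(root_key, key, cert) and verify(key, order, order_sig)

order = b"hold position"
assert robot_accepts(order, sign(unit_key, order), unit_key, unit_cert)

# "Key rollover" == "regime change": swap the root, and every old chain dies.
root_key = b"new-junta"
assert not robot_accepts(order, sign(unit_key, order), unit_key, unit_cert)
```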
If you created robotic prostitutes, would you model some after Jude Law?
Steve @ 33: Talk about adding insult to injury . . .
Serge @ 63: If I could get the rights to the Sky Captain or W. P. Inman models instead of, say, Harlen Maguire or good ol' Joe, then, yes. Yes, I would.
I bought How To Survive A Robot Uprising for my husband, and it contains some handy tips for escaping pursuing robots.
abi @ 59
That's Rebooté, John Rebooté, Lord Whorfin!
flowerytops @ 65... handy tips for escaping pursuing robots
Even then, don't think you don't need Robot Insurance.
Re. #65....
Zombie Apocalypse v. Robot Uprising. (Discuss.)
Just thought of a bad, bad joke.
Q: What kind of lube do you use for a robotic ménage à trois?
A: Three-in-one oil!
#70:
How many robots does it take to screw a lightbulb?
Wouldn't three in one be a total of four?
http://gizmodo.com/gadgets/robot-sex/people-of-massachusetts-to-be-having-sex-with-robots-by-2012-310568.php
People of Massachusetts to be having sex with robots by 2012
(do they have rubber skin?)
Sarah W @ 64... Then Spielberg knew what he was doing when he filmed A.I. Anybody else recognize Williamson's Humanoids near the end of the movie? (I didn't like it that much, but it was better than Minority Report or War of the Worlds.)
Bruce @ 71: Only two, but it's a tight fit(?)
Serge @74: In my opinion, he did. I was too busy having my heartstrings yanked about to notice any overt nods to Williamson, but it's been a loooong time since I've read that series. Guess I'm heading to the library!
http://www.timesonline.co.uk/tol/news/uk/article675984.ece
Article from the London Times about the need for robot ethics.
“We have to manage the ethics of the scientists making the robots and the artificial ethics inside the robots.”
#11 and others
The trick is, who gets to define "human"? That's central to many h. sap.-committed atrocities, and therefore to automated ones. No soldier kills "civilians" - just "huns" or "gooks" or "yanks" or "rebs" or "paynim" - pick your war, pick your atrocity, pick your word. I wonder what rude term the Trojans used of the Greeks, and vice versa.