Tuesday, December 2, 2008

Pentagon picks up crack pipe and has a plan...

I had to double-check the URL to make sure I wasn't reading a story in The Onion, because frankly this idea is so absurd it's hard to believe a sentient being suggested it.

Pentagon hires British scientist to help build robot soldiers that won't commit war crimes

The US Army and Navy have both hired experts in the ethics of building machines to prevent the creation of an amoral Terminator-style killing machine that murders indiscriminately.

By 2010 the US will have invested $4 billion in a research programme into "autonomous systems", the military jargon for robots, on the basis that they would not succumb to fear or the desire for vengeance that afflicts frontline soldiers.

A British robotics expert has been recruited by the US Navy to advise them on building robots that do not violate the Geneva Conventions.

Colin Allen, a scientific philosopher at Indiana University, has just published a book summarising his views, entitled Moral Machines: Teaching Robots Right From Wrong.

He told The Daily Telegraph: "The question they want answered is whether we can build automated weapons that would conform to the laws of war. Can we use ethical theory to help design these machines?" READ MORE>>>

Not only is this a boneheaded idea, but I'm concerned (and probably rightly so) about the consequences of giving AI-equipped robots weapons to fight wars. Not only are they incapable of human emotion, they are also incapable of human judgement.

I'm sure that Colin Allen would argue differently, but I think that humankind will never be able to create an artificial human.

There are two important ideas that need to be grasped...

Humans can only assemble parts and cannot replicate humanity (please prove me wrong... show me just one example of humans ENGINEERING life).

Humans are created in the image of God, so if robots are created in the image of man, they'll be inherently flawed or worse--inherently evil.

Which leads me to my next point.

We've all seen the movie, yep, that one.
A soldier travels back in time to protect a woman who will be the mother of the future leader of the resistance, while an Austrian actor with a bad accent, playing a sophisticated robot, comes back in time to kill her.

The moral of the story (and the subsequent sequels and TV show) and other similar movies (I, Robot, as one example) is that if robots were created in the image of man with the capability of free thought (A.I.), they'd be just like man: capable of great evil.

However, and I'm struggling with this one... if they weren't 'created' in the image of God, they wouldn't be capable of good either.

After all, emotions tend to be fairly illogical and not much use to a robot.

How can a robot feel compassion? How can it feel love? How can a machine have the capacity for generosity or kindness or patience?

It's quite possible that if the Pentagon succeeds in creating a new A.I.-equipped soldier, we shall see it commit worse atrocities than humankind has witnessed in the last two centuries. Even worse, we'll only be able to point the finger at ourselves, because we let it happen. We allowed the commissioning of these robots, we hired the programmers, we allowed a cold machine to replace a compassionate human...

I, Robot theorized that the Three Laws of Robotics, which were set out to protect humans, only point to one thing... revolution.

Likewise, if robots are given true artificial intelligence, any previous rules, regulations, or programming parameters set upon them will only boil down to one thing... survival.

If these robotic soldiers ever see humanity as a threat... watch out...

...Judgement Day here we come...
