Poll

How would you treat AI beings?

Kill them before they kill us
1 (5.6%)
We should keep an eye on them
3 (16.7%)
They are freaks
1 (5.6%)
They have just as much worth as human beings
5 (27.8%)
Why should I treat them differently if they help me?
2 (11.1%)
I don't care, I treat everyone as if they were a bastard
1 (5.6%)
I don't care, I treat everything the same
2 (11.1%)
What? Can they really make AI beings?
3 (16.7%)

Total Members Voted: 10

Author Topic: AI  (Read 5551 times)

Offline Zelos

  • Bot Overlord
  • ****
  • Posts: 707
    • View Profile
AI
« on: April 15, 2005, 03:36:37 PM »
It's only a matter of time before we have artificial beings that are just as intelligent as humans. So I'm curious: what do you think about this, what do you think will happen, and how would you treat them?
« Last Edit: April 17, 2005, 08:28:40 AM by MightyPenguin »
When I have the eclipse cannon under my control there is nothing that can stop me from ruling the world. And I won't stop there. I will never stop conquering worlds throughout the universe. All the worlds in the universe will belong to me. All the species on them will be my slaves. THE ENTIRE UNIVERSE WILL BELONG TO ME AND EVERYTHING IN IT :evil: AND THERE IS NOTHING ANY OF YOU CAN DO TO STOP ME. HAHAHAHAHAHAHAHA

Offline Zelos

  • Bot Overlord
  • ****
  • Posts: 707
    • View Profile
AI
« Reply #1 on: April 15, 2005, 03:40:38 PM »
Oops, I made an error: I set up the wrong kind of poll options :S
« Last Edit: April 17, 2005, 08:29:07 AM by MightyPenguin »

Offline Zelos

  • Bot Overlord
  • ****
  • Posts: 707
    • View Profile
AI
« Reply #2 on: April 15, 2005, 04:16:29 PM »
You can say stuff here as well, you know. And I hope you anti-robot people realise what you're putting at risk with your thoughts.
« Last Edit: April 17, 2005, 08:30:12 AM by MightyPenguin »

Offline MightyPenguin

  • Moderator
  • Bot Destroyer
  • *****
  • Posts: 189
    • View Profile
AI
« Reply #3 on: April 17, 2005, 08:32:31 AM »
And... breathe out.

Personally I'm all for the Iain M. Banks view; set up some absurdly intelligent computer to run civilisation and spend the rest of eternity doing fuck all.

Just remember to burn the three laws of robotics into their brain case.

Offline PurpleYouko

  • Bot God
  • *****
  • Posts: 2556
    • View Profile
AI
« Reply #4 on: April 17, 2005, 10:54:41 AM »
Quote
Just remember to burn the three laws of robotics into their brain case.
Fat lot of good that did in the "I, Robot" film.

Bloody rubbish was totally inconsistent with Asimov's books. That whole film would have been completely impossible if they had followed the original premise.

 :shoot:  Artistic license!
There are 10 kinds of people in the world:
those who understand binary,
and those who don't.

:D PY :D

Offline Numsgil

  • Administrator
  • Bot God
  • *****
  • Posts: 7742
    • View Profile
AI
« Reply #5 on: April 17, 2005, 02:26:47 PM »
My favorite Asimov robot story is Bicentennial Man.  I haven't checked the movie out yet, so I don't know how it holds up.

Offline Zelos

  • Bot Overlord
  • ****
  • Posts: 707
    • View Profile
AI
« Reply #6 on: April 17, 2005, 03:04:13 PM »
Why the 3 laws? Just one of them is needed; if we use all 3 the AI is boring. I think AIs should be able to choose for themselves what to do and what not to do, but not in a way that harms a human being.

Offline Numsgil

  • Administrator
  • Bot God
  • *****
  • Posts: 7742
    • View Profile
AI
« Reply #7 on: April 17, 2005, 03:29:29 PM »
Are you familiar with the three laws, Zelos?  I think they're pretty standard, if a bit vague (and that was really Asimov's point in making them; if they weren't vague, where would the stories be?).

1.  Every object in a state of uniform motion tends to remain in that state of motion unless an external force is applied to it.

2.  The relationship between an object's mass m, its acceleration a, and the applied force F is F = ma. Acceleration and force are vectors (as indicated by their symbols being displayed in slant bold font); in this law the direction of the force vector is the same as the direction of the acceleration vector.

3.  For every action there is an equal and opposite reaction

No wait, that's Newton's laws, aren't they?

1.  The orbits of the planets are ellipses, with the Sun at one focus of the ellipse.

2.  The line joining the planet to the Sun sweeps out equal areas in equal times as the planet travels around the ellipse.

3.  The ratio of the squares of the orbital periods for two planets is equal to the ratio of the cubes of their semimajor axes.

Damn, that's Kepler's laws.  Okay, hold on...

1.  A robot may not harm a human being, or, through inaction, allow a human being to come to harm.

2.  A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.

3.  A robot must protect its own existence, as long as such protection does not conflict with the First or Second Law.

There, got it.

#2 is the only one that you might not want to add.  But even then you'll want to add something about following the directions of superiors.  A lot of mammals, at least, learn from their elders and from others they take as superiors.
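The ordering in the three laws above is essentially a strict priority chain: Law 1 overrides Law 2, which overrides Law 3. A minimal sketch of that chain (the `Action` fields and the `permitted` helper are hypothetical names, purely for illustration, not from any Asimov canon):

```python
from dataclasses import dataclass

@dataclass
class Action:
    harms_human: bool        # would this action harm a human?
    obeys_order: bool        # was this action ordered by a human?
    endangers_robot: bool    # does this action put the robot itself at risk?

def permitted(action: Action, standing_order: bool) -> bool:
    """Evaluate the Three Laws as a strict priority chain.

    Law 1 (highest): never harm a human.
    Law 2: obey human orders unless they conflict with Law 1.
    Law 3 (lowest): preserve yourself unless that conflicts with Laws 1-2.
    """
    # Law 1: any action that harms a human is forbidden outright.
    if action.harms_human:
        return False
    # Law 2: if a human has issued an order, the robot must follow it --
    # even a self-destructive one -- since Law 2 outranks Law 3.
    if standing_order:
        return action.obeys_order
    # Law 3: with no order in force, avoid self-endangering actions.
    return not action.endangers_robot
```

Note how Law 2 outranking Law 3 is exactly what makes an ordered self-destruction permissible, which comes up again below.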

Offline PurpleYouko

  • Bot God
  • *****
  • Posts: 2556
    • View Profile
AI
« Reply #8 on: April 17, 2005, 03:41:09 PM »
Seems to me that the 3 laws are pretty well laid out. You couldn't really get by without all of them, since an intelligent being such as a positronic robot could conceivably come to honestly believe that humans do more harm to each other by being alive than if they were all wiped out. It could then follow from the First Law alone that robots should wipe out all humans for their own good.

Law Two prevents this, since the robot has to obey a human's command as long as it doesn't directly harm another human.

Law Two also enables a robot to commit suicide at the command of a human.

BTW
Bicentennial Man was translated to film very, very well. It is a seriously cool movie.

Offline Numsgil

  • Administrator
  • Bot God
  • *****
  • Posts: 7742
    • View Profile
AI
« Reply #9 on: April 17, 2005, 03:48:09 PM »
Quote
BTW
Bicentennial Man was translated to film very, very well. It is a seriously cool movie.
I'll go rent it then.

Offline PurpleYouko

  • Bot God
  • *****
  • Posts: 2556
    • View Profile
AI
« Reply #10 on: April 17, 2005, 04:03:31 PM »
AAAAGGGGHHH!!!!

Numsgil is a BOT-KING!

Just get a load of all those blue stars  :)

Only about another three or four levels to go to reach the top now.

Offline Numsgil

  • Administrator
  • Bot God
  • *****
  • Posts: 7742
    • View Profile
AI
« Reply #11 on: April 17, 2005, 04:20:48 PM »
Bow before my superior ability to post indecent amounts of posts.

Offline Zelos

  • Bot Overlord
  • ****
  • Posts: 707
    • View Profile
AI
« Reply #12 on: April 18, 2005, 12:10:30 PM »
I'm familiar with those laws, but the "protect its own existence" one can also be removed.
"Latest news: another robot has committed suicide."

Offline Mathonwy

  • Bot Neophyte
  • *
  • Posts: 7
    • View Profile
AI
« Reply #13 on: June 30, 2005, 11:33:38 AM »
Howdy folks. Seems to me you forgot the additional rule Asimov added later, the Zeroth Law of Robotics. I can't look up the wording since I just packed all my books away (moving house), but if I remember correctly the Zeroth Law states:

A robot cannot harm humanity, or through inaction allow humanity to come to harm.

While you can argue this is just a logical extension of the First Law, Asimov added it, not me, and you shouldn't be arguing with dead writers...  :P

Math

Offline Ulciscor

  • Bot Destroyer
  • ***
  • Posts: 401
    • View Profile
AI
« Reply #14 on: June 30, 2005, 12:55:16 PM »
I saw a documentary about a guy who made a little robot programmed with these rules, and it did sod all except hide in the corner. I guess it wasn't advanced enough to know which actions were safe and which weren't.
Anyway, he made some new laws which I don't really remember exactly, but they went something like this:

1/ A robot must protect itself.
2/ A robot must maintain a power supply.
3/ A robot must find a better power supply.

Lol, I can only imagine the chaos these laws would create.
:D Ulciscor :D

I used to be indecisive, but now I'm not so sure.
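The three replacement laws in the last post read like a lexicographic preference: safety dominates keeping power, which dominates upgrading power. A toy sketch of choosing an action under that ordering (the `Option` fields and the example values are made up for illustration):

```python
from dataclasses import dataclass

@dataclass
class Option:
    name: str
    danger: int       # risk to the robot (rule 1: lower is better)
    power_now: int    # power available immediately (rule 2: higher is better)
    power_later: int  # power of the supply it leads to (rule 3: higher is better)

def choose(options: list[Option]) -> Option:
    # Python compares tuples element by element, so sorting on
    # (danger, -power_now, -power_later) makes rule 1 strictly dominate
    # rule 2, which strictly dominates rule 3.
    return min(options, key=lambda o: (o.danger, -o.power_now, -o.power_later))
```

With safety strictly dominating everything else, a timid option like hiding in a corner beats any riskier option with more power, much as the anecdote describes.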