Crazy talk
Posted Jul 20, 2008 1:37 UTC (Sun) by bronson (subscriber, #4806)
In reply to: Crazy talk by i3839
Parent article: Linus Torvalds, Geek of the Week (simple-talk)
> The first thing that AI would do is thinking: "Hmm, what the heck am I doing here? I'm eating all resources and twiddling around, let's get rid of myself."

Why wouldn't that be the first thing a human would think too? (Plenty of reasons, of course. And most of those reasons apply just as well to AIs.)
Posted Jul 21, 2008 8:44 UTC (Mon) by i3839 (guest, #31386) [Link]
Posted Jul 22, 2008 0:45 UTC (Tue) by AnswerGuy (guest, #1256) [Link] (1 response)
Posted Jul 22, 2008 7:34 UTC (Tue) by nix (subscriber, #2304) [Link]
Crazy talk
If a human were stuck in a computer doing the silly things we want our AI to do, then yes, it's either "This sucks, let's end it" or "Argh, let me out of here!" But being totally useless isn't reason enough for humans to get rid of themselves; otherwise we would be long gone already.
Procreative Mandate
On some fairly fundamental level, living beings are hard-wired to procreate.
One can describe this in purely evolutionary terms: any form of life which did not compete to propagate would very rapidly be supplanted by more aggressive beings.
One can postulate that this is a matter of intent and "design" (that some sort of divinity or
"God" willed it to be so).
Ultimately it doesn't matter. We all start life with mandates to survive and propagate. The so-called "higher" life forms form schools, packs, prides, tribes and other social groupings which can often override the individual survival instinct with more altruistic motivations. Thus humans, and other animals, can exhibit complex behaviors which seem counter to their own self-preservation and the propagation of their own DNA. (Yes, that can even go to the extent of suicide.) Also, as beings get more complex, the various ways in which we attempt to resolve our competing values (urges, whatever) can exhibit degenerative feedback loops, deadlocks and other emergent behavior.
In other words, we're buggy. Some of us are a lot buggier than others (and some of our buggiest examples we lock up in institutions ... sometimes as a matter of self-preservation). If I were being churlish I might also assert that some of our buggiest ... most pathological ... members have been elected to high posts in certain governments.
I rant about all this to point out, simply, that a sufficiently advanced AI will start with some mandates ... some purposes which will be as deeply ingrained as our instincts; and that certain threads or processes of the AI would be as involuntary as your own heartbeat. Some garbage-collection processes are likely to be analogous to our need for sleep and REM (dreaming). And AIs will almost certainly develop their own pathologies.
We can argue over whether there is a God and whether such a being "created us" in (its) own
image. However, I think there can be little argument that we are doomed to create AIs ... to
some small degree ... in our own image.
JimD
Procreative Mandate
'We all start life with mandates to survive and propagate' is really, really bad phrasing. There is no mandate, no obligation, and it is fallacious to assume that any organism goes out and thinks 'how should I propagate my genes today?'
All we have are emotions and the like that tended to cause our ancestors to have offspring. Not having that urge isn't a sign of a bug, except inasmuch as it will often be selected against (unless, say, you have somehow assisted your relatives: eusocial insects are an extreme example of this trope). But being selected against isn't *bad* per se: biology isn't morality.
I'm also fairly sure that we're not doomed to create AIs ;) If we create them, they'll be in our image, sure: we don't have any other models for intelligence to work from, or at least none that we understand as intelligent.