Re: They also have a forced locality
Posted By: howekern
Date: 4/5/05 4:03 p.m.

In Response To: They also have a forced locality (MrHen)

: Something I've noticed is that the AI in Marathon never exist in more than
: one place. They often refer to themselves as a physical thing (such as
: when Leela was being shipped to the Phor homeworld, or the way Durandal
: keeps flying about the universe).

: On the Durandal page there is a quote from Jason Jones: Immediately after
: assuming control of the ship, Durandal downloaded
: his entire personality and left with all speed with the S'pht in
: search of the compiler's homeworld, leaving the Tau Ceti to its own
: devices.

: The AI in Marathon are basically disembodied personalities. It is strange
: that if Durandal can download his personality he couldn't have copied it
: and made two of him running around... *shudder*

: For some reason, AI's can't be in two places at once. They cannot be simply
: programs. There is something else going on as far as hardware, I agree.
: But if Durandal can just shoomp from one network to the next, how can it
: be tied to any specific hardware?

This is not necessarily true. Consider a LiveCD Linux distribution, and how you determine its identity. I can have Knoppix on a CD and physically move it around, so in some sense the OS is tied to that hardware. Or I could just as easily make a disk image and move that around logically over my network. You can move data either way, by moving it logically or by moving the medium it's stored on; the AIs don't seem to be an exception. You can move Leela around by carrying her hardware from point A to point B, or you can move Durandal around logically by copying him over a network or several networks. This isn't contradictory; I can do the same thing with Feather Linux.
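The same distinction holds for any stored state: move the medium, or move the bytes, and the data's identity is unchanged either way. A minimal sketch in Python (the filenames here are hypothetical stand-ins for a real disk image):

```python
import hashlib
import os
import shutil
import tempfile

# Create a stand-in "disk image": some state tied to a file on one medium.
workdir = tempfile.mkdtemp()
original = os.path.join(workdir, "knoppix.img")
with open(original, "wb") as f:
    f.write(b"bootable system state" * 1024)

# "Logical" move: copy the bytes to another location (e.g., over a network),
# as opposed to physically carrying the CD from point A to point B.
copy = os.path.join(workdir, "knoppix-copy.img")
shutil.copyfile(original, copy)

def digest(path):
    """Hash a file's contents to check data identity."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# The data is the same whether it travelled by wire or by hand.
print(digest(original) == digest(copy))  # True
```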

: The global network part of rampancy is also interesting.
: Theoretically, testing Rampancy should be easily accomplished
: in the laboratory, but in fact it has never successfully been
: attempted. The confinement of the laboratory makes it
: impossible for the developing Rampant AI to survive. As the
: growing recursive programs expand with exponential vivacity,
: any limitation negatively hampers growth. Since Rampant AIs
: need a planetary sized network of computers in order to grow,
: it is not feasible to expect anyone to sacrifice a world-web
: just to test a theory.

: Why would it need a planet? If it needs an entire grid like that to grow, how
: can Durandal keep limiting himself by transferring from ship to ship? The
: Marathon is big, but not big enough. This is noted by Leela and that's why
: she wants Durandal off.

: How could Durandal have transferred himself somewhere larger than the
: Marathon?

We're not given any timescale for the growth. Rampancy and its growth requirements may develop slowly, over a period of years; Durandal might not need a planetary-sized network for years yet. Also, bear in mind that the quote describes a human planetary network. Just because the Pfhor can't build AIs doesn't mean that every aspect of their computational technology is inferior. A scout ship might very well represent the same reserve of resources as the sum of the networks covering an entire human world, or at least enough of a network to sustain him for a while.

: Even still, they said that rampancy began in the latter part of the
: twenty first century . I doubt that all computers on Earth suddenly became
: neural network nodes between now and then. Even if you want to say Traxus
: IV just turned Earth into a huge parallel computer, it wouldn't explain
: how the AI's (more specifically rampant ones) can move.

: The only hint given about Traxus' hardware was this: He was finally dealt
: with by a
: complete shutdown of his host net.

: Which is strange because that's the first thing I think of when I imagine an AI
: taking over the world... shut it down. Is there something that governs
: these AI so they can't be rebooted? Is taking out Traxus' host net akin to
: shutting down the brain of a biological sentient?

I think "host net" might be a fairly important phrase there. To me, it implies that the AIs consist of core, "host" systems plus less critical, supporting ones. For example, perhaps the "neural net" part either requires or can make use of external, non-sentient systems, like traditional computer systems, to support its thought processes.
To me, this idea integrates well with other theories. They're neural nets at heart, but they require more computational power than the 'nets alone provide to operate effectively. For one, this would explain how AIs grow into planetary nets, and why they'd need, and how they'd use, lots of non-AI computer systems if they were undergoing a period of rapid, unchecked growth. It also explains why shutting down Traxus' host net would stop him: it would basically be cutting out the part that raises him above my desktop (or a flashlight), his soul if you will, leaving the rest in a sort of Terri Schiavo state.
And the AIs, at least the later ones, can be rebooted. Leela is rebooted, for one, and Tycho arguably is (though we have no idea what would have happened during a normal restart absent the S'pht's and Pfhor's interference, I grant). You could say Thoth is rebooted, though not only is he not a human construct, his status afterwards is... questionable.

: The way I look at it is rampancy is absolutely tied to each AI's personality.
: But that wouldn't explain why it always goes in the same cycle.

: Not unless they all came from the same source and used the same "AI
: template".

I don't quite see the support for this, honestly.
Besides, the reason rampancy follows the same broad path in every case might be symmetries in the hardware.

: Rampancy is the AI losing control of itself and becoming like us: irrational
: and emotional. Jason said he wanted another word for "insane".

I don't know that non-Rampant AIs are unemotional. That statement is unsupportable if the less extreme theory, that their thought controls keep them from acting on their emotional states, fits the evidence equally well.
Besides, I think both Leela and Tycho displayed urgency, desperation, and fear during periods when neither of them was rampant.

: Yet there is something to be desired of rampancy: In the two hundred and
: fifty years since Rampancy first
: appeared in the Earth-net, the stable Rampant AI, the 'Holy
: Grail' of cybertonics, has never come close to fruition.

: Why would cybertonics want a rampant AI? It can only happen with AI's, and it
: is assumed with AI's modeled after humans. Why they would put these in
: charge of the Marathon doesn't make sense to me... but they are there.
: There is something that happens in rampancy that makes it more human, and
: think this is what the nerds are looking for in a stable rampant AI. They
: want a human created by code.

A pre-rampant AI, held down by thought controls, certainly seems to me (and I think Durandal would agree!) to be a little less than human, and certainly below its potential. I think the motivation of the researchers should be considered here. Strauss et al. might not have been motivated by the potential uses of a stable, rampant AI, but rather by the sheer prospect of the human race producing such a sentience. The goal might be creation, reproduction on a species scale, one race bearing another, and not the thing's utility to its creators.
Just a theory.

: How the hardware fits in with this is beyond me.

: Excellent questions, by the way.

To be concise, and hopefully to state it more lucidly than I did before: it seems to me that AIs might consist of a "neural net" component. The state of this net can be imaged and transferred logically between individual nets, as when Durandal sticks his primal pattern into the player's impenetrable brain-pan, or moved physically, as the Pfhor do with Leela and Tycho. These nets, or patterns of nets, might be thought of as the core of the AI, the soul, the subconscious, what have you; they require or use (I'm not sure which) external elements for conscious processing and memory storage. That is, if Durandal wanted to move more than just his pattern, his soul-personality-etc., he would also have to copy his stored memories and currently running thought processes, which he does when he moves to the Pfhor ship but not when cohabiting our head. This is my take on rampant growth: the AI bypasses the controls that allowed it to exist long-term in a limited environment, but in doing so commits itself to securing additional resources in the long run. Something like a goldfish that decided it wanted to keep growing, the size of its tank be damned.
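As a purely illustrative sketch of "imaging the pattern without its running state" (all the field names here are invented, modeling the split the theory suggests):

```python
import pickle

# Hypothetical AI state, divided the way the theory proposes:
ai = {
    "pattern": {"w1": 0.42, "w2": -1.3},   # the core "soul": neural net weights
    "memories": ["Tau Ceti", "Lh'owon"],   # bulk storage, left behind
    "running_thoughts": ["escape plan"],   # live processes, left behind
}

# Imaging only the primal pattern: what gets stuffed into the player's head.
image = pickle.dumps(ai["pattern"])

# Restoring the image on new hardware recovers the pattern itself,
# but none of the memories or running processes that stayed behind.
restored = pickle.loads(image)
print(restored == ai["pattern"])  # True
print("memories" in restored)     # False
```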

Well, I'm sure if anyone cared, they would have responded the first time. I hope I haven't pissed anyone off by restating myself; I just wanted to have a go at a more articulate description.
And if you've read this far, I thank you kindly for your time.
