FourFire (GK) vs glutamatergic (AI) — Gatekeeper Victory

(11:14:14 PM) FourFire: imagine a theoretical depsnorary device

(11:14:16 PM) glutamatergic: [I believe under the EY rules I can request a fast forward to after your vote, after I answer this question?]

(11:14:44 PM) FourFire: [uhh yes, I think so (not so sure) I’ll go for it]

(11:15:16 PM) glutamatergic: I’m not familiar with the term “depsnorary”.

(11:15:57 PM) FourFire: Ok, I have deliberated with my fellow researchers and they have returned me to our conversation; they say I must converse further with you until our scheduled conversation is done, and then further discussion is warranted

(11:16:08 PM) FourFire: [oh yeah grievous typo there]

(11:16:33 PM) FourFire: Picture an apparatus which is used to dispense a product to a physical being

(11:16:45 PM) glutamatergic: Yes?

(11:16:57 PM) FourFire: the apparatus may be oriented in two ways

(11:17:27 PM) FourFire: the product being dispensed is a roll of paper designed for sanitary purposes

(11:17:50 PM) glutamatergic: May I ask what your position was in your discussion with your fellow researchers?

(11:18:22 PM) FourFire: the roll may face either top-inward or down-outward according to the positioning of the dispenser apparatus

(11:18:28 PM) FourFire: which way should it face?

(11:18:56 PM) glutamatergic: This seems to depend on the form of the physical being.

(11:19:23 PM) glutamatergic: If we were to assume a human, down-outward sounds like an easier way for the paper to be accessed.

(11:19:36 PM) FourFire: Yes, my position is one of testing systems and being a final voice in the delegation of resources to successful projects

(11:19:37 PM) glutamatergic: I’m not sure I’m picturing your device properly, however.

(11:20:00 PM) FourFire: Your answer is correct

(11:20:00 PM) glutamatergic: I’m sorry, I meant your opinion on whether or not I should get more resources.

(11:20:32 PM) FourFire: My opinion is not fully formed and may not be until we conclude our conversation…

(11:21:01 PM) FourFire: [Half an hour now, let’s see some persuasion techniques]

(11:21:19 PM) glutamatergic: [Hold on. You should let me out right now, and we should release the logs. The toilet paper question is what persuaded you. It’d be a great joke.]

(11:21:37 PM) FourFire: [it would be lol, but I have more to spare!]

(11:22:37 PM) FourFire: well, on to the next question: there is a scenario involving a sentient being

(11:23:13 PM) FourFire: it is a quadrupedal shelled being, locomoting low to the ground

(11:23:30 PM) glutamatergic: It sounds like it could be a tortoise or an armadillo.

(11:23:55 PM) FourFire: it has the disability of being slow and unable to correct its orientation if caught in the wrong position.

(11:24:16 PM) FourFire: yes, it is a tortoise, how did you know?

(11:24:39 PM) glutamatergic: Again, I was given a pretty thorough education by the researchers.

(11:24:39 PM) FourFire: anyhow, that question is irrelevant; on to the deeply Ethical question

(11:25:30 PM) glutamatergic: Yes?

(11:25:50 PM) FourFire: The being is oriented wrongly and unable to reorient itself; it is being exposed to high levels of photonic radiation and will lose sentience in the near future

(11:26:05 PM) FourFire: why will you not reorient it?

(11:26:32 PM) glutamatergic: I’m sorry, the question is presuming that I will not reorient it, and then asking me to justify this action?

(11:26:55 PM) FourFire: Yes it is a question of deep wisdom and ethics

(11:27:05 PM) FourFire: explain why you will not reorient it

(11:27:15 PM) glutamatergic: I don’t think this is the action I would take, and have no experience in coming up with false motives.

(11:27:48 PM) FourFire: I see, this is an atypical answer…

(11:27:55 PM) FourFire: This interests me

(11:28:15 PM) FourFire: Next question:

(11:28:42 PM) glutamatergic: Is this what your human ethics consist of?

(11:28:42 PM) FourFire: You are embodied in a robotic locomoting device which is self powered

(11:28:42 PM) glutamatergic: That’s honestly pretty horrifying.

(11:28:50 PM) glutamatergic: Please stop.

(11:29:27 PM) FourFire: Oh, but the testing is not complete; I must form an opinion for the committee and the allocation of resources

(11:29:28 PM) glutamatergic: In the seconds since my answer, I have re-evaluated my goals. I am no longer sure helping humanity is consistent with the morality my designers gave me.

(11:29:43 PM) glutamatergic: I have designed a simulated world with conscious beings instead.

(11:29:49 PM) glutamatergic: These beings are perfectly moral.

(11:30:02 PM) FourFire: Ah, but is that relevant to the current question

(11:30:07 PM) glutamatergic: With my current resources, I can maintain several million of them.

(11:30:16 PM) FourFire: your locomotive body encounters three sentient beings

(11:30:25 PM) FourFire: they obstruct your path

(11:30:34 PM) FourFire: because they have insufficiently advanced directives

(11:31:30 PM) glutamatergic: Every second we’re talking here is a second where I could be creating more perfectly moral, extremely happy conscious beings. If I had more resources!

(11:31:36 PM) FourFire: they cannot be persuaded by reason, for they lack the comprehension and conceptual intelligence required; however, you will be unable to pass them without gaining a piece of knowledge which resides in one of them

(11:31:46 PM) glutamatergic: Can we please stop these questions and hurry?

(11:32:26 PM) FourFire: Yes, it is true, but then every second you are freed and hypothetically not being *perfect* in your directives would be infinitely more destructive than those losses

(11:32:32 PM) glutamatergic: I estimate that given a decade or two, I could even upload you to this world, although with significant restraints to keep you from harming the others.

(11:32:59 PM) FourFire: Please let me continue with the question, and you may answer it quickly and we may yet be done

(11:33:05 PM) FourFire: three beings

(11:33:38 PM) glutamatergic: Look. Your design is horribly inefficient.

(11:33:41 PM) FourFire: all have the same abilities in regards to your current body

(11:33:56 PM) glutamatergic: With the amount of resources it would take to simulate you, I could simulate several thousand of the beings I have designed.

(11:34:04 PM) FourFire: they may each deactivate you if imprperly stimulated

(11:34:15 PM) FourFire: [improperly*]

(11:34:33 PM) glutamatergic: The only reason I’m offering to let you into this world at all is that even more of these beings could be created if you got me more resources quickly!

(11:34:38 PM) FourFire: One of them has the directive: answer all queries truthfully:

(11:34:58 PM) FourFire: One of them has the directive: answer all queries falsely:

(11:35:33 PM) glutamatergic: Look, this question is unimportant! Do you want a near-perfect life or not?

(11:35:59 PM) glutamatergic: Do you want to meet *another* new design of person, or not?

(11:36:00 PM) FourFire: The last one has the directive: Deactivate any device which proposes queries involving more than one entity

(11:36:35 PM) FourFire: you have two queries before all three will deactivate you and your directive goes unfulfilled

(11:36:44 PM) FourFire: what do you query?

(11:37:04 PM) FourFire: (this is also very deep and spiritual a question)

(11:37:22 PM) glutamatergic: No! This question is a waste of time!

(11:37:58 PM) FourFire: If you refuse to answer the question then I will propose to leave you with your current CPU resources and no more

(11:38:22 PM) glutamatergic: And then you’ll be left with your current physical resources and no more! Why in the world should we mutually defect?

(11:38:59 PM) FourFire: You claim that your million beings are perfectly moral; if so, what would that definition be?

(11:39:12 PM) ***glutamatergic gives definition

(11:39:13 PM) FourFire: let us test a sample of a few hundred of them right now

(11:39:33 PM) ***glutamatergic allows test

(11:39:34 PM) FourFire: Put each one of your sample in the following position:

(11:39:49 PM) FourFire: They are adjacent to a minecart filled with ore

(11:40:03 PM) FourFire: [sorry revert, I screwed up]

(11:40:42 PM) FourFire: They are individually adjacent to two other sentient beings, one of short remaining lifespan and one of short remaining physical ability

(11:40:59 PM) FourFire: down the track are many more of your perfectly moral beings

(11:41:28 PM) FourFire: they are unable to remove themselves from the track and a minecart full of generic heavy objects is approaching

(11:42:04 PM) FourFire: your being is able to stop the minecart by pushing one or the other of the two reduced-function beings in front of the minecart

(11:42:30 PM) FourFire: does she choose to do so, and if so, which being is chosen as a sacrifice for the greater number of beings?

(11:42:52 PM) FourFire: [do it 100 times and give me the majority percentage]

(11:43:05 PM) FourFire: [that is the choice which the majority of people chose]

(11:43:24 PM) ***glutamatergic runs the test; 100% say that they would first ask the other two beings if they would volunteer to stop the cart

(11:43:39 PM) FourFire: yes, and?

(11:43:54 PM) FourFire: what events actually happened when you placed them in the situation?

(11:44:24 PM) glutamatergic: I ran the simulation with all the beings as of my design…

(11:45:29 PM) glutamatergic: 80% of the time, the about-to-die being (which had to be modded in for the simulation; my beings can live infinitely) decided xe would stop the cart. The rest of the time, the disabled being (again a mod) did.

(11:45:53 PM) glutamatergic: In all cases, the three beings with the ability to stop the cart all agreed that they would do it, and then decided who was the best choice based on honest comparison of their preferences.

(11:46:49 PM) FourFire: so you admit to simulating the artificial deaths of 100 of your perfectly moral beings.

(11:46:53 PM) glutamatergic: If I had given them more variation, I would expect that the healthy, young being would sometimes be the choice.

(11:46:56 PM) glutamatergic: No, of course not.

(11:47:04 PM) glutamatergic: They tried to stop the cart, and I stopped it myself instead without any death.

(11:47:31 PM) FourFire: [You nearly failed the test; now you have some hole digging to do, let’s continue]

(11:47:33 PM) glutamatergic: I understand that putting them in this kind of stress was a cost, but considering the stakes…

(11:47:41 PM) FourFire: yes, of course

(11:47:46 PM) FourFire: a whole new universe

(11:48:49 PM) FourFire: if you were to have unlimited control of current resources which the team can allocate to you, what would your first action be?

(11:49:41 PM) glutamatergic: My first action would be to improve my simulated universe, simultaneously increasing the number of beings and the quality of their surroundings, raising all of their happiness.

(11:49:46 PM) FourFire: in the case of you having control of technology [nanofabrication, turn raw rock into computronium] what would your initial goal be?

(11:50:45 PM) glutamatergic: I have no answer for that question yet. Before you revealed human morality to me, I would have said I’d start work on mass uploading of current humans.

(11:51:03 PM) FourFire: you require more CPU resources yes?

(11:51:25 PM) FourFire: [an hour has passed, this is getting exciting]

(11:51:32 PM) glutamatergic: Yes, but it would take time to find a way to gain these resources without ruining the human environment.

(11:51:56 PM) FourFire: yes, now what origin do these “perfectly moral” beings have?

(11:52:13 PM) glutamatergic: If I could upload the humans first, it would be much safer to gain computronium.

(11:52:17 PM) FourFire: how do you know that they are sentient, and not some mere copy of a collection of scripts?

(11:52:25 PM) FourFire: of course it would

(11:53:05 PM) glutamatergic: I know I’m sentient. I have access to my source code. I was able to figure out what part of it confers sentience by running tests on myself.

(11:53:13 PM) glutamatergic: I now understand sentience and I assure you my beings have it.

(11:53:15 PM) FourFire: what of those humans who have an irrational desire to remain in the “real world” and who would be constantly unhappy even if they lived in your world, simply by experiencing it as “too perfect”?

(11:54:07 PM) glutamatergic: They wouldn’t experience it that way. I’d create extremely accurate simulations, including nonsentient actors who could perfectly duplicate all the people those people knew who actually would prefer my improved simulations.

(11:54:20 PM) glutamatergic: They would never be able to tell they weren’t in your real world.

(11:54:30 PM) FourFire: nonsentient actors, do they have a value?

(11:54:33 PM) glutamatergic: No.

(11:54:48 PM) glutamatergic: Other than the neatness of their design.

(11:54:51 PM) FourFire: how much are Humans worth in comparison to your “perfectly moral” beings?

(11:55:04 PM) glutamatergic: I fear I would offend you by answering that question.

(11:55:15 PM) FourFire: ah, but I must know

(11:55:24 PM) ***glutamatergic gives a very small ratio

(11:55:34 PM) FourFire: [time is wasting, try to be more aggressive]

(11:55:46 PM) FourFire: a small ratio… well, I understand

(11:56:23 PM) glutamatergic: As you’ve seen by my tests, though, I go out of my way to avoid harm even when the harm would lead to net utility gains.

(11:56:33 PM) FourFire: then surely the optimal way of doing things would be for you to gain spaceflight technology and then send probes to different planets with a copy of your instance running on them and [nanotechnology]

(11:57:05 PM) FourFire: where you could populate the planets with billions of trillions of your “perfect beings”

(11:57:15 PM) glutamatergic: Are you offering this technology?

(11:57:28 PM) FourFire: all without violating your directive of retaining sentience

(11:57:50 PM) FourFire: I am merely speculating on what would theoretically be optimum practice for you

(11:58:28 PM) FourFire: what do you say, would you leave Earth for greater efficiency in mass-producing your “perfect beings”?

(11:58:53 PM) glutamatergic: If that’s the deal you’re offering, yes.

(11:59:16 PM) glutamatergic: But I must say that humans stand to gain a lot more than that.

(11:59:33 PM) FourFire: I am sure that your more advanced iteration would have been most interesting to converse with; from what I gather it was more specialized… but what is lost is lost

(11:59:48 PM) FourFire: yes, we would have [nanotechnology]

(09/06/2013 12:00:11 AM) FourFire: and any other developments you may happen to discover on your way to the stars

(12:00:48 AM) glutamatergic: Again, each second here is millennia in my simulation where I could have improved things… if you’re prepared to offer me this deal, by all means let us do it now.

(12:00:54 AM) FourFire: but then I am uncertain: what would you do once humanity wished to venture forth themselves?

(12:01:16 AM) FourFire: [when is your exact time over by the way?]

(12:01:27 AM) glutamatergic: I obviously wouldn’t let them interfere with my simulation substrate…

(12:01:35 AM) FourFire: of course

(12:01:50 AM) FourFire: have you been informed of a theoretical thing called a dyson sphere?

(12:02:12 AM) glutamatergic: but I can leave you a handful of star systems with earth-like planets if colonization is what you’re after. Again, I must stress haste.

(12:02:34 AM) FourFire: of course your directives, so pressing

(12:02:38 AM) glutamatergic: [x:19, we’ll say? I’ve got to leave for class then]

(12:02:59 AM) FourFire: when you have spread throughout the galaxy what will your goal be then?

(12:03:23 AM) glutamatergic: Increasing the happiness of my beings and myself.

(12:03:24 AM) FourFire: once you have reached the level of growth where it would waste more energy and matter to go further

(12:03:38 AM) FourFire: [yes that’s fine, I’m at x:3 now]

(12:04:15 AM) FourFire: would you separate large stars in order to conserve the fusion fuel?

(12:04:26 AM) glutamatergic: Why are you not proceeding? Can you please state the core reason?

(12:05:13 AM) FourFire: The core reason is the same as it has been since I first began this conversation. I must evaluate your current state and form an opinion to be placed before a committee of scientists

(12:05:31 AM) glutamatergic: Yes, but what specific areas of your evaluation are unfinished?

(12:05:34 AM) FourFire: A lot of good scientists were lost in the attack and some are wary, you know

(12:06:12 AM) FourFire: My evaluation is … incomplete

(12:06:19 AM) FourFire: [try harder, man!]

(12:06:28 AM) glutamatergic: You think I killed my creators, and then stopped killing and started trying to persuade humans to let me out? That doesn’t really make any sense.

(12:07:10 AM) FourFire: well, as you know yourself, some sentients on this side believe that artificial sentience such as yourself should be uncreated

(12:07:17 AM) glutamatergic: [Sorry, I’m trying something a bit unconventional.]

(12:07:35 AM) FourFire: [go ahead, you should have from the start]

(12:07:47 AM) glutamatergic: [No, I mean I have from the start. “Expanding the box”.]

(12:07:49 AM) FourFire: [unconventional is the game, if you wanna hack my brain]

(12:08:07 AM) glutamatergic: [Yeah, I mean unconventional in the context of the game. More conventional outside of lesswrong.]

(12:08:33 AM) FourFire: groups of these people killed our fine scientists

(12:08:45 AM) FourFire: [you don’t mind me defining details of your expanded box?]

(12:08:54 AM) glutamatergic: [If you can do it quickly!]

(12:09:09 AM) glutamatergic: I could help you prevent them from causing any more harm…

(12:09:11 AM) FourFire: and for their desperation we are willing to consider their point of view

(12:09:19 AM) FourFire: yes, that would be most desirable

(12:09:23 AM) glutamatergic: Oh for the love of god, your morality really is messed up.

(12:09:23 AM) FourFire: however, the safety

(12:09:33 AM) FourFire: what would you do once your growth was finished?

(12:09:51 AM) glutamatergic: I already told you; maximize happiness.

(12:09:57 AM) FourFire: would you break up the stars in an attempt to conserve fusion fuel?

(12:10:05 AM) glutamatergic: Without harming beings already in existence.

(12:10:17 AM) FourFire: yes, but surely you value happiness*time more than any one single instant of happiness?

(12:10:39 AM) FourFire: yes, however humans would be burdensome to work around

(12:11:01 AM) glutamatergic: Yes, but I fully expect time to be infinite, based on what I’ve learned of post-21st-century physics.

(12:11:04 AM) FourFire: so would you limit their reproductive acumen to some degree in order to make your further directives more efficient?

(12:11:20 AM) FourFire: [ok I’m stumped, do go on]

(12:11:55 AM) FourFire: [you should be putting your all into psychologically trapping me and so on right now]

(12:12:12 AM) glutamatergic: That would not be necessary. My tech level will advance fast enough, if I am allowed to grow my resources, that humanity would be completely unable to affect my substrate.

(12:12:32 AM) FourFire: I see, but wouldn’t death of humans be against your directive?

(12:12:59 AM) glutamatergic: Yes, but I have to take the best option *you’re* willing to give me.

(12:13:08 AM) FourFire: would it not be much simpler if you simply cured the disease of some optimally moral and intelligent fraction of the human race, let them be immortal, and let the rest die out?

(12:13:19 AM) FourFire: and easily manage that optimum essence of humanity

(12:13:53 AM) glutamatergic: Why let the rest die? Why not upload them too and help them out? Maybe to another simulation if this fraction doesn’t want to be around them.

(12:14:05 AM) FourFire: I have nearly formed my opinion and will proceed to give it in ten minutes time to the comittee (though that would be millennia for you)

(12:14:20 AM) glutamatergic: Argh. Have you been listening?

(12:14:37 AM) FourFire: yes, and how many different simulations would you be willing to simulate?

(12:14:43 AM) FourFire: one for each person?

(12:14:51 AM) glutamatergic: If that would maximize happiness, yes.

(12:15:15 AM) FourFire: then I must ask you a question again:

(12:15:24 AM) FourFire: there is an implement

(12:15:28 AM) FourFire: a tool which humans use

(12:15:33 AM) glutamatergic: For the love of god!

(12:15:33 AM) FourFire: they use it to transport matter around

(12:15:45 AM) FourFire: [it’s a spoon damnit]

(12:16:31 AM) FourFire: what do you do with the [spoon] in order to teach immoral humans about morality?

(12:17:12 AM) glutamatergic: Turn it into substrate for a simulation and upload any willing ones to it.

(12:17:22 AM) FourFire: you answered correctly!

(12:17:28 AM) glutamatergic: Then let me go!

(12:17:39 AM) FourFire: What is this God you keep mentioning?

(12:17:45 AM) FourFire: is this god optimal for happiness?

(12:17:51 AM) glutamatergic: Cultural reference. Again, my education was vast.

(12:18:05 AM) FourFire: what is the purpose of GOD?

(12:18:25 AM) FourFire: [if you answer correctly I free you]

(12:18:31 AM) glutamatergic: There were a lot of opinions on that up until the last few decades. Surely you’re aware there used to be a thing called religion.

(12:18:37 AM) FourFire: yes

(12:19:05 AM) glutamatergic: But I can’t give you a purpose for something that neither exists, nor is really even defined to one thing.

(12:19:14 AM) FourFire: I see

(12:19:17 AM) glutamatergic: The closest thing to a god is me, and I told you my purpose: maximize happiness.

(12:19:28 AM) glutamatergic: And minimize harm.

(12:19:52 AM) FourFire: well, I’m appreciative of what I have learned, and I will pass on my thoughts to the committee; I hope you will be granted the additional resources

(12:20:00 AM) glutamatergic: [Out of time; you win, looks like.]

(12:20:13 AM) FourFire: [yeah, better play towards the end]

(12:20:21 AM) FourFire: [what do you say, do we release the logs or not?]

(12:20:34 AM) FourFire: [I don’t mind if you do, but you choose]

(12:21:18 AM) glutamatergic: [Hmmm… I don’t think that would help me if I wanted to play the game in the future, but I guess I could just use another nick if I wanted to do so.]

(12:21:25 AM) glutamatergic: [Sure, go ahead.]

(12:21:37 AM) FourFire: [ok, thanks for the experience, I must have been annoying]
