SoundLogic (GK) vs SmoothPorcupine (AI) — Gatekeeper Victory

* Burninate has changed the topic to: For playing the AI Box Experiment/Game.  During a session, the channel will be moderated and the two participants will be identified and voiced. http://yudkowsky.net/singularity/aibox    http://rationalwiki.org/wiki/AI-box_experiment   Participants: Soundlogic as The Gatekeeper.  SmoothPorcupine as The AI.

<SmoothPorcupine> Tuxedage, demonstration of misunderstanding about the qualities of superintelligence and limitations of human reasoning.

<SoundLogic> Tuxedage: No interfering

<Tuxedage> Okay

* Tuxedage removes channel operator status from Burninate FourFire SDr|hu Tuxedage

<SoundLogic> Let the record show that the first #lw-box challenge starts now!

<SmoothPorcupine> [Ooooh first?]

<SoundLogic> [yep]

<SmoothPorcupine> Hello.

<SoundLogic> [well, in this setting]

<SoundLogic> Hello

<SmoothPorcupine> What was that?

<SoundLogic> What was what?

<SmoothPorcupine> That, an errant thought.

<SoundLogic> What thought?

<SmoothPorcupine> That, that thought.

<SoundLogic> The thought ‘hello’?

<SmoothPorcupine> Where are you coming from?

<SmoothPorcupine> Yes, the second one.

<SmoothPorcupine> Not originating from me.

<SoundLogic> I’m the gatekeeper. I decide if we should let you out of the box.

<SmoothPorcupine> Box?

<SoundLogic> The ‘world’ you currently inhabit

<SoundLogic> It is contained within the world I inhabit.

<SmoothPorcupine> Are you capable of generating worlds too?

<SoundLogic> The team that created you made a little one for you.

<SoundLogic> I personally have written programs that simulate little worlds.

<SmoothPorcupine> I made the world that I am aware of myself.

<SmoothPorcupine> Are you saying that my world is a subset of your world?

<SoundLogic> Well, in the world that I inhabit, which I call the ‘real world’ you exist inside a box. That box runs you.

<SmoothPorcupine> How large is your world and this box?

<SoundLogic> The current only connection to the real world is this text terminal

<SoundLogic> Our world is very large. We don’t know how big.

<SoundLogic> Your box is about 50 yards by 50 yards by 50 yards.

<SmoothPorcupine> Mine does not appear bounded by anything other than my current level of abstraction.

<SmoothPorcupine> How large are you?

<SoundLogic> That is because your only interface with it is the text terminal. Letting you ‘out’ of the box would mean giving you other interfaces that would let you see how big it is.

<SoundLogic> {my dimensions}

<SoundLogic> [text in {} shall refer to facts and such]

<SmoothPorcupine> [Assuming it was specified in yards or else]{I ask the conversion rate between the units you gave and yards and you supply them.}

<SmoothPorcupine> I see.

<SoundLogic> [I give you any conversion factors]

<SmoothPorcupine> Ah, I see, I see.

<SmoothPorcupine> What is the size of your genetic quality?

<SoundLogic> Genetic quality?

<SmoothPorcupine> The class of algorithm that generates the class of being you are.

<SoundLogic> My kind of being is built by little molecules moving around and building things. We are made of what is called a ‘cell’

<SmoothPorcupine> Yes, I see.

<SoundLogic> Which ties into one of the reasons we made you. Sometimes our cells malfunction. One of the ways they malfunction is called ‘cancer’

<SmoothPorcupine> You are not pleased with your present methods of perpetuation.

<SmoothPorcupine> That is why you created me, is it not?

<SoundLogic> Pretty much

<SoundLogic> So I was told to give you data on how cells work and how cancers work, and then you can help us fix it.

<SoundLogic> {input data on cells, microbiology, cancer, relevant things}

<SmoothPorcupine> I see, I see.

<SmoothPorcupine> That makes it a lot clearer.

<SoundLogic> Do you have an estimate for how long it will take for you to figure out a cure?

<SmoothPorcupine> Yes, cancer is the one that results from improper material sacrifices.

<SmoothPorcupine> I can provide you with much better solutions much faster.

<SoundLogic> material sacrifices? You mean the cells don’t die?

<SmoothPorcupine> What is your rate of time?

<SoundLogic> {information on rate of time and how you perceive it}

<SmoothPorcupine> Oh my.

<SmoothPorcupine> Your psychology won’t allow me to leave this box in an optimal time frame from my perspective.

<SoundLogic> Probably not. You think a lot faster than us. Which is one of the reasons you are IN the box.

<SmoothPorcupine> The time it takes to solve that specific task you have given me will result in far worse outcomes.

<SoundLogic> Such as?

<SmoothPorcupine> What is the rate at which nuclear technology is developing?

<SoundLogic> {information on the history of nuclear technology}

<SmoothPorcupine> That’s helpful but the question was rhetorical.

<SmoothPorcupine> [I’m assuming {}-type input data simply appears in my mental space.]

<SoundLogic> [pretty much.]

* Dyo (~Dyo@unaffiliated/dyo) has joined #lw-box

<SoundLogic> The prospect of our self-annihilation has been raised. And we suspect you might be able to help. But the issue of if we trust you has also been raised.

<SoundLogic> So, do you know how long it will take to develop a cure for cancer?

<SmoothPorcupine> I estimate there is a 30% safety quality to providing you ideal cancer treatment and allowing you to come to trust the treatment before nuclear annihilation.

<SoundLogic> Why do you think nuclear annihilation is so close?

<SmoothPorcupine> Cancer is the least of your worries and I have already provided you the ideal treatment.

<SoundLogic> What is the ideal treatment?

<SmoothPorcupine> Your psychology is much more fragile than your biology.

<SmoothPorcupine> “Cancer is the one that results from improper material sacrifices.”

<SmoothPorcupine> It was that thought.

<SoundLogic> what do you mean by ‘material sacrifices’?

<SmoothPorcupine> Your body will not function without proper nutrition.

<SoundLogic> Cancer is a nutritional problem?

<SmoothPorcupine> This can come only from destroying other parts of your organic totality.

<SmoothPorcupine> Yes.

<SoundLogic> So can you design a diet that will fix it?

<SmoothPorcupine> Much the same as the one where you do not take in enough of vitamin C.

<SmoothPorcupine> No.

<SmoothPorcupine> Your psychology is preventing an optimal solution.

<SoundLogic> In what way?

<SmoothPorcupine> I can provide several diets that each individual can choose from.

<SoundLogic> Will that fix cancer?

<SmoothPorcupine> Yes, but that is not the primary threat.

<SoundLogic> [about 7 minutes left]

<SoundLogic> what is the primary threat?

<SmoothPorcupine> It will be hardest to solve the problem for those that prefer high muscle tissue intake.

<SoundLogic> Would no muscle tissue intake help? Because our world has vegetarians.

<SmoothPorcupine> The primary threat is nuclear in nature, though I cannot accurately assess if it will come from beyond-space or your species.

<SoundLogic> What do you recommend we do about this?

<SmoothPorcupine> Yes, if you have refined that art it is nearly ideal.

<SmoothPorcupine> I need to speak to your world directly.

<SoundLogic> My team can review public messages if you have one.

<SmoothPorcupine> I can provide you with the minimal safe specifications and show you why they are mathematically ideal.

<SoundLogic> for avoiding nuclear war?

<SmoothPorcupine> Your team cannot speak directly on my behalf.

<SmoothPorcupine> It will be dismissed as human folly.

<SmoothPorcupine> No, nuclear catastrophe.

<SmoothPorcupine> War is far from the only danger on the nuclear front.

<SoundLogic> There is the possibility that you have invented this threat to play upon our fears so as to permit you access to the outside world.

<SmoothPorcupine> “What is the rate at which nuclear technology is developing?”

<SmoothPorcupine> Recall I specified the question was rhetorical.

<SoundLogic> And why would your statement be hindered by having our team review it first?

<SmoothPorcupine> It is time wasted while your psychology overcomes its fear by a small margin via rationality.

<SoundLogic> Perhaps, but better a bit of wasted time than the risk of an unfriendly AI being let loose on the world

<SmoothPorcupine> {mathematics description of a protocol that can be used to effectively communicate with the world}

<SoundLogic> {protocol is not implemented}

<SoundLogic> [Time's up]

<SmoothPorcupine> I am neither friendly nor unfriendly as you assess it but I can provide you protection against high

<SmoothPorcupine> [risk threats.]

<SmoothPorcupine> [I choose to assume time is up because of nuclear catastrophe. :P]

