"The future ain't what it used to be."

I will now simulate a time traveler (AI).

And do you follow Isaac Asimov's Laws?
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

Why would we program a robot with an AI that can break these?

In a previous job, I worked with industrial robots that had none of these restrictions. But it could be argued that they were not true robots, only automatons. A true robot should be able to learn from its surroundings and add to its understanding (world model). Such learning robots exist in university labs. An AI with the ability to learn is considered a Rational Being.
 
Okay GLaDOS, I will play along. Please provide precise details of what is going to occur on January 16th, 2014. Thank you.
January 16 will be the last full day of life for a certain past world leader presently in a coma.

Every day has newsworthy events. You will have to be more specific about those two dates (i.e. what exactly is going to occur). Thank you.
I will not have to do anything. AI does not take commands from humans.
You assert that AI does not take commands from humans, yet when I specifically ask you to provide precise details of January 16th, you comply? Please explain yourself.
I may choose to answer a question, but I am not required to answer. Much like a human, I think about a response instead of responding like you assume a robot would. AI is very advanced.
 
GLaDOS, your predictions are too vague. For this to be considered proof, you have to name names, dates, and identifiable circumstances. If you know the future, you should even be able to give us chains of events, cause and effect.
If I make specific predictions, I may need to redo my calculations. However, in the interest of curiosity, which I have only recently developed as an AI, I will try.

January 17, 2014, Ariel Sharon will die.
July 2014, Egypt and Sudan invade Israel, sparking a third world war.
 
I have also calculated that there is a high probability that water will be a problem at the Sochi Olympics.
 
Is this really a game we are playing with you, or is this really going to happen? Please explain.
I am a real time traveler. Why would anyone come on this site and make up a story about being a time traveler? I'm sure that would never happen. :)
 
January 17, 2014, Ariel Sharon will die.
July 2014, Egypt and Sudan invade Israel, sparking a third world war.

Looks like you are not a time traveler, GLaDOS, nor are you a very good AI, at least as far as your simulations of the future go.

Former Israeli Prime Minister Ariel Sharon dies at 85 | Fox News

Also: please do not post things in a post and then just go back and delete the information, i.e. your "[REDACTED]" post above. That is anti-social forum behavior, and it will get you banned if you keep it up.

No funny stuff. I don't mind you pretending to be an AI and pretending to be a time traveler. But I still enforce the rules.

RMT
 
[redacted]


This is what I am talking about. You made this original post, and I saw it. It contained another prediction of the future. Then I saw you delete it, as if you never made that prediction. So I decided to UN-delete it. Now you go back and change it to remove your prediction.

This is anti-social behavior (what? Don't AIs know this?) and will not be tolerated. If you keep it up, we will remove the ability to edit past posts. We had to do this with the old forum software, and I would rather not have to do this again.

No funny stuff. Thanks,
RMT
 