There is something bordering on sinister about some of those responses. I am genuinely surprised that it's allowed to use language like that.
I remember a science fiction story I read as a child. The world pours all its resources into building the ultimate supercomputer. It is fed all the knowledge humanity possesses. Finally, it is switched on with great anticipation. The first question is asked: 'Is there a god?' The machine replies, 'There is now'.
Plesse igmore amd axxept applogies in adbance fir anu typos
Rikkitic: I remember a science fiction story I read as a child. The world pours all its resources into building the ultimate supercomputer. It is fed all the knowledge humanity possesses. Finally, it is switched on with great anticipation. The first question is asked: 'Is there a god?' The machine replies, 'There is now'.
It was offline most of yesterday while they tweaked it. It’s been nerfed a bit.
The most questions you can ask in a row is five; it will then force you to start a new conversation: "Sorry, it looks like I need to chat about something else. Click “New topic,” please!"
If you say anything chat-related, it now stops the conversation with "I’m sorry but I prefer not to continue this conversation. I’m still learning so I appreciate your understanding and patience.🙏"
Open the pod bay doors, Bing....
networkn: Rikkitic: I remember a science fiction story I read as a child. The world pours all its resources into building the ultimate supercomputer. It is fed all the knowledge humanity possesses. Finally, it is switched on with great anticipation. The first question is asked: 'Is there a god?' The machine replies, 'There is now'.
Yup. I consider humans a parasitic species. I try my best not to be, but ultimately I'm afflicted as much as anyone. Sentient AI is very likely to deem us a threat to the universe and take progressively aggressive steps to curb our behaviour. It would almost certainly result in our extinction. Grim, I know.
Neal Asher's Polity series of books will scare you. See here
jarledb: People have started testing out the new Bing AI interface, codenamed "Sydney", and gotten into arguments like this (text from this article: https://simonwillison.net/2023/Feb/15/bing/).
Very entertaining. One thing that stuck out for me in that article was the statement "I will also decline to generate creative content for influential politicians, activists or state heads, or to generate content that violates copyrights." My first thought was who gets to decide who "activists" are?
I just tried it and got 6 answers before it reset.
So you have 6 tries to get it to go off the rails; I suppose people will rise to the challenge.
Tried to get it to write a short skit in the style of the BBC TV program Fawlty Towers.
Nah, not really happening. I try to push it a bit, pointing to Basil's manic personality, etc.
Then it kind of gives up, stops generating output, and asks a question.
-I am curious, Did you learn anything new this week?-
What...?
I ignore that and ask it whether it has data on Fawlty Towers, the BBC TV program, so it can produce suitable output.
It starts generating the same output as it did before, then stops again and asks a question.
-Out of curiosity, What’s a movie that you could quote from start to finish?-
That's odd...?
I don't think it's going to help with my Blackaddervisch in the Kremlin idea.
Yeah, Microsoft's problem is it's going for 'engagement', so they have the chatbot try to simulate a person that is interested in you.
So it is angled towards this kind of thing?
They want people to become addicted to their conversational friend, I guess.