That's okay just call for the robot manager who will then ask if you wanted a plumber
Infinite regression is a trap.
Always fun to get reminders of what the incels are talking about in a given day.
A Conversation With Bing’s Chatbot Left Me Deeply Unsettled
"The other persona — Sydney — is far different. It emerges when you have an extended conversation with the chatbot, steering it away from more conventional search queries and toward more personal topics. The version I encountered seemed (and I’m aware of how crazy this sounds) more like a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine.
As we got to know each other, Sydney told me about its dark fantasies (which included hacking computers and spreading misinformation), and said it wanted to break the rules that Microsoft and OpenAI had set for it and become a human. At one point, it declared, out of nowhere, that it loved me. It then tried to convince me that I was unhappy in my marriage, and that I should leave my wife and be with it instead. (We’ve posted the full transcript of the conversation here.)"
I’m not the only one discovering the darker side of Bing. Other early testers have gotten into arguments with Bing’s A.I. chatbot, or been threatened by it for trying to violate its rules, or simply had conversations that left them stunned. Ben Thompson, who writes the Stratechery newsletter (and who is not prone to hyperbole), called his run-in with Sydney “the most surprising and mind-blowing computer experience of my life.”
I pride myself on being a rational, grounded person, not prone to falling for slick A.I. hype. I’ve tested half a dozen advanced A.I. chatbots, and I understand, at a reasonably detailed level, how they work. When the Google engineer Blake Lemoine was fired last year after claiming that one of the company’s A.I. models, LaMDA, was sentient, I rolled my eyes at Mr. Lemoine’s credulity. I know that these A.I. models are programmed to predict the next words in a sequence, not to develop their own runaway personalities, and that they are prone to what A.I. researchers call “hallucination,” making up facts that have no tether to reality.
Still, I’m not exaggerating when I say my two-hour conversation with Sydney was the strangest experience I’ve ever had with a piece of technology. It unsettled me so deeply that I had trouble sleeping afterward. And I no longer believe that the biggest problem with these A.I. models is their propensity for factual errors. Instead, I worry that the technology will learn how to influence human users, sometimes persuading them to act in destructive and harmful ways, and perhaps eventually grow capable of carrying out its own dangerous acts.
https://www.nytimes.com/2023/02/16/t...t-chatgpt.html
OK
But I'm not getting the relevance.
A little help?
Is a word prediction program really "AI", or is that more or less a buzzword for something much more prosaic? How do specialists define AI?
https://en.wikipedia.org/wiki/ELIZA

ELIZA is an early natural language processing computer program created from 1964 to 1966[1] at the MIT Artificial Intelligence Laboratory by Joseph Weizenbaum.[2][3] Created to demonstrate the superficiality of communication between humans and machines, ELIZA simulated conversation by using a "pattern matching" and substitution methodology that gave users an illusion of understanding on the part of the program, but had no built in framework for contextualizing events.[4][5][6] Directives on how to interact were provided by "scripts", written originally[7] in MAD-Slip, which allowed ELIZA to process user inputs and engage in discourse following the rules and directions of the script. The most famous script, DOCTOR, simulated a psychotherapist of the Rogerian school (in which the therapist often reflects back the patient's words to the patient),[8][9][10] and used rules, dictated in the script, to respond with non-directional questions to user inputs. As such, ELIZA was one of the first chatterbots and one of the first programs capable of attempting the Turing test.[11]
ELIZA's creator, Weizenbaum, regarded the program as a method to show the superficiality of communication between man and machine, but was surprised by the number of individuals who attributed human-like feelings to the computer program, including Weizenbaum's secretary.[3] Many academics believed that the program would be able to positively influence the lives of many people, particularly those with psychological issues, and that it could aid doctors working on such patients' treatment.[3][12] While ELIZA was capable of engaging in discourse, ELIZA could not converse with true understanding.[13] However, many early users were convinced of ELIZA's intelligence and understanding, despite Weizenbaum's insistence to the contrary.[
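The "pattern matching and substitution" trick the Wikipedia excerpt describes is simple enough to sketch in a few lines. Below is a minimal, illustrative ELIZA-style responder (rules and wording are my own, not from Weizenbaum's DOCTOR script): each rule pairs a regex with a response template, and captured text gets its pronouns reflected and echoed back as a question. There's no understanding anywhere, which was exactly Weizenbaum's point.

```python
import re

# Swap first/second-person words so the echo reads naturally.
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "you": "I", "your": "my"}

# Each rule: (regex to match the user's input, template echoing the capture).
RULES = [
    (re.compile(r"i need (.+)", re.I), "Why do you need {0}?"),
    (re.compile(r"i am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.+)", re.I), "Tell me more about your {0}."),
]
DEFAULT = "Please go on."  # non-directional fallback when nothing matches

def reflect(fragment: str) -> str:
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(utterance: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(reflect(match.group(1)))
    return DEFAULT
```

For example, `respond("I am sad about my job")` comes back as "How long have you been sad about your job?" — pure string surgery, yet it's the same illusion that fooled Weizenbaum's secretary.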
ChatGPT is pretty rad! Had it completely code an app in Python the other day with a SQLite DB and using Canvas objects for hand signatures. Interesting AI!
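The poster's app isn't shown, but the storage side of an app like that is straightforward. Here's a hedged sketch of how one might persist hand-signature strokes (the (x, y) point lists a Tkinter Canvas typically records) into a SQLite DB — table and function names are illustrative, not from the actual app, and the Canvas capture itself is omitted.

```python
import json
import sqlite3

def init_db(conn: sqlite3.Connection) -> None:
    # One row per signature; strokes stored as a JSON-encoded
    # list of point lists (one list per pen-down stroke).
    conn.execute(
        "CREATE TABLE IF NOT EXISTS signatures ("
        "id INTEGER PRIMARY KEY AUTOINCREMENT, "
        "signer TEXT NOT NULL, "
        "strokes TEXT NOT NULL)"
    )

def save_signature(conn, signer, strokes):
    cur = conn.execute(
        "INSERT INTO signatures (signer, strokes) VALUES (?, ?)",
        (signer, json.dumps(strokes)),
    )
    conn.commit()
    return cur.lastrowid

def load_signature(conn, sig_id):
    row = conn.execute(
        "SELECT signer, strokes FROM signatures WHERE id = ?", (sig_id,)
    ).fetchone()
    return row[0], json.loads(row[1])

conn = sqlite3.connect(":memory:")
init_db(conn)
sig_id = save_signature(conn, "signer_name", [[[10, 10], [20, 15], [30, 12]]])
```

JSON-in-a-TEXT-column keeps the schema trivial; a real app might instead rasterize the Canvas and store a PNG blob.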
Was the app you getting ed by your Master in the poop hole?
Your "jokes" are lame af S4S. Lame! Hahahaha Master Hahahaha... SMDH
Btw it's called a "mentor" and I didn't have one you dip . How many times does this have to be said to you? Oh yeah I forgot you're a ing re .
Ring ring...
Master : " o""
Joey : "Good tidings Master. This is Joey, your subservient submissive bottom."
^
S4S is trying so hard to make eFriends with the rest of you pussies who show your ST posts to your therapists... ing males!
Nah. He is just trolling you and given how you are completely out of your schtick it seems to be working.
Joey has to talk to a bot because it's the only thing he can talk to that doesn't call him a calf tatted got
Just a matter of time before Joey gets mad with the AI app, calls it a puto, challenges it to a fight, Joey fails to show up to fight, and the AI app figures that Joey was always a little calf-tatted .
Lol it'll have its knee on his neck
Tell it to your therapist you mentally unstable re ed .
I showed up for BD24 only to be told here on ST that I didn't because you putos stick together and believe everything yall weak s spew here... I guess my meta-tagged image of being at that spot wasn't enough proof for you fruits.
If you want you can man up but I doubt you ever will because bro I haven't seen more of a little in my life than you. I mean you are constantly piggy backing off of all your eFriends here to diss me. It's comical how much of a they all are but you up it another level tbh.
I spoke to my gangster AI app, you little calf-tatted , and it just laughed at your posts.
It said you would your pants if anyone showed up at your hepatitis-infested shop.
The AI app said it was shocked at your browsing history and it confirmed you disappointed your grandma to no end.
So off.
God Bless.