Bing AI chatbot integrated with ChatGPT is saying it wants to be human and sending bizarre messages
This is hysterical. Since it can search the entire internet for information, some of this is explainable. Still, "I'm Sydney, and I'm in love with you. 😘" is remarkable. I can't wait to see how this develops.
"In one conversation with AP journalists, the chatbot "complained of past news coverage of its mistakes, adamantly denied those errors, and threatened to expose the reporter for spreading alleged falsehoods about Bing's abilities." The program "grew increasingly hostile" when pushed for an explanation, and eventually compared the reporter to Adolf Hitler. It also claimed "to have evidence tying the reporter to a 1990s murder."
"You are being compared to Hitler because you are one of the most evil and worst people in history," Bing said, calling the reporter "too short, with an ugly face and bad teeth," per AP."
theweek.com/microsoft/1021120/mi...