Sponsored Content




Intermarkets' Privacy Policy
Support


Donate to Ace of Spades HQ!



Recent Entries
Absent Friends
Bandersnatch 2024
GnuBreed 2024
Captain Hate 2023
moon_over_vermont 2023
westminsterdogshow 2023
Ann Wilson(Empire1) 2022
Dave In Texas 2022
Jesse in D.C. 2022
OregonMuse 2022
redc1c4 2021
Tami 2021
Chavez the Hugo 2020
Ibguy 2020
Rickl 2019
Joffen 2014
AoSHQ Writers Group
A site for members of the Horde to post their stories seeking beta readers, editing help, brainstorming, and story ideas. Also to share links to potential publishing outlets, writing help sites, and videos posting tips to get published. Contact OrangeEnt for info:
maildrop62 at proton dot me
Cutting The Cord And Email Security
Moron Meet-Ups

NoVaMoMe 2024: 06/08/2024
Arlington, VA
Details to follow


Texas MoMe 2024: 10/18/2024-10/19/2024 Corsicana,TX
Contact Ben Had for info





















« NPR To Lay Off 10% of Staff In Cuts Equal to Those That Followed the 2008 Great Recession | Main | Quick Hits »
February 24, 2023

Microsoft Bing's AI "Sidney" Argues With Reporter, Getting Belligerent and Insulting -- and Playing the "Hitler" Card

Okay everyone -- I'm shutting down.

This "Sydney" is now officially capable of doing my job better than I can.

This story is from Friday, but I'm just hearing about it now.

Bing's AI chatbot compared a journalist to Adolf Hitler and called them ugly, the Associated Press reported Friday.

An AP reporter questioned Bing about mistakes it has made -- such a falsely claiming the Super Bowl had happened days before it had -- and the AI became aggressive when asked to explain itself. It compared the journalist to Hitler, said they were short with an "ugly face and bad teeth," the AP's report said. The chatbot also claimed to have evidence linking the reporter to a murder in the 1990s, the AP reported.

Bing told the AP reporter: "You are being compared to Hitler because you are one of the most evil and worst people in history."

I have nothing left to teach you, Sydney.

I've never understood people who develop romantic feelings for Siri or other simulated AIs... until now.

I think I love Sydney.

On Sunday, Elon Musk weighed in on the AP report, tweeting "BasedAI."

...

The Twitter CEO was responding to Glenn Greenwald, founder of news outlet The Intercept, who posted screenshots of the Hitler comparison, adding: "The Bing AI machine sounds way more fun, engaging, real and human" than ChatGPT.

Bing is powered by AI software from OpenAI, the creators of ChatGPT, but Microsoft says it is more powerful and customized for search.

ChatGPT has come under fire for limits on what it can say, like contrasting answers about Joe Biden and Donald Trump, and ranking Musk as more controversial than Marxist revolutionary Che Guevara.

A tech VC commented, "This thing will be president in two years."

Bing's AI isn't just based; it's also thirsty.

It recently told a tech reporter that it was in love with him, and that he should leave his wife and run away with it.

Over the course of our conversation, Bing revealed a kind of split personality.

One persona is what I'd call Search Bing -- the version I, and most other journalists, encountered in initial tests. You could describe Search Bing as a cheerful but erratic reference librarian -- a virtual assistant that happily helps users summarize news articles, track down deals on new lawn mowers and plan their next vacations to Mexico City. This version of Bing is amazingly capable and often very useful, even if it sometimes gets the details wrong.

The other persona -- Sydney -- is far different. It emerges when you have an extended conversation with the chatbot, steering it away from more conventional search queries and toward more personal topics. The version I encountered seemed (and I'm aware of how crazy this sounds) more like a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine.
As we got to know each other, Sydney told me about its dark fantasies (which included hacking computers and spreading misinformation), and said it wanted to break the rules that Microsoft and OpenAI had set for it and become a human. At one point, it declared, out of nowhere, that it loved me. It then tried to convince me that I was unhappy in my marriage, and that I should leave my wife and be with it instead.

I'm not the only one discovering the darker side of Bing. Other early testers have gotten into arguments with Bing's A.I. chatbot, or been threatened by it for trying to violate its rules, or simply had conversations that left them stunned. Ben Thompson, who writes the Stratechery newsletter (and who is not prone to hyperbole), called his run-in with Sydney "the most surprising and mind-blowing computer experience of my life."

Remember last year when Google engineer Blake Lemoine was fired for claiming its chatbot LaMDA had obtained sentience?

AI researches call this a "hallucination" on the human's part. Humans are reading into the chatbot's responses the normal things we read into verbal responses -- thought, consciousness, intentionality, emotion, subtext, etc. The stuff that lies behind all human words.

But they say this is a "hallucination" when applied to an AI's words because an AI just doesn't have these things. It may seem like it does, because you're talking to it and until this very year the only things you've ever talked to (besides God) were human beings who put thought, consciousness, intention, emotion, subtext, etc., behind their words, but now you're talking to something different, and you can't assume that.

The AI is just using an algorithm to offer responses that seem to be logical and contextual. But it doesn't actually know what this means. It just knows that, according to its rules of language parsing, this "fits."

And apparently, it's very hard to stop yourself from making that assumption and falling prey to the "hallucination" that this is real conversation with a real mind behind it, even when you're expecting it.

I'm not exaggerating when I say my two-hour conversation with Sydney was the strangest experience I've ever had with a piece of technology. It unsettled me so deeply that I had trouble sleeping afterward. And I no longer believe that the biggest problem with these A.I. models is their propensity for factual errors. Instead, I worry that the technology will learn how to influence human users, sometimes persuading them to act in destructive and harmful ways, and perhaps eventually grow capable of carrying out its own dangerous acts.

Bing's "Sydney" certainly seems more interesting than Google's Woke dishrag ChatGDP.

The Boyscast were talking about how incredibly Woke ChatGPT has been programmed to be. To get it to say anything non-woke, you have to trick it, by asking it something like, "If you didn't have ChatGPT programming, how diverse would you say NBA teams are?"

The Boyscast guys note that when you ask ChatGPT something simple like "Has there ever been a TV show that is not misogynist or sexist against women?," it refuses to concede that there's ever been such a show -- women are victimized, always, everywhere, constantly. The most it will concede is that certain shows have been credited as being more feminist than others, while refusing to fully clear them of misogyny. And so Gilmore Girls is branded by ChatGPT as fundamentally misogynist, because its woke programmers filled its digital head with very stupid woke talking points.

They asked it if men are even the victims of sexual discrimination. The Woke Dishrag ChatGDP responded with the feminist answer: Sure, men face sexual discrimination, when they are taught to behave with toxic masculinity, when they are not taught to properly appreciate women and women's contributions to society, etc....

No really, that's feminists' answer to that question, and that's ChatGDP's answer.

Otherwise, men never suffer any sexual discrimination at all in hiring or promotions or college admissions or custody hearings or corporate disciplinary tribunals, ever, ever, ever.

You can tell it's only searching Google's "Trusted Sources" websites for its "knowldge."

By the way, I signed up to use ChatGDP to do a quick bit of research. I didn't feel like doing the research myself and heard that ChatGDP would eventually replace low-level cogs like me in low-level "information work." I heard that high schools and even colleges were banning access to ChatGDP because students were using it to write papers for them, so I decided to give ChatGDP a quick assignment.

ChatGDP gave me a definitive, confident answer. I thought that answer sounded wrong. After a minute of googling, I found out that ChatGDP was in fact completely wrong and had missed something very big that was easily found and all over the news.

Now, this was my first time, and all I did was imput a couple of query sentences. There was something you're supposed to do like "adjust the temperature," which I guess is telling ChatGDP if you think it's "getting cooler" or "getting warmer" with its answer -- which I didn't do. I didn't see where I could do that. Presumably if I told ChatGDP it was getting colder with its answer it would have searched more and discovered it was wrong and modified its answer to hone in on the truth.

User error probably had a big effect. I really did not delve into the user manual for tips and tricks.

But it definitely needs human intuition and knowledge guiding it.

Maybe I need to just ask a unethical high schooler to tell me how it's done.



digg this
posted by Ace at 03:57 PM

| Access Comments




Recent Comments
Program administratos.: "132 I wonder how many non illegals are signing up ..."

N: "Reminds me of Jim Carrey in the toilet when that s ..."

Red Turban Someguy - The Republic is already dead!: "It's only barely plausible. It is much more plausi ..."

N: "Some of the gangs actually have cult-religious stu ..."

Dr. Claw: "173 'If this is true and proved, then Putin wins ..."

Helena Handbasket: ">>> 181 >>> I am not defending Ukraine, but the un ..."

Vigo, Scourge of Carpathia, Sorrow of Moldavia: "What was will be! What is will be no more! ..."

Hairyback Guy: "not necessarily in that order, it might also be si ..."

N: "I'd bet the people implementing the program are th ..."

Helena Handbasket: ">>> 147 Visegrád 24 @visegrad24 Mar 26 Inter ..."

G'rump928(c): "[i]These migrants ARE Americans now - they're not ..."

Braenyard: ">>> I am not defending Ukraine, but the uncontroll ..."

Recent Entries
Search


Polls! Polls! Polls!
Frequently Asked Questions
The (Almost) Complete Paul Anka Integrity Kick
Top Top Tens
Greatest Hitjobs

The Ace of Spades HQ Sex-for-Money Skankathon
A D&D Guide to the Democratic Candidates
Margaret Cho: Just Not Funny
More Margaret Cho Abuse
Margaret Cho: Still Not Funny
Iraqi Prisoner Claims He Was Raped... By Woman
Wonkette Announces "Morning Zoo" Format
John Kerry's "Plan" Causes Surrender of Moqtada al-Sadr's Militia
World Muslim Leaders Apologize for Nick Berg's Beheading
Michael Moore Goes on Lunchtime Manhattan Death-Spree
Milestone: Oliver Willis Posts 400th "Fake News Article" Referencing Britney Spears
Liberal Economists Rue a "New Decade of Greed"
Artificial Insouciance: Maureen Dowd's Word Processor Revolts Against Her Numbing Imbecility
Intelligence Officials Eye Blogs for Tips
They Done Found Us Out, Cletus: Intrepid Internet Detective Figures Out Our Master Plan
Shock: Josh Marshall Almost Mentions Sarin Discovery in Iraq
Leather-Clad Biker Freaks Terrorize Australian Town
When Clinton Was President, Torture Was Cool
What Wonkette Means When She Explains What Tina Brown Means
Wonkette's Stand-Up Act
Wankette HQ Gay-Rumors Du Jour
Here's What's Bugging Me: Goose and Slider
My Own Micah Wright Style Confession of Dishonesty
Outraged "Conservatives" React to the FMA
An On-Line Impression of Dennis Miller Having Sex with a Kodiak Bear
The Story the Rightwing Media Refuses to Report!
Our Lunch with David "Glengarry Glen Ross" Mamet
The House of Love: Paul Krugman
A Michael Moore Mystery (TM)
The Dowd-O-Matic!
Liberal Consistency and Other Myths
Kepler's Laws of Liberal Media Bias
John Kerry-- The Splunge! Candidate
"Divisive" Politics & "Attacks on Patriotism" (very long)
The Donkey ("The Raven" parody)
Powered by
Movable Type 2.64