And that's not a typo. He was gone as of December 1, but the departure was so sudden that Intel didn't get an announcement out until the following day.
There's no word as to the reason. Short notice like this is often for health reasons, but Gelsinger was doing customer visits and photo ops just last week.
Users of the conversational AI platform ChatGPT discovered an interesting phenomenon over the weekend: the popular chatbot refuses to answer questions if asked about a "David Mayer." Asking it to do so causes it to freeze up instantly. Conspiracy theories have ensued - but a more ordinary reason may be at the heart of this strange behavior.
This was circulating on Twitter, but TechCrunch actually did a bit of digging:
Which brings us back to David Mayer. There is no lawyer, journalist, mayor, or otherwise obviously notable person by that name that anyone could find (with apologies to the many respectable David Mayers out there).
There was, however, a Professor David Mayer, who taught drama and history, specializing in connections between the late Victorian era and early cinema. Mayer died in the summer of 2023, at the age of 94. For years before that, however, the British-American academic faced a legal and online issue of having his name associated with a wanted criminal who used it as a pseudonym, to the point where he was unable to travel.
So that's presumably why there are restrictions on ChatGPT disseminating information about such individuals; they're victims of identity fraud. But why does it crash rather than simply decline to answer?
Because AI is itself a fraud:
The whole drama is a useful reminder that not only are these AI models not magic, but they are also extra-fancy auto-complete, actively monitored, and interfered with by the companies that make them. Next time you think about getting facts from a chatbot, think about whether it might be better to go straight to the source instead.
ChatGPT behaves like a nightmare hodgepodge of nonsense held together by duct tape and an inflated share price, because that's precisely what it is.