Back in July a researcher published a paper in which he showed that commonly used generative AI tools will happily answer questions about books that don't exist.
Microsoft Bing picked up that paper and promptly ingested this new and entirely imaginary knowledge.
Thankfully someone at Microsoft still cares about the truth - or at least about their share price if they are sufficiently embarrassed in the marketplace - and the fictional results have been removed.
But given the rate at which Google results have deteriorated in recent years, I don't expect that state of affairs to last long.
There is actually a point to this: while GPT-4 was enormously expensive to create and does a lot of things, all of them badly, a model a thousand times smaller can potentially do one thing tolerably well - and everything else not at all.