The moderators of a pro-artificial intelligence Reddit community announced that they have been quietly banning "a bunch of schizoposters" who believe "they've made some sort of incredible discovery or created a god or become a god," highlighting a new type of chatbot-fueled delusion that started getting attention in early May.
Yep, that sounds like Reddit alright.
"LLMs [Large language models] today are ego-reinforcing glazing-machines that reinforce unstable and narcissistic personalities," one of the moderators of r/accelerate wrote in an announcement. "There is a lot more crazy people than people realise. And AI is rizzing them up in a very unhealthy way at the moment."
On the other hand, the 16GB 9060 XT at $349 is cheaper than even the 8GB model of the 5060 Ti at $379, making it the easy and obvious choice given that 8GB cards are terrible for recent games.
On the third hand, these MSRP numbers are all imaginary and we'll have to wait and see what is actually available.
Both the current Xbox and PlayStation have 16GB of unified RAM, so games ported from consoles to PC very often want more than 8GB of video RAM to run smoothly, or sometimes to run at all: on 8GB cards they can turn into slideshows at even modest settings, or simply crash.
Intel's B580 is not a fast card, but it does provide 12GB of VRAM at a nominal $249, a combination neither Nvidia nor AMD offers at that price.
I'm not sure what exactly is going on at Notebook Check, but maybe someone should check on them.
Anyway, the reason I bring this nonsense to your attention is this:
On May 20th 2025, MIT released an article breaking down the energy costs associated with each query run through a range of AI models, including Large Language Models (LLM) and image and video generators (Diffusion).
Even if you exclude the 50 gigawatt-hours of electricity it took to train GPT-4, the smallest text-based model with 8 billion parameters uses 57 joules of energy per response or 114 joules when accounting for cooling. On a large model with 50 times more parameters, that number climbs to 3,353 joules (6706 with cooling) for each response.
It would be counterintuitive to get into the maths here, as MIT do a far better job, likening each response to travelling 400 feet (122 meters) on an e-bike. Google processes around 158,500 searches every second. So, by MIT's maths, if we could capture the power associated with running Gemini for 1 second, a person could travel 19,337 kilometres on an e-bike, or roughly one and a quarter times around the planet.
The Earth is 40,000 kilometres in circumference.
Almost exactly. It was exact by definition, because the metre was originally defined as one ten-millionth of the distance from the equator to the North Pole, making the circumference 40,000 kilometres by construction. But the original survey got it slightly wrong, and by the time modern measurements found the real number it was a little too late to change.
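For the curious, the quoted arithmetic can be checked directly. This is a quick sanity check using only the figures from the excerpt above (122 metres per response, 158,500 searches per second, 40,000 km around the Earth):

```python
# Sanity-check the quoted e-bike numbers.
METERS_PER_RESPONSE = 122        # MIT's e-bike equivalent per AI response
SEARCHES_PER_SECOND = 158_500    # quoted Google query rate
EARTH_CIRCUMFERENCE_KM = 40_000  # nominal circumference

# Distance "ridden" per second of queries, in kilometres.
km_per_second = SEARCHES_PER_SECOND * METERS_PER_RESPONSE / 1000
print(f"{km_per_second:,.0f} km per second of queries")  # 19,337 km

# How many laps of the planet is that, really?
laps = km_per_second / EARTH_CIRCUMFERENCE_KM
print(f"{laps:.2f} trips around the Earth")  # 0.48
```

The 19,337 km figure checks out, but it's slightly less than half a lap of the planet, not "one and a quarter times around" as claimed. That's the nonsense.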