Nvidia's new GeForce RTX 5080 is slightly faster than last year's 4080 Super, which was slightly faster than the previous year's 4080.
It does best at 4K resolutions, where it's 9% faster, probably thanks to the much faster GDDR7 memory. At lower resolutions where memory bandwidth doesn't matter so much, performance gains average just 3%.
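If you want to sanity-check the bandwidth argument, here's a trivial Python sketch. The figures are the published memory specs as I understand them (roughly 736 GB/s for the 4080 Super, 960 GB/s for the 5080), so treat them as assumptions rather than gospel:

# Rough comparison of memory bandwidth between the two cards.
# Figures are approximate published specs, not measured numbers.
bandwidth = {"RTX 4080 Super (GDDR6X)": 736, "RTX 5080 (GDDR7)": 960}  # GB/s
ratio = bandwidth["RTX 5080 (GDDR7)"] / bandwidth["RTX 4080 Super (GDDR6X)"]
print(f"Memory bandwidth increase: {ratio - 1:.0%}")  # about 30%

A roughly 30% bandwidth jump against near-identical compute is at least consistent with the gains showing up mostly at 4K, where memory traffic dominates.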
And with average gains that low, yes, it is sometimes slower than the previous model. I'm not sure how, because by all the numbers it should be at least a little better.
All the benchmarks seem to pit it against AMD's 7900 XTX, which comes out looking pretty good: it's cheaper, includes 24GB of RAM compared to the 5080's 16GB, and is just as fast at everything except ray tracing.
This review bizarrely suggests skipping the 5080 for the 5090, which delivers 50% better performance for twice the price. Yes, if a $1000 video card isn't fast enough to justify the price, you should spend twice as much for something that is even worse value.
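For what it's worth, here's the value math in Python, using the numbers quoted above (a $1000-ish 5080, a 5090 at twice the price with about 50% more performance; actual street prices are anyone's guess):

# Rough price/performance comparison using the figures quoted above.
# Prices are approximate launch MSRPs; street prices will differ.
cards = {
    "RTX 5080": {"price": 1000, "relative_perf": 1.0},
    "RTX 5090": {"price": 2000, "relative_perf": 1.5},  # ~50% faster, twice the price
}
for name, c in cards.items():
    perf_per_dollar = c["relative_perf"] / c["price"]
    print(f"{name}: {perf_per_dollar * 1000:.2f} relative performance per $1000")
# The 5090 lands at about 0.75x the 5080's performance per dollar -
# better frame rates, worse value, exactly as noted above.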
We're going to see a repeat of this next month when the cheaper models - the 5070 Ti and 5070 - arrive. In March we might see something interesting as AMD's new cards launch.
But probably not.
Tech News
So Chinese AI platform DeepSeek reportedly ripped off OpenAI, but OpenAI ripped off everyone on the internet, so nobody cared.
And it was reportedly collecting massive amounts of data from its users, but so is everyone else on the internet, so nobody cared.
Meta is planning to develop - and release as almost-open-source - two new models in the Llama series to compete head-on with the closed-source ChatGPT.
(The Llama license has a provision that the very biggest tech companies - those with more than 700 million monthly users - can't use it for free, so technically it's not open source, but most of us don't need to worry about that.)