Neeva was a startup founded by ex-Google engineers - back when Google still had engineers - to build a better search engine.
The founders had noted a fundamental problem with Google: because it is funded by advertising, and can only fit so many ads on a page, it has a deep incentive not to push the best search results to the top.
So Neeva built its own search engine aimed at paying customers - and went broke, because people didn't want to pay for a better solution when the bad solution was free.
How do we get out of this bind?
I see two possible avenues, both generally applicable:
One, an organisation that benefits from good search tools internally, and competes with Google in other areas, open-sources its work: first, because that attracts lots of developers contributing free labour, and second, because it blows a hole in the competition's revenue stream. Facebook has done this with its AI research, clearly aiming to wreck OpenAI and accidentally doing some good in the process.
Two, collaborative effort. One company can't afford $10 billion to develop a better search engine, but millions of developers pooling their resources? It's not Facebook's own AI research that has doomed OpenAI to extinction, but hobbyists frantically iterating on incomprehensibly sophisticated algorithms at 3AM so they can produce funny videos.
Derek Lowe is a research chemist working in the pharmaceutical industry and not a solid-state physicist, but he's good at sniffing out suspect research papers and doesn't smell anything obviously rotten here.
Maybe I should hold off on those 96GB memory kits if 128GB is coming so soon.
Also, 24Gb GDDR7 could salvage the 4060 Ti. Moving from 8GB of 18Gbps GDDR6 to 12GB of 32Gbps GDDR7 would provide 50% more memory and nearly 80% more memory bandwidth, fixing both of the card's issues at once without adding more chips.
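A quick back-of-the-envelope check of that arithmetic - a sketch in Python, assuming the 4060 Ti's 128-bit bus and one 32-bit channel per memory package; the GDDR7 figures are the speculative ones above, not a shipping part:

# Rough capacity/bandwidth comparison for a hypothetical GDDR7 4060 Ti.
# Assumptions: 128-bit bus, four memory packages of one 32-bit channel each.
BUS_WIDTH_BITS = 128
CHIPS = BUS_WIDTH_BITS // 32

def memory_config(density_gbit, speed_gbps):
    capacity_gb = CHIPS * density_gbit / 8            # chip density is in gigabits
    bandwidth_gb_s = BUS_WIDTH_BITS * speed_gbps / 8  # GB/s across the whole bus
    return capacity_gb, bandwidth_gb_s

gddr6 = memory_config(16, 18)  # today's 4060 Ti: 16Gb chips at 18Gbps
gddr7 = memory_config(24, 32)  # hypothetical: 24Gb GDDR7 chips at 32Gbps

print(f"GDDR6: {gddr6[0]:.0f} GB, {gddr6[1]:.0f} GB/s")   # 8 GB, 288 GB/s
print(f"GDDR7: {gddr7[0]:.0f} GB, {gddr7[1]:.0f} GB/s")   # 12 GB, 512 GB/s
print(f"capacity +{(gddr7[0] / gddr6[0] - 1) * 100:.0f}%, "
      f"bandwidth +{(gddr7[1] / gddr6[1] - 1) * 100:.0f}%")  # +50%, +78%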
The AD106 chip in the 4060 Ti does not, of course, support GDDR7.