Open source depends on copyright. If you write something - software, here, but anything - it is your creation, and you have the right to sell it or give it away on terms you decide. Large open source projects need permission from all their contributors to include their code in the project.
Generative AI trashes all of that. The products of generative AI are generally not copyrightable, and the models don't keep track of where things came from either. A piece of code from an AI could be nominally original, or adapted from an open source project, or copied verbatim from a copyrighted source that the AI might not even have had legal access to (Anthropic had to pay out $1.5 billion over similar problems with its training data).
Not the specific exploits, but buffer overruns, remote code execution, or simple authentication bypasses where the code never checks the password at all.
The answer? Because we're still dealing with code that was likely written in 1995:
"But when you're dealing with legacy code - we've actually seen some C++ applications where you have literally thousands of overflow issues and the original developers are long gone - it's very difficult to get a new developer to look at it, and they don't really want to touch the code. They get to a point where it's like: Well, prove to me it's exploitable, because this is a critical old piece of code that no one understands and it's dangerous to touch it."