Apr 15, 2026

GIGO

Disclaimer: Everything I'm saying here is from my own limited knowledge and experience of what it takes to "program" a computer.

AI has already shown a nasty tendency to hallucinate. It's programmed to go out and scoop up as much data as possible. Generally, that data has been processed down into actual information, but sometimes it's been contaminated by "raw data" that's just out there floating around.

And as each AI thingie keeps eating everything put out by all the other AI thingies, there's a fair likelihood that we're going to get a kind of closed-loop information track that eventually leads to what we used to call Machine Psychosis.

i.e.:
10 PRINT "Help, I'm stuck in a loop!"
20 GOTO 10

One of the problems right now is that everybody and his fuckin' uncle has been putting up an AI thingie, and they're all pretty much feeding on each other. So if some random yahoo throws some totally bogus info into cyberspace, it's going to get swept up and included in practically everybody's output.
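You can watch that closed loop eat itself in about a dozen lines of code. This is just my own toy sketch, not anything from a real AI system: each "generation" of models learns only from a small sample of what the previous generation put out. The rare stuff gets lost first, and eventually the whole pool of "information" flatlines.

```python
import random
import statistics

# Toy illustration of the closed loop (my own sketch, not a real AI pipeline):
# each "generation" fits itself to a small sample of the previous
# generation's output, then generates the training data for the next one.

random.seed(0)
N = 20  # how many samples each generation gets to learn from

# Generation 0: "real world" data with plenty of spread (variety).
data = [random.gauss(0.0, 10.0) for _ in range(N)]
start = statistics.stdev(data)

for gen in range(1000):
    mu = statistics.mean(data)
    sigma = statistics.stdev(data)
    # The next "model" only ever sees what the last one generated.
    data = [random.gauss(mu, sigma) for _ in range(N)]

end = statistics.stdev(data)
print(f"spread of generation 0:    {start:.3f}")
print(f"spread of generation 1000: {end:.6f}")
```

Run it and the spread of generation 1000 is a tiny fraction of generation 0's. Nothing fancy is going on: every pass through the loop loses a little of the original variety, and with nothing from the outside world coming in, the loss compounds.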

So the big problem, as I see it, is that most AI engines haven't been taught any great skills of discernment - kinda like most Americans haven't. Not yet anyway.

