LLM vendors are incredibly bad at responding to security issues
If a model is trained on data that it would be a security breach to reveal to users, then the real breach occurred at training time.
Now you know that and I know that.
The big LLMs everyone's talking about and using are just advanced forms of theft
Sloppy LLM programming? Never!
In completely unrelated news I've been staring at this spinner icon for the past five minutes after asking an LLM to output nothing at all:
same energy as “your request could not be processed due to the following error: Success”
What are the chances that the front end was not programmed to handle the LLM returning an empty string?
Quite likely yeah. There's no way they don't have a timeout on the backend.
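For what it's worth, the failure mode you're both describing is easy to sketch: `""` is falsy in JavaScript, so a naive `if (text) render(text)` never tears down the spinner when the model returns nothing. A minimal sketch (the `fetchCompletion` stub and the `"(empty reply)"` placeholder are my assumptions, not anyone's real API):

```typescript
// Stand-in for whatever call the real front end makes to the LLM backend.
// Here it simulates the model obeying "output nothing at all".
async function fetchCompletion(prompt: string): Promise<string> {
  return "";
}

// Treat an empty completion as a terminal state instead of leaving
// the spinner up: a naive truthiness check on `text` would skip
// rendering entirely, which matches the five-minute spinner above.
async function getReply(prompt: string): Promise<string> {
  const text = await fetchCompletion(prompt);
  return text.trim() === "" ? "(empty reply)" : text;
}
```

Even with a backend timeout, the front end still has to handle the "request succeeded but the body is empty" case separately from the error path.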
Now I wonder if you could make the LLM fuzz the backend by asking it to reply with some possible attack payloads.
They're surprisingly skilled at getting money from idiots.
Their previous experience in crypto is shining through.
My NSFW reply, including my own experience, is here. For this crowd, though, what I'd point out is that this was always part of the mathematics, just like confabulation; the only surprise should be that the prompt doesn't need to saturate the context window in order to approach an invariant distribution. I only have two nickels so far, one for this Markov property and one for confabulation following from PAC learning, but it's weird that it's happened twice.
Lol that's like expecting gold rushers to be squared away with OSHA, I hope nobody's surprised here
These guys barely have enough staff to run the model, lol.
Not really a security issue I'd say. The AI speaking gibberish when you try to make it speak gibberish isn't really that big of an issue.