From a legal/evidence perspective, what will happen when it becomes impossible to tell the difference between an AI-generated video and the real thing?
My prediction: verified video will start to become a thing.
Phones will be able to embed a digital signature in a video that certifies the date, time, and location where it was captured. Modifying the video in any way will invalidate the signature.
Same for photos.
People will stop believing photos and video that don't have a verifiable signature. Social networks and news organizations will automatically verify the signatures of all photos and videos they display.
Technically this is already possible today, it just needs to become mainstream and the default.
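A minimal sketch of what that capture-signing could look like, using only the Python standard library. An HMAC stands in for the asymmetric signature a real scheme (e.g. C2PA Content Credentials) would use, and the key, field names, and values are all illustrative:

```python
import hashlib
import hmac
import json

# Hypothetical device secret; a real phone would keep an asymmetric
# key pair inside secure hardware rather than a shared secret.
DEVICE_KEY = b"phone-secure-element-key"

def sign_capture(media: bytes, timestamp: str, location: str) -> dict:
    """Bind the media bytes to capture metadata, then sign the bundle."""
    manifest = {
        "sha256": hashlib.sha256(media).hexdigest(),
        "timestamp": timestamp,
        "location": location,
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    manifest["signature"] = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return manifest

def verify_capture(media: bytes, manifest: dict) -> bool:
    """Any change to the media bytes or the metadata invalidates it."""
    claimed = {k: v for k, v in manifest.items() if k != "signature"}
    if hashlib.sha256(media).hexdigest() != claimed["sha256"]:
        return False
    payload = json.dumps(claimed, sort_keys=True).encode()
    expected = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, manifest["signature"])

video = b"\x00\x01 raw video bytes"
m = sign_capture(video, "2024-05-01T12:00:00Z", "40.7128,-74.0060")
verify_capture(video, m)            # True: untouched
verify_capture(video + b"edit", m)  # False: modified after capture
```

Because the signature covers both the content hash and the metadata, editing a single frame or back-dating the timestamp breaks verification.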
Even that isn't enough. Hashing can confirm a file hasn't been modified since it was signed, but it says nothing about what happened before signing. If you created an entirely new file, there's no way to prove it wasn't faked first and then had a valid signature applied.
You'd need secure chips that can't reveal the key, and those would be signed by a trusted authority.
Then there'd be a black market for valid chips, or maybe some tomfoolery to make a camera think it's seeing something that's being fed into it via a different input.
Oh yeah, forgot about cryptographic hashing for a moment there. Though that "trusted authority" is the weak point; it's probably only a matter of time before it is corrupted and hands keys out to powerful actors.
Or the same goes for any workers with access to either the central key or one of the others. And if a specific company's key gets leaked, does that mean anything produced by their devices can no longer be trusted? If there's an inconvenient video in existence, will the way to defeat it be to leak the private key protecting its hashes, then claim the hackers must have gotten hold of the key earlier than everyone else to explain how that video existed before the key was leaked publicly?
Or just have someone break into each of their systems and steal the keys, but leave evidence that it happened, and use that to create reasonable doubt about whatever videos they want to call fake news.
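The leaked-key scenario is easy to demonstrate with the same toy machinery: verification only answers "was this signed by the key?", never "did this happen?", so once the key is out, a fabrication verifies exactly like a genuine capture. Key and content are illustrative:

```python
import hashlib
import hmac

# Illustrative leaked device secret; once public, anyone can produce
# signatures indistinguishable from the real device's.
LEAKED_DEVICE_KEY = b"phone-secure-element-key"

def sign(key: bytes, media: bytes) -> str:
    return hmac.new(key, media, hashlib.sha256).hexdigest()

def verify(key: bytes, media: bytes, sig: str) -> bool:
    return hmac.compare_digest(sign(key, media), sig)

fake_clip = b"entirely fabricated footage"
forged_sig = sign(LEAKED_DEVICE_KEY, fake_clip)
verify(LEAKED_DEVICE_KEY, fake_clip, forged_sig)  # True: forgery passes
```

Which is why real deployments lean on key revocation and trusted timestamps, and why the "when exactly did the key leak?" question in the comments above matters so much.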