How are we going to have evidence in trials when deepfakes and AI are getting so good at "being real"?
If AI and deepfakes can listen to video or audio of a person and then successfully reproduce that person, what does this mean for trials?
It used to be that audio or video recordings carried strong evidentiary weight, often more than witness testimony, but soon enough perfect forgeries could enter the courtroom just as they're entering social media (where you're not sworn to tell the truth, though the consequences are real).
I know fake information is a problem everywhere, but I started wondering what will happen when it creeps into testimony.
How will we defend ourselves while still using real video or audio as proof? Or are we just doomed?
The sort of case I was thinking of is one where different parties present different versions of an image or video, and you want to establish which version is altered and which is original.
You still have the same problem, though. You can produce a camera in court and reject one of the images, but you still need to prove that the camera wasn't tampered with and that it was the one at the scene of the crime.
In this case, digitally signing an image verifies that the image was generated by a specific camera (not just any camera of that brand) and that the image generated by that camera looks such and such a way. If anyone later edits the image, its hash won't match the one in the signature, so the tampering will be apparent.
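A minimal sketch of that tamper-evidence idea, using only Python's standard library. A real camera-signing scheme (e.g. the C2PA "Content Credentials" approach some camera makers are adopting) would use an asymmetric key pair in the camera's secure hardware, so verifiers never hold a secret; here an HMAC with a hypothetical per-camera key stands in for the signature, just to show how any edit breaks verification:

```python
import hashlib
import hmac

# Hypothetical per-camera secret. In a real scheme this would be the
# private half of an asymmetric key pair (e.g. Ed25519) embedded in the
# camera's secure element, with only the public key published.
CAMERA_KEY = b"key-baked-into-this-specific-camera"

def camera_sign(image_bytes: bytes) -> bytes:
    """What the camera does at capture time: sign a hash of the image."""
    digest = hashlib.sha256(image_bytes).digest()
    return hmac.new(CAMERA_KEY, digest, hashlib.sha256).digest()

def verify(image_bytes: bytes, signature: bytes) -> bool:
    """What a court could do later: recompute the hash and check it
    against the signature that shipped with the file."""
    expected = hmac.new(CAMERA_KEY,
                        hashlib.sha256(image_bytes).digest(),
                        hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

original = b"\x89PNG...raw image bytes from the sensor..."
sig = camera_sign(original)

print(verify(original, sig))                    # True: image untouched
print(verify(original + b"one edit", sig))      # False: any change breaks it
```

The key property is that flipping even a single bit of the image changes its SHA-256 hash, so no signature issued for the original can match an edited copy.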
What it can't do is tell you if someone pasted a printout of some false image over the lens, or in some other sophisticated way presented a doctored scene to the camera. But there's nothing preventing us from doing that today.
The question was about deepfakes, right? So this is one tool to address that, but certainly not the only one the legal system would want to use.
Usually when I see non-technical people throw out ideas like this, they're stupid, but I've been thinking about this one for a few minutes and it's actually kinda smart.
I think that's exactly how it's going to work. You can't force all 'fake' sources to carry signatures; it's too easy for a malicious actor to produce an unsigned fake. Instead you have to create trusted sources of real images, which is much easier and more secure.