With the current political climate and fake news issues, I'm genuinely surprised this sort of technology hasn't become more of an issue already, and I wonder when it will start to happen. Fake news articles already circulate quickly, so why haven't fake videos like this made the rounds? At least if they have, I haven't heard about it yet.
I'm not so sure that it's "collapse of society" severe just yet, but like the article says, this technology's potential for abuse is pretty high. I figure it's only a matter of time before people start focusing on making fake videos with the intent of spreading misinformation.
I wonder if at some point we'll start to rely on cryptographic signatures to verify authenticity of photos. You could have a hardware security module with a key in it for every camera. Then, optionally, you could sign photos with that key. People with your public key would be able to verify that the photo was at least taken by that camera.
Of course this won't solve all of the problems this technology causes, but it should make false positives hard to produce. False negatives, on the other hand (where a person claims a real image is fake because it lacks a signature), will still be an issue. Were signatures like this to become common, I could see governments leveraging them to their advantage: all state media would be signed, and non-state media would be dismissed as untrustworthy.
Before you know it, we're going to see deep fakes porn starring political leaders and fake children or fake animals.
I would love to see politicians brought down this way, especially conservatives. I see nothing wrong with this sort of character assassination as a tool for enacting political change.
You see nothing wrong with spreading blatant misinformation and slander to further a political goal? That's kind of ridiculous.
The end justifies the means. Republicans have been spreading blatant misinformation and slander to push their agenda since 1994. Why shouldn't we do the same? Because we're better than them?
We aren't. We used to fight just as dirty, if not dirtier. Hillary's email server is bupkis compared to how we used to do things in Tammany Hall. And if we want to drive the Republicans out of power, we need to be willing to do so by any means necessary.
If that means Mrs. Pence has to see a video of Mike cavorting with Stormy Daniels, so be it.
You can't control who's going to be brought down by something like this. It's naive to assume attacks won't affect both sides.
In fact, I'd argue that conservatives would be more likely to use dirty tactics like this to squash the liberals, as they've been more prone to underhanded tactics in recent years.
That also works for me. If things get to the point where you can't even run for town tax collector without somebody deep-faking you into a donkey show video and putting it on YouTube, people might finally stop being so goddamned uptight about "sex scandals".
Misinformation is a poison. You might have the antidote in hand, but it's a better policy not to spread poison around.
The poison has already been spread around. We're just arguing over whether it's OK to repay the poisoners in their own toxic coin.
There has been work on detecting faked videos, though fakes will obviously keep improving, and it will probably turn into an arms race between creating and detecting them.
In the near term, one solution would be to require high-resolution video for a clip to be considered trustworthy, since high-resolution fakes are harder and more time-consuming to produce.
In the long term, it will likely become necessary to build a cryptographic solution, as @teaearlgraycold mentioned, with videos signed by the people starring in them.