Trust in our journalistic institutions is only going to become more important as this kind of stuff gets easier to produce and more widespread.
I'm on the fence about how devastating this tech could be. On the one hand, this has always been a problem to some degree; it has always been easy to spread lies. Fox gets away with making its own version of reality every day, and has for a long time. People share old clips on Twitter claiming they're footage of a current event. And so on.
On the other hand, the thing that scares me most is that people trust our news outlets less than ever, and too many people blindly trust whatever garbage their friends share on Facebook.
I think this is bang on the money.
Those who've written off the "mainstream media" are among the many of us who will come to value credibility and journalistic integrity very highly in the future, even though they're often taken for granted today.
A lot of people talk about the problems of being able to produce fake content that can't be proven fake - and those problems are massive - but people seem to forget the dangers on the opposite side of the same coin. If everybody knows about deepfakes (or worse, knows about them at the level the average person knows about, say, 'hacking'), then it also becomes difficult to prove that a legitimate recording is real, especially to the layman. So if someone gets a recording of a politician making corrupt deals, the politician can just throw their hands up and say 'It's a fake! My political enemies are trying to stitch me up!', and formerly damning evidence becomes effectively useless.
If we want reliable recordings, we're going to need a step forward in recording technology for cameras. Devices could create and embed a cryptographic signature in the file that fails to verify if so much as a single bit is changed. The signed data could include the date, time, device serial number, whatever else you want. That would provide some sort of 'legal standard' for verifying evidence in a courtroom, for example. I doubt it'll help much with public opinion.
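Roughly, I'm picturing something like this - just a sketch in Python using the pyca/cryptography package, where the serial number, the "video data" and the key handling are all made up for illustration:

    # Sketch only: in a real camera the private key would live in
    # tamper-resistant hardware; here it's generated in software.
    import json
    import time

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    device_key = Ed25519PrivateKey.generate()   # unique per camera, set at the factory

    video_bytes = b"...raw video data..."       # stand-in for the actual recording
    metadata = json.dumps({
        "serial": "CAM-0042",                   # made-up serial number
        "recorded_at": int(time.time()),
    }).encode()

    # Sign the footage and its metadata together so neither can be swapped out later.
    signature = device_key.sign(video_bytes + metadata)

    # The camera would then write video_bytes, metadata and signature into the output file.

Change a single bit of the footage or the metadata and the signature no longer verifies.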
Sounds nice, but it doesn't sound feasible to me from a tech point of view. For any sort of video copy or edit protection, it will always be possible to recapture the video (basically, the high-tech version of pointing a camcorder at your TV to copy a videotape), edit that, and then re-apply the security measures (EDIT: with a different but still-valid key or whatever) to the edited file.
All the signature proves is that whoever has access to the key produced the file. It doesn't mean the video wasn't edited.
Yes, that was my point.
Not true - if every individual recording device has a unique key, that's the proof. The file can't be re-signed with that key by any other device, and the signature can't be forged by any other computer. You could make a copy, edit it, and sign it with your own key, but that key won't match the original.
These keys can be tied to the manufacturer, serial number of the device, the exact date of recording, and any other data we like. It's like a cryptographic chain of custody.
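To make that concrete, here's a toy verification check (Python with pyca/cryptography again; the keys and 'footage' are placeholders, and in practice the manufacturer would have to publish each camera's public key):

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    camera_key = Ed25519PrivateKey.generate()   # the key inside the original camera
    other_key = Ed25519PrivateKey.generate()    # any other device or computer

    footage = b"...recording plus embedded metadata..."
    signature = camera_key.sign(footage)

    def verifies(public_key, data, sig):
        """Return True only if sig is a valid signature over data for this key."""
        try:
            public_key.verify(sig, data)
            return True
        except InvalidSignature:
            return False

    print(verifies(camera_key.public_key(), footage, signature))               # True: untouched file
    print(verifies(camera_key.public_key(), footage + b"x", signature))        # False: file was altered
    print(verifies(camera_key.public_key(), footage, other_key.sign(footage))) # False: signed by another device

Any edit breaks the original signature, and no other device can produce a signature that checks out against the original camera's public key.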
Sure - what I left out of my comment is that such a system would probably still allow for verification in a court, as you say. But it doesn't enable the average person (or even, e.g., journalists) to determine whether a video is real or not. I wasn't saying the keys could somehow be forged.
The other thing to note is that even if every recording device has a private key we assume is unbreakable (so, no quantum-computing breakthroughs or whatever), that still only proves that video A was made with device X. If I edit it into a fake video file B and sign B as coming from device Y (the editing rig's key, or a key ripped from another camera of the same model), how does anyone looking at B and Y know that the real original was A from X? You'd be able to verify that a certain video came from a certain camera, but when the source is unknown or anonymous (e.g. a leak of a politician doing something embarrassing, where the recorder won't reveal themselves), it does nothing to establish whether the video itself is genuine or tampered with.
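In code the catch looks something like this (same sketch assumptions as above - made-up keys and footage):

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    attacker_camera = Ed25519PrivateKey.generate()   # "device Y" - the attacker's own hardware

    fake_clip = b"...edited footage re-encoded as if it were an original..."
    signature = attacker_camera.sign(fake_clip)

    # Verifies without error, because the file really was signed by device Y.
    attacker_camera.public_key().verify(signature, fake_clip)
    print("Valid signature - but that says nothing about whether the scene is real.")

The signature ties a file to a device, not to the scene it claims to show, so unless you know which device was supposed to have recorded the original, you've proven very little.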
Anytime a Reddit thread shifts my opinion on something, I have to wonder if it was engineered to do so. And if it wasn’t, then I’m staring at the blueprints.
Use two accounts to fake a discussion about illegal immigration between a compassionate republican and a snarling liberal with unpopular views.
Claim to be a black woman, and describe the abusive treatment you’ve received from all your previous black boyfriends, and that you now only date white men. Reply with other fake accounts that, as a black woman, this has also been your overwhelming experience, and that you’re glad it’s being brought up.
Claim you’re a cardiologist who’s helped low-income seniors for the last two decades, but will likely have to stop due to Obamacare. Add in jargon to taste.
Most readers will consume it quickly and move on, and their model of the world will have been updated.
You know, I've most likely subconsciously changed my opinion on many topics after reading Reddit comment chains that looked informed. Thanks for bringing this up; I need to be more conscious of it.
Where does the deepfakes community live nowadays? It started on Reddit but was banned. I know there was some activity on Voat for a while, but it looks pretty dead over there now too. It seems like people are just losing interest, which is surprising because it's such a powerful technique. Surely people are still refining the process somewhere, right?
Which do you think is the bigger danger: the ability to generate content that misrepresents reality or the ability to invalidate or dismiss actual footage?
For me it's resoundingly the latter: the ability to invalidate or dismiss actual footage. This is right up the alley of Russian and Trump-era GOP propaganda teams. There is no truth; don't believe anything you see.
More like "The truth is unknowable, so just give up and listen to me"
The ability to do this has been around for a while now, just in cruder forms. I remember when Contact came out, there was a big stir because they had taken real footage of a Clinton speech about the Mars meteorite findings and edited it so he appeared to be commenting on the film's events. With Photoshop, After Effects and deepfakes, it only gets easier. Sadly, viral bullshit spreads like wildfire, which is why we see so many conspiracy theories pop up around literal fake news.
There will be an arms race; there's already ML research going into detecting deepfakes.
Generative adversarial networks, but at the speed of publishing papers.