
The White House wants to 'cryptographically verify' videos of Joe Biden so viewers don't mistake them for AI deepfakes

Biden's AI advisor Ben Buchanan said a method of clearly verifying White House releases is "in the works."

npaladin2000 ,
@npaladin2000@lemmy.world avatar

If the White House actually makes the deep fakes, do they count as "fakes?"

drathvedro ,

I've been saying for a long time now that camera manufacturers should just put encryption circuits right inside the sensors. Of course that wouldn't protect against pointing the camera at a screen showing a deepfake, or against someone painstakingly dissolving the top layers and tracing out the private key manually, but it'd be enough of a deterrent against forgery. And media production companies should actually put out all their stuff digitally signed. Like, come on, it's 2024 and we still don't have a way to find out if something was filmed or rendered, cut or edited, original or freebooted.
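
Roughly what I mean, as a minimal sketch (assuming an Ed25519 keypair provisioned into the sensor at the factory; the key handling and frame bytes here are purely illustrative, not any real camera API):

```python
# Sketch: the sensor signs each frame; anyone with the maker's public key can verify.
# Assumes the `cryptography` package; names and data are illustrative.
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

sensor_key = ed25519.Ed25519PrivateKey.generate()   # would live inside the sensor
public_key = sensor_key.public_key()                 # published by the manufacturer

frame = b"raw sensor readout for frame 0001"
signature = sensor_key.sign(frame)                   # emitted alongside the frame

try:
    public_key.verify(signature, frame)              # any viewer can run this check
    print("frame came from this sensor and wasn't modified")
except InvalidSignature:
    print("frame was altered or didn't come from this sensor")
```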

GeneralVincent ,
drathvedro ,

Oh, they've actually been developing that! Thanks for the link, I was totally unaware of the C2PA thing. Looks like the ball has been very slowly rolling ever since 2019, but now that Google is on board (they joined just a couple of days ago), it might fairly soon be visible/usable by ordinary users.

Mark my words, though, I'll bet $100 that everyone's going to screw it up miserably on their first couple of generations. Camera manufacturers are going to cheap out on electronics, allowing for data substitution somewhere in the pipeline. Every piece of editing software is going to be cracked at least a few times, allowing for fake edits. And production companies will most definitely leak their signing keys. Maybe even Intel/AMD could screw up again big time. But, maybe in a decade or two, given the pace, we'll get a stable and secure enough solution to become the default, like SSL currently is.

petrol_sniff_king ,
drathvedro ,

Oh, so Adobe already screwed it up miserably. Thanks, had a good laugh at it

Natanael , (edited )

Oof.

They need to implement content addressing for "sidecar" signature files (add a hash) both to prevent malleability and to allow independent caches to serve up the metadata for images of interest.

Also, the whole certificate chain and root of trust issues are still there and completely unaddressed. They really should add various recommendations for default use, like not trusting anything by default and only showing that a signature exists but treating it as unvalidated until the keypair owner has been verified. Accepting a signature just because a CA is involved is terrible, and that being a terrible idea is exactly why web browsers dropped support for displaying extended validation certificate metadata (because that extra validation by CAs was still not enough).

And signature verification should be mandatory for every piece, dropping old signatures should not be allowed, and metadata which isn't correctly signed shouldn't be displayed. There are even schemes for compressing multiple signatures into one smaller signature blob so you can do this while saving space!

And one last detail: they really should use timestamping via "transparency logs" when publishing photos like this to support the provenance claims. When trusted sources use timestamping like this before publication, it helps verify "earliest seen" claims.
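
For the content-addressing part, a minimal sketch of what I mean (the field names are illustrative, not the actual C2PA manifest format): name the sidecar file by the hash of its own bytes, so it can't be silently swapped and any cache can serve it.

```python
# Sketch: content-address a "sidecar" signature file by the hash of its bytes,
# so tampering changes its name and independent caches can serve it safely.
import hashlib
import json

sidecar = json.dumps({
    "image_sha256": "ab12...",           # hash of the image the metadata describes (placeholder)
    "signature": "base64-signature...",  # signature over that hash (placeholder)
    "signer": "example.org",             # illustrative signer identity
}, sort_keys=True).encode()

sidecar_id = hashlib.sha256(sidecar).hexdigest()
print(f"{sidecar_id}.json")              # fetch from any cache, verify by re-hashing
```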

hyperhopper ,

If you've been saying this for a long time please stop. This will solve nothing. It will be trivial to bypass for malicious actors and just hampers normal consumers.

drathvedro ,

You must be severely misunderstanding the idea. The idea is not to encrypt it in a way that it's only unlockable by a secret and hidden key, like DRM or cable TV does, but to do the reverse - to encrypt it with a key that is unlockable by a publicly available and widely shared key, where successful decryption acts as proof of content authenticity. If you don't care about authenticity, nothing is stopping you from spreading the decrypted version, so it shouldn't affect consumers one bit. And I wouldn't describe "get a bunch of cameras, rip the sensors out, carefully and repeatedly strip the top layers off and scan them with an electron microscope until you get to the encryption circuit, repeat enough times to collect enough scans undamaged by the stripping process to manually piece them together and trace out the entire circuit, then spend a few weeks debugging it in a simulator to work out the encryption key" as "trivial".

hyperhopper ,

I think you are misunderstanding things or don't know shit about cryptography. Why the fuck are you even talking about publicly unlockable encryption? This is a use case for verification like a MAC signature, not any kind of encryption.

And no, your process is wild. The actual answer is just replace the sensor input to the same encryption circuits. That is trivial if you own and have control over your own device. For your scheme to work, personal ownership rights would have to be severely hampered.

drathvedro ,

I think you are misunderstanding things or don't know shit about cryptography. Why the fuck are you even talking about publicly unlockable encryption? This is a use case for verification like a MAC signature, not any kind of encryption.

Calm down. I was just dumbing down public key cryptography for you

The actual answer is just replace the sensor input to the same encryption circuits

This will not work. The encryption circuit has to be right inside the CCD, otherwise it will be bypassed just like TPM before 2.0 was - by tampering with the unencrypted connection between the sensor and the encryption chip.

For your scheme to work, personal ownership rights would have to be severely hampered.

You still don't understand. It does not hamper ownership rights or the right to repair, and you are free to not use it at all. All this achieves is basically camera manufacturers signing every frame with "Yep, this was filmed with one of our cameras". You are free to view and even edit the footage as long as you don't care about this signature. It might not be useful for, say, a movie, but when looking for original, uncut and unedited footage - like, for example, a news report - this'll be a godsend.

Natanael ,

Analog hole, just set up the camera in front of a sufficiently high resolution screen.

You have to trust the person who owns the camera.

drathvedro ,

Yes, I've mentioned that in the initial comment, and, I gotta confess, I don't know shit about photography, but to me it sounds like a very non-trivial task to make such a shot appear legitimate.

hyperhopper ,

It's not. Wait till you find out how they made movies before CGI!

Natanael ,

A MAC is symmetric and can thus only be verified by you or somebody who you trust to not misuse or leak the key. Regular digital signatures are what's needed here.

You can still use such a signing circuit but treat it as an attestation by the camera's owner, not as independent proof of authenticity.

hyperhopper ,

A MAC is symmetric and can thus only be verified by you or somebody who you trust to not misuse or leak the key.

You sign them against a known public key, so anybody can verify them.

Regular digital signatures are what's needed here. You can still use such a signing circuit but treat it as an attestation by the camera's owner, not as independent proof of authenticity.

If it's just the camera's owner attesting, then just have them sign it. No need for expensive complicated circuits and regulations forcing these into existence.

Natanael ,

You can't use a MAC for public key signatures. That's ECC, RSA, and similar.

Drewelite ,

Thank you, lol. This is what people end up with when they go with the first solution that comes to mind. Often it's just something that makes life harder for everyone EXCEPT bad actors. This just creates hoops for people following the rules to jump through while giving the impression the problem was solved, when it's not.

helenslunch ,
@helenslunch@feddit.nl avatar

I mean they could just create a highly-secure official Fediverse server/account?

stockRot ,

What problem would that solve?

helenslunch ,
@helenslunch@feddit.nl avatar

An official channel to post and review deepfakes for accuracy.

otl ,
@otl@hachyderm.io avatar

A link to the video could be shared via ActivityPub.
The video would be loaded over HTTPS; we can verify that the video is from the white house, and that it hasn't been modified in-transit.

A big issue is that places don't want to share a link to an independently verifiable video; they want you to load a copy of it from their website/app. This way we build trust with the brand (e.g. New York Times), and spend more time looking at ads or subscribing.
@stockRot @technology

stockRot ,

A big issue is that places don't want to share a link to an independently verifiable video, they want you to load a copy of it from their website/app.

Exactly. This "solution" doesn't take into account how people actually use the Internet. Unless we expect billions of people to change their behavior, this is just a pointless comment.

otl ,
@otl@hachyderm.io avatar

Might be closer than you think. The White House is just using Instagram right now: https://www.whitehouse.gov
(See section “featured media”)

@stockRot @technology

hyperhopper ,

Just because you're writing this on the fediverse doesn't mean it's the answer to everything. It's certainly not the answer to this.

helenslunch ,
@helenslunch@feddit.nl avatar

Sick Strawman bro

Blackmist ,

Honestly I'd say that's on the way for any video or photographic evidence.

You'd need a device private key to sign with, probably internet connectivity for a timestamp from a third party.

Could have lidar included as well so you can verify that it's not pointing at a video source of something fake.

Is there a cryptographically secure version of GPS too? Not sure if that's even possible, and it's the weekend so I'm done thinking.

SpaceCowboy ,
@SpaceCowboy@lemmy.ca avatar

It's way more feasible to simply require social media sites to do the verification and display something like a blue check on verified videos.

This is actually a really good idea. Sure there will still be deepfakes out there, but at least a deepfake that claims to be from a trusted source can be removed relatively easily.

Theoretically a social media site could boost content that was verified over content that isn't, but that would require social media sites to not be bad actors, which I don't have a lot of hope in.

kautau , (edited )

I agree that it’s a good idea. But the people most swayed by deepfakes of Biden are definitely the least concerned with whether their bogeyman, the “deep state” has verified them

captain_aggravated ,
@captain_aggravated@sh.itjust.works avatar

Yeah it's not going to change the mind of the folks making the deepfakes.

Natanael ,

Positioning using distance bounded challenge-response protocols with multiple beacons is possible, but none of the positioning satellite networks supports it. And you still can't prove the photo was taken at the location, only that somebody was there.

ZombiFrancis ,

It would become quite easy to dismiss anything for not being cryptographically verified simply by not cryptographically verifying.

I can see the benefit of having such verification but I also see how prone it might be to suppressing unpopular/unsanctioned journalism.

Unless the proof is very clear and easy for the public to understand the new method of denial just becomes the old method of denial.

abhibeckert ,

It would be nice if none of this was necessary... but we don't live in that world. There is a lot of straight up bullshit in the news these days especially when it comes to controversial topics (like the war in Gaza, or Covid).

You could go a really long way by just giving all photographers the ability to sign their own work. If you know who took the photo, then you can make good decisions about whether to trust them or not.

Random account on a social network shares a video of a presidential candidate giving a speech? Yeah maybe don't trust that. Look for someone else who's covered the same speech instead, obviously any real speech is going to be covered by every major news network.

That doesn't stop ordinary people from sharing presidential speeches on social networks. But it would make it much easier to identify fake content.

jabjoe ,
@jabjoe@feddit.uk avatar

Once people get used to cryptographically signed videos, why only trust one source? If a news outlet is found signing a fake video, they will be in trouble - loss of said trust, if nothing else.

We should get to the point we don't trust unsigned videos.

FlyingSquid ,
@FlyingSquid@lemmy.world avatar

If a news outlet is found signing a fake video, they will be in trouble.

I see you've never heard of Fox News before.

https://en.wikipedia.org/wiki/Fox_News_controversies#Video_footage_manipulation

OsrsNeedsF2P ,

Yes, and now people don't trust Fox News, to the point it is close to being banned from being used as a source for anything on Wikipedia

FlyingSquid ,
@FlyingSquid@lemmy.world avatar

I don't know that 'about to be banned by Wikipedia' is a good metric for how much the general American public trusts Fox News. It could be that most of them don't, but that is not a good way to tell considering there's no general public input on what Wikipedia accepts as a source.

Also, it should have been banned by Wikipedia years ago.

ZombiFrancis ,

Not trusting unsigned videos is one thing, but will people be judging the signature or the content itself to determine if it is fake?

Why only one source should be trusted is a salient point. If we are talking trust: it feels entirely plausible that an entity could use its trust (or power) to manufacture a signature.

And for some it is all too relevant that an entity like the White House (or the gamut of others, past or present) has certainly presented false information as true to do things like invade countries.

Trust is a much more flexible concept, and one that is easily bent. And so cryptographic verification really has to demonstrate how and why something is fake to the general public. Otherwise it is just a big 'trust me bro.'

jabjoe ,
@jabjoe@feddit.uk avatar

You're right that cryptographic verification can only prove someone signed the video. But it will mean that nutters sharing "BBC videos" which don't have the BBC signature can basically be dismissed straight off. We are already in a soup of misinformation, so the source being cryptographically provable is a step forward. Whether you trust those sources or not is another matter, but at least you know if it's the true source or not. If a source abuses the trust it has, it loses that trust.

surewhynotlem ,

Fucking finally. We've had this answer to digital fraud for ages.

BrianTheeBiscuiteer ,

Sounds like a very Biden thing (or a thing for anyone well into their Golden Years) to say, "Use cryptography!", but it's not without merit. How do we verify file integrity? How do we digitally sign documents?

The problem we currently have is that anything that looks real tends to be accepted as real (or authentic). We can't rely on humans to verify authenticity of audio or video anymore. So for anything that really matters we need to digitally sign it so it can be verified by a certificate authority or hashed to verify integrity.

This doesn't magically fix deepfakes. Not everyone will verify a video before distribution, and you can't verify a video that's been edited for time, reformatted, or broadcast on TV. But it's a start.

SpaceCowboy ,
@SpaceCowboy@lemmy.ca avatar

The President's job isn't really to be an expert on everything, the job is more about being able to hire people who are experts.

If this was coupled with a regulation requiring social media companies to do the verification and indicate that the content is verified then most people wouldn't need to do the work to verify content (because we know they won't).

It obviously wouldn't solve every problem with deepfakes, but at least it couldn't be content claiming to be from CNN or whoever. And yes, someone editing content from trusted sources would make that content no longer trusted, but that's actually a good thing. You can edit videos to make someone look bad, you can slow them down to make a person look drunk, etc. That kind of content should not be considered trusted either.

Someone doing a reaction video going over news content or whatever could have their stuff be considered trusted, but it would be indicated as being content from the person that produced the reaction video, not as content coming from the original news source. So if you see a "news" video that has its verified source as "xXX_FlatEarthIsReal420_69_XXx" rather than CNN, AP News, NY Times, etc., you kinda know what's up.

go_go_gadget ,

We've had this discussion a lot in the Bitcoin space. People keep arguing it has to change so that "grandma can understand it", but I think that's unrealistic. Every technology has some inherent complexities that cannot be removed, and people have to learn them if they want to use it. And people will use it if the motivation is there. Wifi has some inherent complexities people have become comfortable with. People know how to look through lists of networks, find the right one, enter the passkey or go through the sign-on page. Some non-technical people know enough about how Wifi should behave to know the internet connection might be out or the router might need a reboot. None of this knowledge was commonplace 20 years ago. It is now.

The knowledge required to leverage the benefits of cryptographic signatures isn't beyond the reach of most people. The general rules are pretty simple. The industry just has to decide to make the necessary investments to motivate people.

nxdefiant ,

The number of 80-year-olds who know what cryptography is AND know that it's a proper solution here is not large. I'd expect an 80-year-old to say something like "we should only look at pictures sent by certified mail" or "you can't trust film unless it's 8mm and the can was sealed shut!"

andrew_bidlaw ,
@andrew_bidlaw@sh.itjust.works avatar

Why not just use official channels of information, e.g. a White House Mastodon instance with politicians' accounts, government-hosted and auto-mirrored by third parties?

DrCake ,

Yeah, good luck getting the general public to understand what "cryptographically verified" videos mean.

maynarkh ,

Just make it a law that if as a social media company you allow unverified videos to be posted, you don't get safe harbour protections from libel suits for that. It would clear right up. As long as the source of trust is independent of the government or even big business, it would work and be trustworthy.

General_Effort ,

Back in the day, many rulers allowed only licensed individuals to operate printing presses. It was sometimes even required that an official should read and sign off on any text before it was allowed to be printed.

Freedom of the press originally means that exactly this is not done.

FunderPants ,

Jesus, how did I get so old only to just now understand that press is not journalism, but literally the printing press in 'Freedom of the press'.

vithigar ,

You understand that there is a difference between being not permitted to produce/distribute material and being accountable for libel, yes?

"Freedom of the press" doesn't mean they should be able to print damaging falsehood without repercussion.

General_Effort ,

What makes the original comment legally problematic (IMHO), is that it is expected and intended to have a chilling effect pre-publication. Effectively, it would end internet anonymity.

It's not necessarily unconstitutional. I would have made the argument if I thought so. The point is rather that history teaches us that close control of publications is a terrible mistake.

The original comment wants to make sure that there is always someone who can be sued/punished, with obvious consequences for regime critics, whistleblowers, and the like.

Dark_Arc ,
@Dark_Arc@social.packetloss.gg avatar

We need to take history into account but I think we'd be foolish to not acknowledge the world has indeed changed.

Freedom of the press never meant that any old person could just spawn a million press shops and peddle whatever they wanted. At best the rich could, and nobody was anonymous for long at that kind of scale.

Personally I'm for publishing via proxy (i.e. an anonymous tip that a known publisher/person is responsible for) ... I'm not crazy about "anybody can write anything on any political topic and nobody can hold them accountable offline."

vithigar ,

So your suggestion is that libel, defamation, harassment, et al are just automatically dismissed when using online anonymous platforms? We can't hold the platform responsible, and we can't identify the actual offender, so whoops, no culpability?

I strongly disagree.

Supermariofan67 ,

That's not what the commenter said and I think you are knowingly misrepresenting it.

vithigar ,

I am not. And if that's not what's implied by their comments then I legitimately have no idea what they're suggesting and would appreciate an explanation.

bionicjoey ,

As long as the source of trust is independent of the government or even big business, it would work and be trustworthy

That sounds like wishful thinking

FunderPants ,

Democrats will want cryptographically verified videos, Republicans will be happy with a stamp that has trumps face on it.

https://lemmy.ca/pictrs/image/8a89f6ea-0959-45f5-a571-17c8f8b6ddef.jpeg

https://lemmy.ca/pictrs/image/e018ce17-b09a-42b2-a7e4-ac687e93dde5.jpeg

wizardbeard , (edited )
@wizardbeard@lemmy.dbzer0.com avatar

I mean, how is anyone going to cryptographically verify a video? You either have an icon in the video itself or displayed near it by the site, which means nothing - fakers just copy it into theirs. Alternatively you have to sign or make file hashes for each permutation of the video file sent out. At that point, how are normal people actually going to verify? At best they're trusting the video player of whatever site they're on to be truthful when it says that it's verified.

Saying they want to do this is one thing, but as far as I'm aware, we don't have a solution that accounts for the rampant re-use of presidential videos in news and secondary reporting either.

I have a terrible feeling that this would just be wasted effort beyond basic signing of the video file uploaded on the official government website, which really doesn't solve the problem for anyone who can't or won't verify the hash on their end.


Maybe some sort of visual and audio based hash, like MusicBrainz IDs for songs, which are independent of the file itself and based instead on the sound of it. Then the government runs a server kind of like a PGP key server, and websites could integrate functionality to verify against it. But at the end of the day it still works out to an "I swear we're legit, guys" stamp for anyone not technical enough to verify independently themselves.


I guess your post just seemed silly when the end result of this for anyone is effectively the equivalent of your "signed by trump" image, unless the public magically gets serious about downloading and verifying everything themselves independently.

Fuck trump, but there are much better ways to shit on king cheeto than pretending the average populace is anything but average based purely on political alignment.

You have to realize that to the average user, any site serving videos seems as trustworthy as youtube. Average internet literacy is absolutely fucking abysmal.

beefontoast ,

In the end people will realise they cannot trust any media served to them. But it's just going to take time for people to realise... and while they are still blindly consuming it, they will be taken advantage of.

If it goes this road... Social media could be completely undermined. It could become the downfall of these platforms and do everyone a favour by giving them their lives back after endless doom scrolling for years.

technojamin ,

People aren’t going to do it, the platforms that 95% of people use (Facebook, Tik Tok, YouTube, Instagram) will have to add the functionality to their video players/posts. That’s the only way anything like this could be implemented by the 2024 US election.

Strykker ,

Do it basically the same way TLS verification works. Sure, the browsers would have to add something to the UI to support it, but claiming you can't trust that is dumb, because we already rely on it to trust that the site you're on is your bank and not some scammer.

Sure, not everyone is going to care to check, but the check being there allows people who do care to reply back saying the video is faked because of X.

patatahooligan ,
@patatahooligan@lemmy.world avatar

The general public doesn't have to understand anything about how it works as long as they get a clear "verified by ..." statement in the UI.

kandoh ,

The problem is that even if you reveal the video as fake, the feeling it reinforces in the viewer stays with them.

"Sure, that was fake, but the fact that it seems believable tells you everything you need to know."

go_go_gadget , (edited )

"Herd immunity" comes into play here. If those people keep getting dismissed by most other people because the video isn't signed they'll give up and follow the crowd. Culture is incredibly powerful.

BradleyUffner ,

It could work the same way the padlock icon worked for SSL sites in browsers back in the day. The video player checks the signature and displays the trusted icon.

Natanael ,

It needs to focus on showing who published it, not the icon

makeasnek ,
@makeasnek@lemmy.ml avatar

"Not everybody will use it and it's not 100% perfect so let's not try"

NateNate60 ,

That's not the point. It's that malicious actors could easily exploit that lack of knowledge to trick users into giving fake videos more credibility.

If I were a malicious actor, I'd put the words "✅ Verified cryptographically by the White House" at the bottom of my posts and you can probably understand that the people most vulnerable to misinformation would probably believe it.

JasSmith ,

This doesn’t solve anything. The White House will only authenticate videos which make the President look good. Curated and carefully edited PR. Maybe the occasional press conference. The vast majority of content will not be authenticated. If anything this makes the problem worse, as it will give the President remit to claim videos which make them look bad are not authenticated and should therefore be distrusted.

ours ,

I don't understand your concern. Either it'll be signed White House footage or it won't. They have to sign all their footage otherwise there's no point to this. If it looks bad, don't release it.

maynarkh ,

The point is that if someone catches the President shagging kids, of course that footage won't be authenticated by the WH. We need a tool so that a genuine piece of footage of the Pres shagging kids would be authenticated, but a deepfake of the same would not. The WH is not a good arbiter since they are not independent.

ours ,

But we are talking about official WH videos. Start signing those.

If it's not from the WH, it isn't signed. Or perhaps it's signed by whatever media company is behind its production or maybe they've verified the video and its source enough to sign it. So maybe, let's say the Washington Post can publish some compromising video of the President but it still has certain accountability as opposed to some completely random Internet video.

brbposting ,

Politicians and anyone at deepfake risk wear a digital pendant at all times. The pendant displays continually rotating time-based codes. People record themselves using video hardware which cryptographically signs its output.

Only a law/Big 4 firm can extract video from the official camera (which has a twin for hot swapping).
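
The rotating codes themselves would presumably be something like standard TOTP, along these lines (a bare-bones sketch of that construction; the shared secret is a placeholder, and as the reply below points out, such a code by itself doesn't bind to what's actually in the frame):

```python
# Sketch: HMAC-based time-rotating code (TOTP-style, RFC 6238 defaults: SHA-1, 30 s window).
# The shared secret is illustrative; this is the standard construction, not a pendant spec.
import hmac, hashlib, struct, time

def rotating_code(secret: bytes, period: int = 30, digits: int = 6) -> str:
    counter = int(time.time()) // period                      # current time step
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                                   # dynamic truncation
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

print(rotating_code(b"pendant-shared-secret"))
```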

Natanael ,

Codes which don't embed any information about what you're saying or doing can be copied over to faked images.

In theory you could have such a pendant record your voice, etc., and continuously emit signatures for compressed versions of your speech (or a signed speech-to-text transcript).

JasSmith ,

Then this exercise is a waste of time. All the hard hitting journalism which presses the President and elicits a negative response will be unsigned, and will be distributed across social media as it is today: without authentication. All the videos for which the White House is concerned about authenticity will continue to circulate without any cause for contention.

cynar ,

It needs to be more general. A video should have multiple signatures. Each signature relies on the signer's reputation, which works both ways. It won't help those who don't care about their reputation, but will for those that do.

A photographer who passes off a fake photo as real will have their reputation hit, if they are caught out. The paper that published it will also take a hit. It's therefore in the paper's interest to figure out how trustworthy the supplier is.

I believe Canon recently announced a camera that cryptographically signs photographs at the point of creation. At that point, the photographer can prove the camera, the editor can prove the photographer, the paper can prove the editor, and the reader can prove the newspaper. If done right, the final viewer can also prove the whole chain, semi-independently. It won't be perfect (far from it) but it might be the best we'll get. Each party wants to protect their reputation, and so has a vested interest in catching fraud.

For this to work, we need a reliable way to sign images multiple times, as well as (optionally) encode an edit history into it. We also need a quick way to match cryptographic keys to a public key.

An option to upload a time-stamped key to a trusted 3rd party would also be of significant benefit. Ironically, blockchain might actually be a good use for this, in case a trusted 3rd party can't be established.
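
As a rough sketch of the multiple-signature idea (keys and parties are illustrative, and a real scheme like C2PA carries this in the file's metadata rather than as bare bytes), each link in the chain signs the content hash plus the previous signature, so a viewer can replay the whole chain of custody:

```python
# Sketch: camera -> photographer -> paper chain of custody over one photo hash.
# Everything here is illustrative; assumes the `cryptography` package.
import hashlib
from cryptography.hazmat.primitives.asymmetric import ed25519

photo_hash = hashlib.sha256(b"raw photo bytes").digest()

camera = ed25519.Ed25519PrivateKey.generate()
photographer = ed25519.Ed25519PrivateKey.generate()
paper = ed25519.Ed25519PrivateKey.generate()

sig_camera = camera.sign(photo_hash)
sig_photographer = photographer.sign(photo_hash + sig_camera)
sig_paper = paper.sign(photo_hash + sig_photographer)

# Verification replays the chain using the published public keys
# (verify() raises InvalidSignature if any link was tampered with):
camera.public_key().verify(sig_camera, photo_hash)
photographer.public_key().verify(sig_photographer, photo_hash + sig_camera)
paper.public_key().verify(sig_paper, photo_hash + sig_photographer)
print("chain of custody verifies")
```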

JasSmith ,

Great points and I agree. I also think the signature needs to be built into the stream in a continuous fashion so that snippets can still be authenticated.

cynar ,

Agreed. Embed a per-frame signature into every keyframe when encoding. Also include the video file timestamp. This will mean any clip longer than around 1 second will include at least 1 signed frame.

Natanael ,

Merkle tree hashes exist for this purpose.

Note that videos use "keyframes", so you can't extract arbitrary frames in isolation; you need to pull multiple frames if the one you're snapshotting isn't a keyframe itself.
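
A minimal sketch of the Merkle root construction (frame bytes are placeholders): hash every frame, then repeatedly hash pairs upward. One signature over the root can then vouch for any individual frame together with a short inclusion proof, instead of signing every frame separately.

```python
# Sketch: compute a Merkle root over per-frame hashes; sign the root once.
import hashlib

def merkle_root(leaves: list[bytes]) -> bytes:
    level = [hashlib.sha256(leaf).digest() for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                       # duplicate last node on odd levels
            level.append(level[-1])
        level = [hashlib.sha256(level[i] + level[i + 1]).digest()
                 for i in range(0, len(level), 2)]
    return level[0]

frames = [f"frame {i} bytes".encode() for i in range(5)]   # placeholder frame data
print(merkle_root(frames).hex())                            # this is what gets signed
```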

General_Effort ,

I don't think that's practical or particularly desirable.

Today, when you buy something, EG a phone, the brand guarantees the quality of the product, and the seller guarantees the logistics chain (that it's unused, not stolen, not faked, not damaged in transport, ...). The typical buyer does not care about the parts used, the assembly factory, etc.

When a news source publishes media, they vouch for it. That's what they are paid for (as it were). If the final viewer is expected to check the chain, they are asked to do the job of skilled professionals for free. Do-your-own-research rarely works out, even for well-educated people. Besides, in important cases, the whole chain will not be public to protect sources.

cynar ,

It wouldn't be intended for day-to-day use. It's intended as an audit trail/chain of custody. Think of it more akin to a git history. As a user, you generally don't care; however, it can be excellent for retrospective analysis when someone/something does screw up.

You would obviously be able to strip it out, but having it as a default would be helpful with openness.

LarmyOfLone ,

I've thought about this too but I'm not sure this would work. First you could hack the firmware of a cryptographically signed camera. I already read something about a camera like this that was hacked and the private key leaked. You could have an individual key for each camera and then revoke it maybe.

But you could also photograph a monitor or something like that, like a specifically altered camera lens.

Ultimately you'd probably need something like quantum entangled photon encoding to prove that the photons captured by the sensor were real photons and not fake photons. Like capturing a light field or capturing a spectrum of photons. Not sure if that is even remotely possible but it sounds cool haha.

Natanael ,

Look up transparency logs for that last part, it's already used for TLS certificates

BrianTheeBiscuiteer ,

Anyone can digitally sign anything (maybe not easily or for free). The White House can verify or not verify whatever they choose, but if you, as a journalist let's say, want to give credence to video you distribute, you'll want to digitally sign it. If a video switches hands several times without being signed, it might as well have been cooked up by the last person that touched it.

go_go_gadget , (edited )

That's fine?

Signatures aren't meant to prove authenticity. They prove the source, which you can use to weigh the authenticity.

I think the confusion comes from the fact that cryptographic signatures are mostly used in situations where proving the source is equivalent to proving authenticity. Proving a text message is from me proves the authenticity as there's no such thing as doctoring my own text message. There's more nuance when you're using signatures to prove a source which may or may not be providing trustworthy data. But there is value in at least knowing who provided the data.

circuitfarmer ,
@circuitfarmer@lemmy.world avatar

I'm sure they do. AI regulation probably would have helped with that. I feel like Congress was busy with shit that doesn't affect anything.

ours ,

I salute whoever has the challenge of explaining basic cryptography principles to Congress.

Spendrill ,

Might just as well show a dog a card trick.

wizardbeard ,
@wizardbeard@lemmy.dbzer0.com avatar

That's why I feel like this idea is useless, even for the general population. Even with some sort of visual/audio based hashing, so that the hash is independent of minor changes like video resolution which don't change the content, and with major video sites implementing a way to verify that the hash matches one from a trustworthy keyserver equivalent...

The end result for anyone not downloading the videos and verifying it themselves is the equivalent of those old ”✅ safe ecommerce site, we swear" images. Any dedicated misinformation campaign will just fake it, and that will be enough for the people who would have believed the fake to begin with.

brbposting ,
johnyrocket ,

Should probably start out with the colour mixing one. That was very helpful for me in figuring out public key cryptography. The difficulty comes in when they feel like you are treating them like toddlers, so they start behaving more like toddlers. (Which they are 99% of the time.)

lemmyingly ,

I see no difference between creating a fake video/image with AI and Adobe's packages. So to me this isn't an AI problem, it's a problem that should have been resolved a couple of decades ago.

Deello ,

So basically Biden ads on the blockchain.

TheGrandNagus ,

...no

Think of generating an md5sum to verify that the file you downloaded online is what it should be and hasn't been corrupted during the download process or replaced in a Man in the Middle attack.
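
For anyone who hasn't done it, that check is roughly this (SHA-256 shown since MD5 is no longer considered safe for this; the filename and expected digest are placeholders):

```python
# Sketch: verify a downloaded file against a checksum published by the source.
import hashlib

expected = "<digest published alongside the download>"   # placeholder

h = hashlib.sha256()
with open("biden_speech.mp4", "rb") as f:                 # placeholder filename
    for chunk in iter(lambda: f.read(1 << 20), b""):      # hash in 1 MiB chunks
        h.update(chunk)

print("OK" if h.hexdigest() == expected else "MISMATCH: altered or corrupted")
```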

brbposting ,

generating an md5sum to verify that the file you downloaded online

https://sh.itjust.works/pictrs/image/fb788092-0889-4dcb-8500-5f0265f51f96.jpeg

Muehe ,

Cryptography ⊋ Blockchain

A blockchain is cryptography, but not all cryptography is a blockchain.

long_chicken_boat ,

what if I meet Joe and take a selfie of both of us using my phone? how will people know that my selfie is an authentic Joe Biden?

PhlubbaDubba ,

Probably a signed comment from the Double-Cone Crusader himself, basically free PR so I don't see why he or any other president wouldn't at least have an intern give you a signed comment fist bump of acknowledgement

fidodo ,

That's the big question. How will we verify anything as real?

cynar ,

Ultimately, reputation-based trust, combined with cryptographic keys, is likely the best we can do. You (semi-automatically) sign the photo and upload its stamp to a 3rd party. They can verify that they received the stamp from you, and at what time. That proves the image existed at that time, and that it's linked to your reputation. Anything more is just likely to leak, security-wise.
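
A minimal sketch of what such a "stamp" could contain (field names and the logging step are illustrative, not any existing timestamping service's format):

```python
# Sketch: build a stamp (photo hash + author signature); the 3rd-party log
# records it together with its own receipt time, supporting "earliest seen" claims.
import hashlib, json, time
from cryptography.hazmat.primitives.asymmetric import ed25519

author_key = ed25519.Ed25519PrivateKey.generate()          # the photographer's key

photo_hash = hashlib.sha256(b"photo bytes").hexdigest()     # placeholder photo data
stamp = {
    "photo_sha256": photo_hash,
    "signature": author_key.sign(photo_hash.encode()).hex(),
}

log_entry = {**stamp, "received_at": int(time.time())}      # added by the log, not the author
print(json.dumps(log_entry, indent=2))
```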

Zehzin ,
@Zehzin@lemmy.world avatar

Official Joe Biden NFTs confirmed

Gork ,

Don't trust any key you know is malarkey!

PhlubbaDubba ,

I can totally see this becoming a thing, and I kinda wish it would, just because I love old people trying to seem like they know tech when they don't - but in the context of genuinely helpful tech stuff.

ryannathans ,

I have said for years that all media that needs to be verifiable needs to be signed. GPG signing, let's gooo

NateNate60 ,

Very few people understand why a GPG signature is reliable or how to check it. Malicious actors will add a "GPG Signed" watermark to their fake videos and call it a day, and 90% of victims will believe it.

optissima ,
@optissima@lemmy.world avatar

As soon as VLC adds the gpg sig feature, it's over.

TheKingBee ,
@TheKingBee@lemmy.world avatar

And that will in no way be the first step on the road to VLC deciding which videos it allows you to play...

NateNate60 ,

No, it's not. People don't use VLC to watch misinformation videos. They see it on Reddit, Facebook, YouTube, or TikTok.

QuaternionsRock ,

…how popular do you think VLC is among those who don’t understand cryptographic signatures?

PhlubbaDubba ,

Yeah, but all it takes is proving it doesn't have the right signature and you can make the social media corpo take down every piece of media with that signature, just for that alone.

What's even better is that you can attack entities that try to maliciously let people get away with misusing their look and fake being signed for failing to defend their IP, basically declaring you intend to take them to court to Public Domainify literally everything that makes them any money at all.

If billionaires were willing to allow disinformation as a service then they wouldn't have gone to war against news as a service to make it profitable to begin with.

captain_aggravated ,
@captain_aggravated@sh.itjust.works avatar

I just mentioned this in another comment tonight; cryptographic verification has existed for years but basically no one has adopted it for anything. Some people still seem to think pasting an image of your handwriting on a document is "signing" a document somehow.

ryannathans ,

Still trying to get people to sign their emails lol

captain_aggravated ,
@captain_aggravated@sh.itjust.works avatar

I mean, part of it is PGP is the exact opposite of streamlined and you've got to be NSA levels of paranoid to bother with it.

ryannathans ,

It's automated in all mainstream email clients, you don't even have to think about it if a contact has it set up

NateNate60 ,

if a contact has it set up

Well, there's your problem.

The most commonly-used mail client in the world is the Gmail web client which does not support it. Uploading your PGP key to Gmail and having them store it server-side for use in a webmail client is obviously problematic from a security standpoint. Number 2 I would guess is Outlook, which appears also not to support it. For most people, I don't think they understand the value of cryptographically signing emails and going through the hassle of generating and publishing their PGP keys, especially since Windows has no built-in easy application for generating and managing such keys.

There's also the case that for most people, signing their emails provides absolutely no immediate benefit to them.

captain_aggravated ,
@captain_aggravated@sh.itjust.works avatar

Plus that's email. What about... Literally everything else?

NateNate60 ,

Yeah, almost nothing has good PGP integration.

Except Git, apparently.

wizardbeard ,
@wizardbeard@lemmy.dbzer0.com avatar

It doesn't help that in a lot of cases, this is actually accepted by a shit ton of important institutions that should be better, but aren't.

bionicjoey ,

The average Joe won't know what any of what you just said means. Hell, the Joe in the OP doesn't know what any of you just said means. There's no way (IMO) of simultaneously creating a cryptographic assurance and having it be accessible to the layman.

NateNate60 ,

There is, but only if you can implement a layer of abstraction and get them to trust that layer of abstraction.

Few laymen understand why Bitcoin is secure. They just trust that their wallet software works and because they were told by smarter people that it is secure.

Few laymen understand why TLS is secure. They just trust that their browser tells them it is secure.

Few laymen understand why biometric authentication on their phone apps is secure. They just trust that their device tells them it is secure.

bionicjoey ,

Each of those perfectly illustrates the problem with adding in a layer of abstraction though:

Bitcoin is a perfect example of the problem. Since almost nobody understands how it works, they keep their coins in an exchange instead of a wallet and have completely defeated the point of cryptocurrency in the first place by reintroducing blind trust into the system.

Similarly, the TLS ecosystem is problematic. Because even though it is theoretically supposed to verify the identity of the other party, most people aren't savvy enough to check the name on the cert and instead just trust that if their browser doesn't warn them, they must be okay. Blind trust once again is introduced alongside the necessary abstraction layers needed to make cryptography palatable to the masses.

Lastly, people have put so much trust in the face scanning biometrics to wake their phone that they don't realize they may have given their face to a facial recognition company who will use it to help bring about the cyberpunk dystopia that we are all moving toward.

Aurenkin ,

I think this is a great idea. Hopefully it becomes the standard soon, cryptographically signing clips or parts of clips so there's no doubt as to the original source.
