This isn’t new. Check out Yasha Levine’s Surveillance Valley. It’s a nice primer. Most of our internet tech was built for the military or funded by the military for military ideas (no matter what MIT or Berkeley theoreticians might try to convince you of).
No way in hell would I do that if I had that kind of knowledge. Look what happened to Snowden for doing something like that.
He would still spend the remainder of his life in federal prison, or be executed, if he ever set foot back on US soil, or on the soil of any country with an extradition treaty looking to score some brownie points.
That wouldn't happen to all of them, but I bet you there are some working on classified projects who would be found out and made an example of in short order, to shut the others up.
In 2017, I played a part in the successful #CancelMaven campaign that got Google to end its participation in Project Maven, a contract with the US Department of Defense to equip US military drones with artificial intelligence.
Today a similar movement, organized under the banner of the coalition No Tech for Apartheid, is targeting Project Nimbus, a joint contract between Google and Amazon to provide cloud computing infrastructure and AI capabilities to the Israeli government and military.
If a strategically placed insider released information not otherwise known to the public about the Nimbus project, it could really increase the pressure on management to rethink its decision to get into bed with a military that’s currently overseeing mass killings of women and children.
It certainly wasn’t a spontaneous response to an op-ed, and I don’t presume to advise anyone currently at Google (or Amazon, Microsoft, Palantir, Anduril, or any of the growing list of companies peddling AI to militaries) to follow my example.
Back then, the company responded to our actions by defending the nature of the contract, insisting that its Project Maven work was strictly for reconnaissance and not for weapons targeting—conceding implicitly that helping to target drone strikes would be a bad thing.
Today it maintains that the work it is doing as part of Project Nimbus “is not directed at highly sensitive, classified, or military workloads relevant to weapons or intelligence services.” At the same time, it asserts that there is no room for politics at the workplace and has fired those demanding transparency and accountability.
I for one vote that everyone just be done with public internet media and stick to outside, known-person-to-known-person interaction, plus information from reputable news sources that vet their sources and maintain a high standard of accuracy.
It's noble how many of you are willing to get philosophical about the rise of deep fakes freeing us from puritan beliefs and readdressing the concept of truth.
While completely fucking ignoring the harassment and extortion enabled by deep fakes. Y'all want to get high-minded about YOUR right to free speech using OTHER people's bodies as a gateway to some utopia, while playing dumb that this is just another form of misogynistic abuse. If it truly is just something you are doing in the privacy of your own home, why the fuck do you need other people's media?
Your ideals are built upon YET AGAIN women taking one for the team. The "truth" is impossible to know, so YOLO, let's turn any woman who made the mistake of being photographed into porn. Her consent doesn't matter between the privacy of me and my dataset, even if I do upload it and blackmail her a lil'.
I had anticipated that there would be an uptick in cryptographic signing to combat the problem now that this sort of fakery has become ubiquitous, which in my mind would assure the recipient of a file that
A) the file is unaltered since the date/time of the signing, and
B) the file was created by the named photographer or videographer.
This is not proof of authenticity but with a verifiable source, the file recipient could at least judge for themselves based on the reputation of the file creator (say, a notable AP photojournalist vs. some random schmoe).
Thus far, whenever I have raised this idea in a public forum, it has met with silence or even derision. What am I missing?
It takes an extra 2 minutes; that's why it's dead in the water. People go, "Who would spend X amount of time to deepfake me, that I should spend an extra two minutes assuring integrity?" And for most of the population, they are probably right.
I suppose you are correct, but it seems like a standard could be adopted to automate the process for both the creator and the consumer. And while the system would work for any creator, it seems most important to be able to ensure the integrity of the work product of journalistic professionals.
You seem to be conflating NFTs and digital signatures. Any file can be signed, unrelated to any sort of blockchain technology. See PGP and related tools for more information.
Why should I trust the authenticity of your signing key?
Solution 1: a web of trust, like PGP (impossible for content from outside your trust network)
Solution 2: trusted certificate authorities (private / state / UN)
Solution 3: a blockchain (scaling problems)
If I generate a key pair and use it to sign a file and distribute it and then I publish the public key somewhere like Facebook, any recipient of the file could be assured that the file originated from my Facebook account. A commercial certificate is not required to do this. As to whether the Facebook account holder is actually me is another problem, but hopefully major social media platforms require at least a photo ID.
Edit: Sorry, I said public certificate when I meant commercial certificate.