Bill Gates says the massive power draw required for AI processing is nothing to worry about as AI will ultimately identify ways to help cut power consumption and drive the transition to sustainable energy.
Climate scientists: "do these things to fix climate change"
Everyone: "but that's HAAARD and I don't wanna!"
AI developers: create AI
Climate scientists: "AI is drawing massive power and accelerating climate change, we need to stop that"
Everyone: "but it can tell us how to fix climate change so it's going to be okay!"
AI climate model: "do these same things to fix climate change"
Everyone: "but that's HAAARD and I don't wanna!"
Yeah, I can't see any way this could possibly fail...
This screams FAITH (Filthy Assumptions Instead of THinking) from a distance, on multiple levels:
Assuming that the current machine learning development will lead to artificial general intelligence. Will it?
Assuming that said AGI would appear in time to reduce power consumption. Will it?
Assuming that lowering the future power consumption will be enough to address issues caused by the current power consumption. Will it?
Assuming that addressing issues from a distant future means that the whole process won't cause harm for people in a nearer future. Will it?
Furthermore, Gates in the quote is being disingenuous:
"Let's not go overboard on this," he said. "Datacenters are, in the most extreme case, a 6 percent addition [to the energy load] but probably only 2 to 2.5 percent. The question is, will AI accelerate a more than 6 percent reduction? And the answer is: certainly," Gates said.
The answer addresses something far, far more specific than the main issue.
If I may, here's my alternative solution for the problem, in the same style as Gates':
Kill everyone between the North Pole and the Equator.
What do you mean, it would kill 85% of the people in the world? Well, you can't make an omelet without breaking some eggs, right? Nobody I know personally lives there, so Not My Problem®. (Just keep Japan, I need my anime to watch.)
I'm clearly being sarcastic to make a point here: it's trivially easy to underestimate issues affecting humankind, and the problems created by their solutions, if you are not directly affected by either. Gates is a billionaire bubbled up among other rich people; this sort of problem will hit the poor first, since the rich can simply throw enough money at their problems to make them go away.
Even as our species destroys its only home, we assume that the solutions to climate change must lie in technology, without stopping to examine the role that this very attitude has played in the crisis.
This is so deeply ingrained in our social consciousness that, when there is a new impressive technology, we assume that it must be here to solve one of our big problems. As the AI hype quickens the pace of our ecological devastation, we're so dazzled by the technology that there is actual debate in supposedly serious publications as to whether AI is going to save us from climate change, despite all evidence pointing to the contrary.
Oxford University had previously secured funding from the UK government to develop its vaccine on the understanding that they'd open-source it, so that poorer countries would have greater vaccine access and the rollout could be faster. Then Gates stepped in and convinced them to sign an exclusive deal with AstraZeneca instead.
Bill Gates says the massive power draw required for AI processing is nothing to worry about as AI will ultimately identify ways to help cut power consumption and drive the transition to sustainable energy.
The final solution the AI comes up with: Cut the power of the poor, euthanize the old and weak.
I hate that they decided to have Morpheus hold up a battery instead of a processor because some empty suit thought audiences were too stupid to get it.
Didn't it also have something to do with a brand deal? Like the suit got extra funding for the movie by making a deal with Duracell to have their batteries in the movie or something.
The whole thing never made much sense anyway; machines would be without scruples and would cut off any redundancies like extra limbs. They'd probably just keep your brain in a jar.
Well, perhaps that process would be more difficult and resource-intensive in this hypothetical scenario, so it would be much easier and less hassle to just keep the bodies alive?
I wonder if all this is just burning enough energy to make ignorant people believe that we have AI, and then using that AI to justify the existing order of things, the same way the "social contract" is used. Not really a technical project, but a very big and expensive propaganda campaign for abolishing democracies.
Wow, that is so dumb. I saw some crackpot trying to solve unsolved physics problems with prompts like "imagine you are Einstein, then how would you solve: ...". Good to see he's not alone, but has Bill fucking Gates for company with similarly dumb AI takes.
This is even dumber when even Joanne f-g Rowling, in her children's books about magic, described how and why magic can't do this. One of the reasons I like Harry Potter: not for the plot or the human part, but because the magic there is quite similar to computers in our time, with similar limitations, except for unique cases.
So no matter how much one hates Rowling (I don't, she's still done far more good than evil), she's smarter and more decent than most of humanity. That sucks.
Lol. She lucked into an amazing world that managed to remain a good story despite her writing, not because of it. She's not an idiot, but literally every other piece of writing she's ever put out kinda slams the "smarter than most of humanity" line.
They weren't meant to be causative, and I stand by both of my statements. Her writing is objectively bad, and it's a small miracle that she didn't manage to ruin this series like everything else she's written. Yes, I know those are strong words, and yes, I do believe them.
Well, how can one speak of a thing's author, the person who built it from scratch, as someone who could "ruin" it or not?
That said, it's hard for me to read her in English; I've read HP mostly in Russian, in at least three translations, one official and two unofficial. The official one sucks, and of the unofficial two, the one that reads best is by the least professional translator (actually not a translator at all): Maria Spivak. I mean the original version that circulated on the Runet and in samizdat, not the abomination published much later.
It conveys the feeling of a mad, slightly hooligan-ish fairy tale; I suspect it's emotionally the closest to the original.
Anyway, it's pretty normal for an author to have a magnum opus and the rest of their works to just not make sense.
Dead wrong. AI is not as reliable as its makers would like to believe. AI is more likely to adopt all the flaws of humanity than to make anything "better" (a subjective term anyway).
It's a text generator. All these people, had they lived in Antiquity, would jump from ship to ship trying to visit every oracle and prophet in the Mediterranean, asking questions about the universe and seeking deep meaning in texts the length of a Chinese fortune cookie.