
Jensen Huang says kids shouldn't learn to code — they should leave it up to AI.

At the recent World Government Summit in Dubai, Nvidia CEO Jensen Huang made a counterintuitive break with tech leader wisdom by saying that programming is no longer a vital skill due to the AI revolution.

stardust ,

Kids shouldn't learn to read. They should stick to audio books.

Annoyed_Crabby ,

[Jeff Bezos likes this]

Rob ,

I don't know if that's the best example since with an audio book you're still getting the same reading material.

It's more like: kids shouldn't learn how to sing, they should just have AI sing with their voice for them. They'll never know the ins and outs of it, but they'll know what they want it to be like and describe it to the AI.

ABCDE ,

Mainly because you don't need to learn to read.

Rob ,

Makes sense, but you still have to think about the plot, climax and stuff related to the story as the AI reads it to you.

With AI, the idea is that it takes away all the thinking, and I think it will get to a point where you can ask an AI to sing a song with your voice without putting in any effort whatsoever. Or you can tell AI to sing a song about a trending topic in your voice, without knowing anything about it, and still generate clicks and ad revenue.

aluminium ,

So the Nvidia drivers will be 100% written by AI then?

jerrythegenius ,
@jerrythegenius@lemmy.world avatar

Oh dear

r00ty ,
@r00ty@kbin.life avatar

Well, he's put the writing on the wall for his own developers. So, even if it isn't AI that writes them, the quality may well go down when those that can easily do so, leave for pastures new :P

Passerby6497 ,

Might make them more stable

elshandra ,

Does that mean they can't be copyrighted anymore? I'll take it.

yildolw ,

Yes, yes, keep my labour in high demand and my salary high

bionicjoey ,

[image: clip from Linus Torvalds' Aalto University talk]
snek ,
@snek@lemmy.world avatar

What is this from?

Blemgo ,

Linus Torvalds' talk at Aalto University, specifically a segment where he talks about how hard it is to work with Nvidia when it comes to the Linux kernel.

Rob ,

Leave coding to AI? What does this look like? How does this concept work? Any examples?

Or does he mean just let AI handle everything without giving the AI any input? I am no programmer, but this just doesn't sound right to me, as a regular user.

HeavyDogFeet ,
@HeavyDogFeet@lemmy.world avatar

It makes no sense. AI tools will obviously have an impact on the development profession, but suggesting that no one should learn to code is like saying no one should learn to drive because one day cars will drive themselves. It's utter self-serving nonsense.

Rob ,

Maybe Nvidia knows something we don't. Are there any examples of using AI to help code? What is Nvidia specifically referencing that makes them say this? I ask these questions because obviously something they are seeing is making them say this. Yet all I see from AI in general are things like:

Image generation, story generation, some AI can even roleplay a specific story with you that you input, but I just can't see AI doing actual coding without oversimplifying it and making it boring and barely different from the next 'creation.'

HeavyDogFeet ,
@HeavyDogFeet@lemmy.world avatar

LLM tools can already write basic code and will likely improve a lot, but there are more reasons to learn to code than to actually do coding yourself. Even just to be able to understand and verify the shit the AI tools spit out before pushing it live.
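For example (a made-up Python sketch, not something any particular model actually produced): code like this looks plausible at a glance, but it falls over on an empty list, and you only catch that before pushing it live if you can actually read it.

```python
# Hypothetical example of plausible-looking LLM output: works on the
# happy path, quietly crashes on the edge case a reviewer should catch.
def average_rating(ratings):
    """Return the mean of a list of ratings."""
    total = 0
    for r in ratings:
        total += r
    return total / len(ratings)  # ZeroDivisionError on an empty list


# What a human who can read the code would change it to.
def average_rating_checked(ratings):
    if not ratings:
        return 0.0
    return sum(ratings) / len(ratings)
```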

Nvidia knows that the more people who use AI tools, the more their hardware sells. They benefit directly from people not being able to code themselves or relying more on AI tools.

They want their magic line to keep going up. That’s all.

Rob ,

How does the use of AI tools specifically sell Nvidia hardware? It could help sell other hardware as well. They don't specify any particular AI software that might be exclusive to Nvidia hardware or anything like that. On the surface that doesn't make much sense to me.

HeavyDogFeet ,
@HeavyDogFeet@lemmy.world avatar

Nvidia makes some of the best hardware for training AI models. Increased investment in AI will inevitably increase demand for Nvidia hardware. It may boost other hardware makers too, but Nvidia is getting the biggest boost by far.

Maybe I’m being dumb or missing something but this feels incredibly obvious to me.

rambaroo ,

LLMs do most of their processing on GPUs using platforms like CUDA, which is an Nvidia product. Nvidia stands to make a lot of money off of CUDA and ML hardware.
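As a rough sketch of what that looks like in practice (assuming a standard PyTorch install; the sizes here are made up): most ML code ends up picking a device like this, and the fast path only exists when there's an Nvidia GPU with CUDA underneath.

```python
# Minimal sketch: a typical PyTorch workload runs on CUDA when an
# Nvidia GPU is present, and falls back to the much slower CPU otherwise.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A toy matrix multiply standing in for real model layers; at LLM scale,
# this gap between GPU and CPU is exactly what sells Nvidia hardware.
a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)
c = a @ b
print(f"ran on {device}, result shape {tuple(c.shape)}")
```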

snek ,
@snek@lemmy.world avatar

Money is making them say this.

r00ty ,
@r00ty@kbin.life avatar

I think my take is, he might be right. That is, by the time kids become adults we may have AGI, and we'll either be enslaved or have much less work to do (for better or worse).

But AI as it is now, relies on input from humans. When left to take their own output as input, they go full Alabama (sorry Alabamites) with their output pretty quickly. Currently, they work as a tool in tandem with a human that knows what they're doing. If we don't make a leap from this current iteration of AI, then he'll be very very wrong.

bionicjoey ,

If you think AGI is anywhere close to what we have now, you haven't been using any critical thinking skills when interacting with language models.

r00ty ,
@r00ty@kbin.life avatar

I don't. We're talking about the next generation of people here. Do pay attention at the back.

bionicjoey ,

Okay but what I'm saying is that AGI isn't the logical progression of anything we have currently. So there's no reason to assume it will be here in one generation.

r00ty ,
@r00ty@kbin.life avatar

I'd tend to agree. I said we may have that, and then he might have a point. But, if we don't, he'll be wrong because current LLMs aren't going to (I think at least) get past the limitations and cannot create anything close to original content if left to feed on their own output.

I don't think it's easy to say what will be the situation in 15-20 years. The current generation of AI is moving ridiculously fast. Can we sidestep to AGI? I don't know the answer, probably people doing more work in this area have a better idea. I just know on this subject it's best not to write anything off.

bionicjoey ,

The current generation of AI is moving ridiculously fast.

You're missing my point. My point is that the current "AI" has nothing to do with AGI. It's an extension of mathematical and computer science theory that has existed for decades. There is no logical link between the machine learning models of today and true AGI. One has nothing to do with the other. To call it AI at all is actually quite misleading.

Why would we plan for something if we have no idea what the time horizon is? It's like saying "we may have a Mars colony in the next generation, so we don't need to teach kids geography"

r00ty ,
@r00ty@kbin.life avatar

Why would we plan for something if we have no idea what the time horizon is? It’s like saying “we may have a Mars colony in the next generation, so we don’t need to teach kids geography”

Well, I think this is the point being made quite a bit in this thread. It's general business level hyperbole, really. Just to get a headline and attention (and it seems to have worked). No-one really knows at which point all of our jobs will be taken over.

My point is that in general, the current AI models and diffusion techniques are moving forward at quite the rate. But, I did specify that AGI would be a sidestep out of the current rail. I think that there's now weight (and money) behind AI and pushing forward AGI research. Things moving faster in one lane right now can push investment into other lanes and areas of research. AI is the buzzword every company wants a piece of.

I'm not as confident as Mr Nvidia is, but with this kind of money behind it, AGI does have a shot of happening.

In terms of advice regarding training for software development, though: what I think is for sure is that the current LLMs and offshoots of these techniques will keep developing, better frameworks for businesses to train them on their own material will become commonplace, and I think one of the biggest consultancy growth areas will be in producing private models for software (and other) companies.

The net effect of that is that they will just want fewer (better) engineers to make use of the AI to produce more with fewer people. So, even without AGI, the demand for software developers and engineers is going to be lower, I think. So is it as favourable an industry to train for now as it was for previous generations? Quite possibly it's not.

OrangeCorvus ,
@OrangeCorvus@lemmy.world avatar

So this movie is going to become a reality a lot sooner? 1-2 generations max, yikes.

https://www.youtube.com/watch?v=6lai9QhBibk

cloudless ,
@cloudless@feddit.uk avatar

This is a good case study of how self-interest blinds simple logic.

janAkali ,

It's a marketing stunt, not a logic problem.

cloudless ,
@cloudless@feddit.uk avatar

Why not both?

Vipsu , (edited )
@Vipsu@lemmy.world avatar

While large language models are impressive, they still seem to lack the ability to actually reason, which is quite important for a programmer. Another thing they lack is human-like intuition, which allows us to seek solutions to problems with limited knowledge or without any existing solutions.

With the boom bringing a lot more money and attention to AI, the reasoning abilities will probably improve, but until they're good enough we'll need people who can actually understand code. Once they are good enough, we won't really need people like Jensen Huang either, since robots will be able to do whatever he does, but better.

snek ,
@snek@lemmy.world avatar

GPT4 (the preview) still produces code where it adds variables that it never uses anywhere... and when I asked one time about one variable, it was like, "Oh, you're right, let me re-write the code to put variable X into use", then just added it in a nonsensical location to serve a nonsensical purpose.
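Something in the spirit of this (a toy reconstruction, not the actual transcript): a variable gets declared, never used, and when questioned, the "fix" just wedges it in somewhere it serves no purpose.

```python
# Toy reconstruction of the behaviour described above (hypothetical code).
import urllib.request

def fetch_page(url):
    retries = 3  # declared, then never used anywhere
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8")


# The "rewrite to put variable X into use": now it is technically used,
# but in a nonsensical location for a nonsensical purpose.
def fetch_page_rewritten(url):
    retries = 3
    print(f"retries = {retries}")  # serves no real purpose
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8")
```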

gerryflap ,
@gerryflap@feddit.nl avatar

Bullshit. Even if AI were to fully replace us software developers (which I highly doubt), programming is still a very useful skill to learn, just for the problem-solving skills.

Arkaelus ,

Do you want Warhammer 40k? Because this is how you get Warhammer 40k.

Riccosuave ,
@Riccosuave@lemmy.world avatar

Don't forget Dune! Frank Herbert invented the no-code dystopia, courtesy of the Butlerian Jihad™

Arkaelus ,

Hey, the Spice must flow!:))

snapoff ,

Bless the maker

snek ,
@snek@lemmy.world avatar

And I say I don't even know this person and he should just stfu and leave those kids alone.

VaultBoyNewVegas ,

He's the CEO of Nvidia, one of the largest GPU manufacturers in the world and also a trillion-dollar company.

snek ,
@snek@lemmy.world avatar

Good for him. I like Nvidia and use one, but I have the rest of his company to thank for that.

I think for me it was a combination of:

< Name of person I don't know > says < big unhinged sweeping generalization > for < reason that makes no sense to anyone in the field >

My first instinct is not to click stuff like this altogether. I also think that anyone trying to preach what kids should or shouldn't do is already in the wrong automatically by assuming they have any say in this without a degree in pedagogy.

dojan ,
@dojan@lemmy.world avatar

He’s also obviously biased since the more people use LLMs and the like the more money he gets.

It’s a bit like “lions think gazelles should be kept in their enclosure”.

resetbypeer ,

And who will code the code for the ML/AI models? I mean, for junior developers this is going to be a better way to learn than "did you Google it?", with maybe more precise answers to your questions. But it sounds more to me like "maybe you should buy more of our silicon".

Sounds a bit like the "640kb is more than enough" one-liner. But let's see what it will bring.

empireOfLove2 ,
@empireOfLove2@lemmy.dbzer0.com avatar

But it sounds more to me like “maybe you should buy more of our silicon”.

gotta drum up that infinite demand to meet and grow their insane valuation bubble when they already can't even produce enough to fill all orders.

PhlubbaDubba ,

Even if AI were able to be trusted, you still need to know the material to know what you're even asking the AI for.

It's a ruler to guide the pencil, not the pencil drawing a straight line itself. You still have to know how to draw to be able to use it in a way that fits what you want to do.
