
vulcanpost.com

CarbonatedPastaSauce , to Technology in End of coding? Microsoft framework makes devs AI supervisors

I write automation code for devops stuff. I’ve tried to use ChatGPT several times for code, and it has never produced anything of even mild complexity that would work without modification. It loves to hallucinate functions, methods, and parameters that don’t exist.

It’s very good for helping point you in the right direction, especially for people just learning. But at the level it’s at now (and all the articles saying we’re already seeing diminishing returns with LLMs) it won’t be replacing any but the worst coders out there any time soon.
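
A hedged illustration of that failure mode (the commented-out method below is invented, which is exactly the point):

```python
from datetime import datetime

# Typical hallucination: a plausible-sounding method that doesn't exist.
# ts = datetime.strptime_iso("2024-03-01")   # AttributeError: no such method

# What actually exists:
ts = datetime.fromisoformat("2024-03-01")
print(ts)
```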

tal ,
@tal@lemmy.today avatar

I can believe that they manage to get useful general code out of an AI, but I don't think that it's gonna be as simple as just training an LLM on English-code mapping. Like, part of the job is gonna be identifying edge conditions, and that can't be just derived from the English alone; or from a lot of other code. It has to have some kind of deep understanding of the subject matter on which it's working.

Might be able to find limited-domain tasks where you can use an LLM.

But I think that a general solution will require not just knowing the English task description and a lot of code. An AI has to independently know something about the problem space for which it's writing code.

Cryan24 ,

It's good for doing the boilerplate code for you, but that's about it. You still need a human to do the thinking on the hard stuff.

QuadratureSurfer ,
@QuadratureSurfer@lemmy.world avatar

It's great for pseudocode. But I prefer to use a local LLM that's been fine-tuned for coding. It doesn't seem to hallucinate functions/methods/parameters anywhere near as much as when I was using ChatGPT... but admittedly I haven't used ChatGPT for coding in a while.

I don't ask it to solve the entire problem, I mostly just work with it to come up with bits of code here and there. Basically, it can partially replace stack overflow. It can save time for some cases for sure, but companies are severely overestimating LLMs if they think they can replace coders with it in its current state.

Pantherina ,

Could you recommend one?

QuadratureSurfer ,
@QuadratureSurfer@lemmy.world avatar

I use this model for coding:
https://huggingface.co/TheBloke/dolphin-2.5-mixtral-8x7b-GGUF
I would recommend the one with the Q5_K_M quant method if you can fit it.
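
If it helps, a minimal sketch of running that GGUF quant locally, assuming llama-cpp-python is installed and the file has been downloaded (the path, context size, and prompt are placeholders):

```python
from llama_cpp import Llama

# Load the quantized model; n_ctx sets the context window size.
llm = Llama(
    model_path="./dolphin-2.5-mixtral-8x7b.Q5_K_M.gguf",  # downloaded from the HF repo above
    n_ctx=4096,
)

out = llm(
    "Write a Python function that reverses the words in a sentence.",
    max_tokens=256,
)
print(out["choices"][0]["text"])
```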

TimeSquirrel , (edited )
@TimeSquirrel@kbin.social avatar

Context-aware AI is where it's at. One that's integrated into your IDE and can see your entire codebase and offer suggestions with functions and variables that actually match the ones in your libraries. GitHub Copilot does this.

Once the codebase gets large enough, a lot of times you can just write out a comment and suddenly you'll have a completely functional code block pop up underneath it, and you hit "tab" to accept it and move on. It's a very sophisticated autocomplete. It removes tediousness and lets you focus on logic.
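
A sketch of that flow (the comment line is what you type; the function is the kind of block that pops up, a hypothetical example rather than actual Copilot output):

```python
# group a list of file paths by their extension
from collections import defaultdict
from pathlib import Path

def group_by_extension(paths: list[str]) -> dict[str, list[str]]:
    groups: dict[str, list[str]] = defaultdict(list)
    for p in paths:
        groups[Path(p).suffix or "<none>"].append(p)
    return dict(groups)
```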

7heo ,

The thing is, devops is pretty complex and pretty diverse. You've got at least 6 different solutions among the popular ones.

Last time I checked only the list of available provisioning software, I counted 22.

Sure, some, like cdist, are pretty niche. But still, when you apply to a company, even though the platform is going to be AWS (mostly), Azure, GCE, Oracle, or some run-of-the-mill VPS provider with extended cloud features (a simili-S3 based on MinIO, "cloud LAN", etc.), and you are likely going to use Terraform for host provisioning, the most relevant thing to check is which software they use. Packer? Or dynamic provisioning like Chef? Puppet? Ansible? Salt? Or one of the "lesser ones"?

And the thing is, even across successive versions of compatible stacks, the DSL evolved, and the way things are supposed to be done changed. For example, before Hiera, Puppet was an entirely different beast.

And that's not even throwing Docker (or rkt, appc) into the mix. Then you have k8s, Podman, Helm, etc.

The entire ecosystem has considerable overlap too.

So, on one hand, you have pretty clean and usable code snippets on Stack Overflow, GitHub gists, etc. So much so that tools like that emerged... And then, the very second LLMs were able to produce any moderately usable output, they were trained on that data.

And on the other hand, you have devops. An ecosystem with no clear boundaries, no clear organisation, not much maturity yet (in spite of the industry being more than a decade old), and so organic that keeping up with developments is a full time job on its own. There's no chance in hell LLMs can be properly trained on that dataset before it cools down. Not a chance. Never gonna happen.

AdamEatsAss , to Technology in End of coding? Microsoft framework makes devs AI supervisors

Lol. Humans are just moving up on the stack. I'm sure some people were upset about how we wouldn't need electrical engineers anymore once digital circuits were invented. AI is a tool, without a trained user a tool is almost useless.

abhibeckert ,

AI is a tool, without a trained user a tool is almost useless.

Exactly. This feels a bit like the invention of the wheel to me. Suddenly some things are a lot easier than they used to be and I'm sitting here thinking "holy crap half my job is so easy now" while watching other people harp on about all the things it doesn't help with. Sure - they're right, but who cares about that? Look at all the things this tool can do.

vanderbilt ,
@vanderbilt@lemmy.world avatar

I use Claude to write plenty of the code we use, but it comes with the huge caveat that you can't blindly accept what it says. Ever hear newscasters talk about some hacker thing and wonder how they get it so wrong? Same thing with AI code sometimes. If you can code you can tell what it does wrong.

tsonfeir , to Technology in End of coding? Microsoft framework makes devs AI supervisors
@tsonfeir@lemm.ee avatar

Bugs. Bugs. Bugs.

AI is fine as an assistant, or to brainstorm ideas, but don’t let it run wild, or take control.

ptz , (edited ) to Technology in End of coding? Microsoft framework makes devs AI supervisors
@ptz@dubvee.org avatar

Is that why Windows 11 sucks so much? Like, did they just turn their codebot loose on the repo?

taanegl , to Technology in Don’t learn to code: Nvidia’s founder Jensen Huang advises a different career path

[image: Lisa Su]

"I'm coming fo dat ass, Jensen"

Lisa Su, probably.

KingThrillgore , to Technology in Don’t learn to code: Nvidia’s founder Jensen Huang advises a different career path
@KingThrillgore@lemmy.ml avatar

"Time to pull the ladder up!"

curiousaur ,

That's how this statement and the state of the industry feel. The AI tools are empowering senior engineers to be as productive as a small team, so even my company laid off all the junior engineers.

So who's coming up behind the senior engineers? Is the AI they use going to take the reins when they retire? Nope, the companies will be fucked.

ICastFist ,
@ICastFist@programming.dev avatar

Nope, the companies will be fucked.

"That's a future CEO's problem, not mine!" - Current CEO

Blackmist , to Technology in Don’t learn to code: Nvidia’s founder Jensen Huang advises a different career path

I don't think he's seen the absolute fucking drivel that most developers have been given as software specs before now.

Most people don't even know what they want, let alone be able to describe it. I've often been given a mountain of stuff, only to go back and forth with the customer to figure out what problem they're actually trying to solve, and then do it in like 3 lines of code in a way that doesn't break everything else, or tie a maintenance albatross around my neck for the next ten years.

ICastFist ,
@ICastFist@programming.dev avatar

Yesterday, I had to deal with a client that literally contradicted himself 3 times in 20 minutes, about whether a specific Date field should be obligatory or not. My boss and a colleague who were nearby started laughing once the client went away, partly because I was visibly annoyed at the indecision.

SuperSpruce , (edited ) to Technology in Don’t learn to code: Nvidia’s founder Jensen Huang advises a different career path

Good thing I'm majoring in computer engineering instead of computer science. I have a backdoor through electrical engineering.

AnxiousOtter ,

Good luck getting a job as an EE in north America without a masters degree.

Source: Also a CE, have looked at the EE job market. Bachelor's won't cut it.

SuperSpruce ,

Seems like there's a program at my university to get a master's degree a year after my Bachelor's. Sounds like I should go for it.

AnxiousOtter ,

If you want to work as an EE, ya probably.

Wooki , (edited ) to Technology in Don’t learn to code: Nvidia’s founder Jensen Huang advises a different career path

This overglorified snake oil salesman is scared.

Anyone who understands how these models work can see plain as day we have reached peak LLM. It's enshittifying itself, and we are seeing its decline in real time in the quality of generated content. Don't believe me? Go follow some senior engineers.

Michal ,

Any recommendations on whom to follow? On Mastodon?

thirteene ,

There is a reason they didn't offer specific examples. LLMs can still scale by size, logical optimization, training optimization, and more importantly integration. The current implementation is reaching its limits, but the pace of growth is also very fast. AI reduces workload, but it is likely going to require designers and validators for a long time.

Wooki ,

For sure, evidence is mounting that model size is not returning the expected quality gains. It's also had the larger net effect of enshittifying itself through negative feedback loops between training data, humans, and back to training; that decline in quality has been quantified as a large downward trend. It can only get worse as privacy, IP laws, and other regulations start coming into place. The growth this hype master is selling is pure fiction.

msage ,

But he has a lot of product to sell.

And companies will gobble it all up.

On an unrelated note, I will never own a new graphics card.

Wooki ,

Secondhand is better value; new prices right now are nothing short of price fixing. You only need to look at the size reduction in memory since the A100 was released to know what's happening to GPUs.

We need serious competition. Hopefully Intel is able to provide it, but foreign competition would be best.

msage ,

I doubt that any serious competitor will bring any change to this space. Why would it - everyone will scream 'shut up and take my money'.

Wooki ,

I think you answered your own question: money

Wooki , (edited )

The Fediverse is sadly not as popular as we would like, so sorry, can't help there. That said, I follow some researchers' blogs, and a quick search should land you some good sources depending on your field of interest.

Animated_beans ,

Why do you think we've reached peak LLM? There are so many areas with room for improvement

Wooki , (edited )

You asked a question that's already answered above. Pick your platform and you will find a lot of public research on the topic, especially for programming.

Cosmicomical , (edited ) to Technology in Don’t learn to code: Nvidia’s founder Jensen Huang advises a different career path

Just because you're the CEO of a big company, it doesn't mean you know what you're talking about. In this case it's clear he doesn't. You may say "but the company makes a lot of money", and that's not a point in his favor either, as this is a clear example of survivorship bias. Coding is going nowhere, and the companies laying off people are just proof that CEOs don't know what they are doing.

For years there have been open source solutions ready for basically any purpose, and if that has not made coders useless, nothing will. Maybe they will change designation, but people that understand what's going on at a technical level will always be necessary.

There have been some situations in the past few years that made the situation less clear-cut, but that doesn't make coders optional.

rottingleaf , to Technology in Don’t learn to code: Nvidia’s founder Jensen Huang advises a different career path

I think this is bullshit regarding LLMs, but making and using generative tools more and more high-level and understandable for users is a good thing.

Like various visual programming tools, where you sketch something working via connected blocks (like Pure Data for sound); or MATLAB, where I think one can use such constructors to generate code for specific controllers involved in the scheme; or LabVIEW.

Or like HyperCard.

Not that anybody should stop learning anything. There's a niche for every way to do things.

I just like that class of programs.

gazter ,

As someone who's had a bit of exposure to PLCs and ladder logic, and dabbled in some more 'programming' type languages, I would love to find some sort of 'language' that fits together like ladder logic, but for more computery type applications.

I like systems, not programs. Most of my software design is done by building a flowchart, then stumbling around trying to figure out how to write that into code. I feel it would be so much easier if I could just make the flowchart be the code.

I want a grown up Scratch.

rottingleaf ,

In some sense this is regressive, but I agree that ladder logic is more intuitive.

I hated drawing flowcharts in university, but at this point have learned that if you understand what you're doing, you can draw a flowchart. If you don't, you shouldn't have written that program.

So yeah.

I think the environment to program "Buran" used such a language (that is, they'd draw flowcharts instead of code).

Sibbo , to Technology in Don’t learn to code: Nvidia’s founder Jensen Huang advises a different career path
@Sibbo@sopuli.xyz avatar

Founder of company which makes major revenue by selling GPUs for machine learning says machine learning is good.

hitmyspot ,

It doesn’t make him wrong.

Just like we can now use LLMs to create letters or emails with a given tone, it's not going to be a big leap to let them do similar with coding. It's quite exciting, really. Lots of people have ideas for websites or apps but no technical knowledge to build them. AI may allow it, just like it allows non-artists to create art.

TangledHyphae ,

I use AI to write code for work every day. Many different models and services, including https://ollama.ai on my own hardware. It's useful for a developer when they can take the code and refactor it to fit into large code-bases (after fixing its inevitable broken code here and there), but it is by no means anywhere close to actually successfully writing code all on its own. Eventually maybe, but nowhere near anytime soon.
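
For illustration, a minimal sketch of the kind of local call that implies, assuming an ollama server running on its default port with some code model pulled (the model name and prompt are placeholders, not a recommendation):

```python
import json
import urllib.request

req = urllib.request.Request(
    "http://localhost:11434/api/generate",   # ollama's default local endpoint
    data=json.dumps({
        "model": "codellama",                # whichever model you have pulled
        "prompt": "Refactor this into a list comprehension: ...",
        "stream": False,                     # return one JSON object instead of a stream
    }).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```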

Lmaydev ,

Agreed. I mainly use it for learning.

Instead of googling and skimming a couple of blogs / SO posts, I now just ask the AI. It pulls the exact info I need and sources it all. And being able to ask follow-up questions is great.

It's great for learning new languages and frameworks.

It's also very good at writing unit tests.

Also for recommending Frameworks/software for your use case.

I don't see it replacing developers, more reducing the number of developers needed. Like Excel did for office workers.
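
On the unit-test point, a sketch of the kind of test it drafts well (slugify and its module are hypothetical here):

```python
import pytest
from myproject.text import slugify  # hypothetical helper under test

def test_basic():
    assert slugify("Hello World") == "hello-world"

def test_strips_punctuation():
    assert slugify("Hello, World!") == "hello-world"

@pytest.mark.parametrize("empty", ["", "   "])
def test_empty_input(empty):
    assert slugify(empty) == ""
```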

TangledHyphae ,

You just described all of my use cases. I need to get more comfortable with copilot and codeium style services again, I enjoyed them 6 months ago to some extent. Unfortunately current employer has to be federally compliant with government security protocols and I'm not allowed to ship any code in or out of some dev machines. In lieu of that, I still run LLMs on another machine acting, like you mentioned, as sort of my stackoverflow replacement. I can describe anything or ask anything I want, and immediately get extremely specific custom code examples.

I really need to get codeium or copilot working again just to see if anything has changed in the models (I'm sure they have.)

hitmyspot ,

It can't yet tell when its output is ridiculous or incorrect for non-coding tasks, but it will get there. Same for coding. It will continue to grow in complexity and ability.

It will get there, eventually. I don't think it will be writing complex code any time soon, but I can see it being aware of all the libraries and FOSS that a person cannot be across.

I would foresee learning to code as similar to learning to do accounting manually. Yes, you'll still need to understand it to be a coder, but for the average person that can't code, it will do a good enough job, like we use accounting software now for taxes or budgets that would have been professionally done before. For complex stuff, it will be human done, or human reviewed, or professional coders giving more technical instructions for ai. For simple coding, like you might write a python script now, for some trivial task, ai will do it.

Jolan ,

I think this is going to age really badly. I don't like LLMs, but I think it will happen soon. People also said that AI as we see it now was decades away, but we got it quite quickly, so I think it's a very small step to go from writing fully grammatically correct English to fully correct code. It's basically just another language the AI has to learn. But I guess, what do I know? We'll just have to wait and see.

TangledHyphae , (edited )

I've been doing this for over a year now, started with GPT in 2022, and there have been massive leaps in quality and effectiveness. (Versions are sneaky, even GPT-4 has evolved many times over and over without people really knowing what's happening behind the scenes.) The problem still remains the "context window." Claude.ai is > 100k tokens now I think, but the context still limits an entire 'session' to only make so much code in that window. I'm still trying to push every model to its limits, but another big problem in the industry now is effectiveness via "perplexity" measurements given a context length.

https://pbs.twimg.com/media/GHOz6ohXoAEJOom?format=png&name=small

This plot shows that as the window grows in size (directly proportional to the number of tokens of code you insert into the window, combined with every token it generates at the same time), everything it produces becomes less accurate and more perplexing overall.

But you're right overall: these things will continue to improve, yet you still need an engineer to actually make the code function in a particular environment. I just don't get the feeling we'll see that within the next few years; if that happens, then every IT worker on earth is effectively useless, along with every desk job known to man, as an LLM would be able to reason about how to automate any task in any language at that point.
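
For a sense of what budgeting against that window looks like in practice, a rough sketch assuming tiktoken is installed (the reserve value is an arbitrary choice):

```python
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # GPT-4-era encoding

def fits_in_window(code: str, window: int = 100_000, reserve: int = 4_000) -> bool:
    """Check whether `code` fits, leaving `reserve` tokens for the reply."""
    return len(enc.encode(code)) + reserve <= window
```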

variaatio ,

Well, the difference is that you have to know coding to know whether the AI produced what you actually wanted.

Anyone can read a letter and tell whether the AI hallucinated or actually produced what you wanted.

With code, it might produce something that does what you ask on the first try. However, it turns out the AI hallucinated a bug into the code for some edge or specialty case.

Hallucinating is not a minor hiccup or minor bug; it is a fundamental feature of LLMs, since they aren't actually smart. An LLM is a stochastic regurgitator. It doesn't know what you asked or understand what it is actually doing. It is matching prompt patterns to output. With enough training patterns to match, one statistically usually ends up about there. However, this is not guaranteed. That is the main weakness of the system. More good training data makes it more likely to produce good results more often. But for business-critical stuff, you aren't interested in whether it got it about right the other 99 times. It 100% has to get it right this one time, since this code goes into a production business deployment.

I guess one can write a comprehensive enough verified testing pattern, including all the edge cases, and with that verify the result. However, now you have just shifted the job: instead of the programmer programming the program, you have the programmer programming the very, very comprehensive testing routines. Which can't be done by the LLM, since the whole point is that the testing routines are there to check for the inherent unreliability of the LLM's output.

It's a nice toy for someone wanting to make quick and dirty test code (maybe) to do thing X, and then trying to find out whether it actually does what was asked or has unforeseen behavior, since you don't know what the behavior of the code is designed to be; you didn't write the code. Good for toying around and maybe for quick and dirty brainstorming. Not good enough for anything critical that has to be guaranteed to work, with a promise of a service contract and so on.

So the real big job of the future will not be prompt engineers, but quality assurance and testing engineers who have to guard against hallucinating LLMs and similar AIs. Prompts can be gotten from anyone; what is harder is finding out whether the prompt actually produced what it was supposed to produce.
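
Concretely, that "programmer programs the testing routines instead" job might look like property-based verification that doesn't trust the generated code at all; a sketch using the hypothesis library (sort_numbers stands in for whatever the LLM wrote):

```python
from hypothesis import given, strategies as st

from generated_code import sort_numbers  # hypothetical LLM-written function

@given(st.lists(st.integers()))
def test_matches_reference(xs):
    out = sort_numbers(xs)
    assert out == sorted(xs)      # compare against a trusted reference
    assert len(out) == len(xs)    # nothing dropped or invented
```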

MartianSands ,

It might not make him wrong, but he also happens to be wrong.

You can't compare AI art or literature to AI software, because the former are allowed to be vague or interpretive while the latter has to be precise and formally correct. AI can't even reliably do art yet; it frequently requires several attempts or considerable support to get something which looks right, but in software "close" frequently isn't useful at all. In fact, it can easily be close enough to look right at first glance while actually being catastrophically wrong once you try to use it for real (see: every bug in any released piece of software ever).

Even when AI gets good enough to reliably produce what it's asked for, first time and every time (which is still a long way off), a sufficiently precise description of what you want is exactly what programmers spend their lives writing. Code is a description of a program which another program (such as a compiler) can convert into instructions for the computer. If someone comes up with a very clever program which can fill in the gaps by using AI to interpret what it's been given, then what they've created is just a new kind of programming language for a new kind of compiler.

hitmyspot ,

I don't disagree with your point. I think that is where we are heading. How we interact with computers will change. We're already moving away from keyboard typing and clicks, to gestures and voice or image recognition.

We likely won't even call it coding. Hey Google, I've downloaded all the episodes for the current season of Pimp My PC, can you rename the files by my naming convention and drop them into jellyfin. The AI will know to write a python script to do so. I expect it to be invisible to the user.

So, yes, it is just a different instruction set. But that's all computers are. Data in, data out.
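
The invisible script behind that request would be something like this sketch (show name, paths, and naming convention are all assumed for illustration):

```python
import re
import shutil
from pathlib import Path

SRC = Path("~/Downloads").expanduser()
DST = Path("/media/jellyfin/shows/Pimp My PC/Season 01")

DST.mkdir(parents=True, exist_ok=True)
for f in SRC.glob("*.mkv"):
    m = re.search(r"[Ss](\d+)[Ee](\d+)", f.name)
    if not m:
        continue  # skip files that don't match the SxxEyy pattern
    season, episode = (int(g) for g in m.groups())
    shutil.move(str(f), DST / f"Pimp My PC - S{season:02d}E{episode:02d}.mkv")
```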

SlopppyEngineer ,

Until somewhere things go wrong, the supplier tries "but an AI wrote it" as a defense when the client sues them for not delivering what was agreed upon, and gets struck down, leading to very expensive compensation that spooks the entire industry.

hitmyspot ,

Air Canada already tried that and lost. They had to refund the customer because the chatbot gave incorrect information.

BombOmOm ,
@BombOmOm@lemmy.world avatar

Turns out the chatbot gave the correct information. Air Canada just didn’t realize they had legally enabled the AI to set company policy. :)

Murvel ,

Yes, but Nvidia relies heavily on programmers itself. Without them Nvidia wouldn't have a single product. The fact that he makes these claims despite this is worth noting.

WhatAmLemmy ,

Lol. They're at the top of the food chain. They can afford the best developers. They do not benefit from competition. As with all leading tech corporations, they are protectionist, and benefit more from stifling competition than from innovation.

Also, more broadly the oligarchy don't want the masses to understand programming because they don't want them to fundamentally understand logic, and how information systems work, because civilization is an information system. It makes more sense when you realize Linux/FOSS is the socialism of computing, and anti-competitive closed source corporations like Nvidia (notorious for hindering Linux and FOSS) are the capitalist class of computing.

Sanctus ,
@Sanctus@lemmy.world avatar

Techno-Mage has been trying to warn us this whole time.

fidodo , to Technology in Don’t learn to code: Nvidia’s founder Jensen Huang advises a different career path

As a developer building on top of LLMs, my advice is to learn programming architecture. There's a shit ton of work that needs to be done to get this unpredictable, non-deterministic tech to work safely and accurately. This is like saying get out of tech right before the Internet boom. The hardest part of programming isn't writing low-level functions; it's architecting complex systems while keeping them robust, maintainable, and expandable. By the time an AI can do that, all office jobs are obsolete. AIs will be able to replace CEOs before they can replace system architects. Programmers won't go away; they'll just have less busywork to do and will instead need to work at a higher level, but the complexity of those higher-level requirements is about to explode, and we will need LLMs to do the simpler tasks with our oversight to make sure they get integrated correctly.

I also recommend still learning the fundamentals, just maybe not as deeply as you used to need to. Knowing how things work under the hood still helps immensely with debugging and creating better, more efficient architectures, even at a high level.

I will say, I do know developers who specialized in algorithms who are feeling pretty lost right now, but they're perfectly capable of adapting their skills to the new paradigm; their issue is more the personal one of deciding what they want to do, since they were passionate about algorithms.
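
As one small example of the work needed to make this non-deterministic tech safe, a sketch of a guardrail around model output (call_llm is a hypothetical stand-in for whatever client you use): validate against the shape you asked for and retry, rather than trusting the first answer.

```python
import json

def extract_order(text: str, call_llm, max_retries: int = 3) -> dict:
    """Ask the model for structured data; never trust the first answer."""
    prompt = f"Return only JSON with keys 'item' (str) and 'qty' (int) for: {text}"
    for _ in range(max_retries):
        raw = call_llm(prompt)
        try:
            data = json.loads(raw)
        except json.JSONDecodeError:
            continue  # malformed output; ask again
        if isinstance(data.get("item"), str) and isinstance(data.get("qty"), int):
            return data
    raise ValueError("model never produced valid output")
```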

gazter ,

In my comment elsewhere in the thread I talk about how, as a complete software noob, I like to design programs by making a flowchart first, and how I wish the flowchart itself was the code.

It sounds like what I'm doing might be (super basic) programming architecture? Where can I go to learn more about this?

fidodo ,

Look up visual programming languages. When you apply a visual metaphor to programming it really is basically just really detailed and complex flow charts.
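
A toy sketch of that idea in Python (every name here is made up): each node is a function that returns the name of the next node, so the "flowchart" is literally the program.

```python
# Each node does one step and names the next node, like boxes and arrows.
def start(ctx):
    ctx["x"] = -5
    return "check"

def check(ctx):
    return "done" if ctx["x"] > 0 else "fix"

def fix(ctx):
    ctx["x"] += 3
    return "check"

NODES = {"start": start, "check": check, "fix": fix}

def run(ctx, node="start"):
    while node != "done":
        node = NODES[node](ctx)
    return ctx

print(run({}))  # {'x': 1}
```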

OleoSaccharum , to Technology in Don’t learn to code: Nvidia’s founder Jensen Huang advises a different career path

Nvidia is such a stupid fucking company. It's just slapping different designs onto TSMC chips. All our "chip companies" are like this. In the long run they are all going to get smoked. I won't tell you by whom. You shouldn't need a reminder.

ammonium ,

Designing a chip is something completely different from manufacturing them. Your statement is as true as saying TSMC is such a stupid company, all they are doing is using ASML machines.

And please tell me, I have no clue at all who you're talking about.

barsoap ,

The Chinese? I think their claim to fame is making processes stolen from TSMC work using pre-EUV lithography. Expensive AF because slow but they're making some incredibly small structures considering the tech they have available. Russians are definitely out of the picture they're in the like 90s when it comes to semiconductors and can't even do that at scale.

And honestly I have no idea where OP is even from, "All our chip companies". Certainly not the US; it's not at all the case that all US chip companies are fabless: IBM, TI and Intel are IDMs. In Germany IDMs predominate, Bosch and Infineon, though there's of course also some GlobalFoundries here, which is pure-play, as will be the TSMC-Bosch-NXP-Infineon joint venture ESMC. Korea and Japan are also full of IDMs.

Maybe Britain? ARM is fabless, OTOH ARM is hardly British any more.

OleoSaccharum ,

Amazon is fabless for their chip design unit; there are all these little mini design units for stuff like datacenters.

It's hilarious that because Intel labelled itself an investor in US foundry projects, you think they are exempt from this. Okay man, go work at the plants in Ohio and Arizona. Oh wait, they don't fucking exist, bruh.

barsoap ,

Intel, TI and IBM all made chips before pure-play and fabless were even a thing, and are still doing so. Intel has 16 fabs in the US, TI 8, IBM... oh, they sold their fabs, I thought they still had some specialised stuff for their mainframes. Well, whatever.

Of all companies, the likes of Amazon and Google not fabbing their own chips should hardly be surprising. They're data centre operators, they don't even sell chips, if they set up fabs they'd have to start doing that, or compete with TSMC to not have idle capacity standing around costing tons of money. A fab is not like a canteen which you can expect to actually be in use all the time so there's going to be no need to compete in the restaurant business to make it work.

And that's only really looking at logic chips, e.g. Micron also has fabs at home in the US.

OleoSaccharum ,

None of those companies even make a blip on global chip production though. Are they for research or something? Why should I give a shit about a tiny technically existing fraction of production that will never expand?

Go look at where there has been actual foundry production for decades. None of the companies you mentioned even exist in foundry. Who cares if they have A facility or two? That's just part of figuring out what they're going to order from TSMC.

barsoap ,

Your goalposts, they are moving.

The US has the know-how to produce modern chips at scale, or at least not too far behind in strategic terms. You could bring all production home if that's what you wanted, it'd cost a lot of money but it's simply a policy issue. And Amazon wouldn't suddenly start to run fabs they'd hire capacity from Intel or whomever.

...you'd still be reliant on European EUV machines, though. Everyone is, if you intend to produce very modern chips at scale. But if your strategic interest is making sure that the DMV has workstations and the military guidance computers that's not necessary, pre-EUV processes are perfectly adequate.

OleoSaccharum ,

You are the one moving the goalposts with your boasts about how these companies make up LITERALLY an INFINITESIMAL portion of global chip production. Even if you cut out Samsung and TSMC they wouldn't be global players.

No, we can't just bring all production home lol. We've been saying we will for years. Where is the foundry in Ohio dude? Where is the Arizona foundry that's supposed to bolster TSMC production?

Lol yeah sure go ask ASML how their business is doing rn in light of the US chip war sanctions. European manufacturing is in as dire a state as the US now due to financialization and now the skyrocketing energy costs.

People said this about our military production too. "Oh, Russia messed up now, we're going to get serious and amp up our military production." 🦗🦗🦗🦗🦗🗓️🗓️🗓️🗓️ (time loudly passing and nothing happening)

How many times is it going to take for people to learn it gets transmuted directly into stock buybacks lmao? We don't have the electrical grid to build up our manufacturing base in the modern world yet. The US is a giant casino for the elite of our empire full of slums.

barsoap ,

We don’t have the electrical grid to build up our manufacturing base in the modern world yet. The US is a giant casino for the elite of our empire full of slums.

You won't hear me disagree with that. But to say, and I quote you directly:

It’s just slapping different designs onto TSMC chips. All our “chip companies” are like this.

is wildly incorrect, while Intel might very well take the tech crown (gate-all-around with backside power) from TSMC this year.

European manufacturing is in as dire a state as the US now due to financialization and now the skyrocketing energy costs.

"Skyrocketing", yeah. Gas looks similar.

And no, European manufacturing is not in nearly as dire a state as in the US. For that to be the case we'd have to have as shoddy infrastructure and decades-long underinvestment and offshoring as the US has. The US in fact has a more advanced chip industry than Europe: we're good at the basic science, we're good at bulk production of specialised stuff; one thing that we're not great at is top-tier CPUs and GPUs, chips that are their own products. What we produce is the usual "the thing that goes into a thing that goes into a thing you buy". Like, random example, pretty much every smartphone in the world uses a Bosch gyroscope, and they produce those things in-house.

But that doesn't mean that the US is fucked, in the least: if need be, it would be able to spring back to life quite quickly. Thing is, the need isn't there, so if your worry is elite casinos, maybe don't focus so much on chips and incorrect statements about US capacity, but on said elite casino directly.

OleoSaccharum ,

Yeah the casino bit is the most important part for sure. In light of how financialized everything is, huge costs, massively inflated financial asset & real estate prices etc, labor costs, it's more likely for Detroit to spring back into being an industrial hub.

We focus all of our political energy on monopolizing the top of the value chain TSMC is a part of, and we can't replace it with our own production because it's so crucial for cutting costs. They can't even expand production for lower-end chips (the ROI isn't there), so Russia and China are gonna scoop up orders from expansion in the many industries that use them (low-end chips were like 20% of TSMC's revenue recently, iirc). Which will help them develop their higher-end foundries, which they definitely can make; I mean, Rosatom produces super high quality X-ray mirrors, and the Chinese government won't balk at industrial investment or high-tech training programs.

ASML's whole position in this convoluted supply chain means they only make those shipping-container-sized thingies with the Rube Goldberg machine of mirrors hooked up to a tin-plasma light source, and that ultimately limits the minimum nanometer size of the circuits made in the fabrication units they ship out. If I'm getting that right 🤪. This is really futuristic stuff I'm talking about now, but the next-next gen fabrication units, beyond Russia and China catching up, could even be hooked up to a particle accelerator. That's pretty hard to export in the same way.

I just don't see how we can politically or financially solve these problems in the US or EU lol. We're kind of caught by the balls as workers no?

barsoap ,

We’re kind of caught by the balls as workers no?

You don't need to fab chips to have a job. Let the Taiwanese have their speciality, you have yours, what's wrong with that?

Also, TSMC's company culture is excessively Confucian; you probably wouldn't want to work there anyway. It's not just the company culture that makes the company, but also things like Taiwanese universities churning out masses of highly-skilled electrical engineers; basically the only reason TSMC even agreed to that European joint-venture is because Dresden's universities have been focussed on that exact field for decades, even before reunification. For a similar reason you couldn't just take Zeiss and move it out of Jena: they need the local university to funnel students into their workforce. There's no better place to study optics in the world than Jena.

Which actually brings me to another point: All these are labour aristocracy jobs, not just trained but highly educated, comparable to a doctor at a hospital. They have a lobby, they have a good bargaining position. Worrying about them won't do anything for the burger flipper at your local fast-food joint who tends to have neither. It also won't really do much for the injection moulding machine operator producing tea sieves (sorry I was just admiring one it has stainless steel mesh embedded in it, not easy to produce, made in Germany, not cheap but oh gods is that thing worth the extra three bucks (it was five)).

OleoSaccharum ,

Sure, they're labor aristocracy jobs because they're at the tip top of the global supply chain, but most people do not partake in that at all, or in management etc., or legal/medical/whatever other high-end shit, and 50% of the US is in crappy service work like McDonald's, literally.

No matter what industry you work in, you can only be pessimistic here, lmao, unless you're in finance or useless C-suite shit.

I'm not crying for the TSMC foundry or trying to work there. I hope the NATO+ intelligence services' edge from high-end chip production being under our control is unseated; it would be good for all of us.

What's going on with TSMC is indicative of wider issues with all kinds of US industries. I'm in solar, and frankly I plan to gtfo in the long term to a more interesting area of development. I don't expect it to make my life easier per se, but there are a lot of reasons.

barsoap ,

solar

Aaaaaa, once upon a time Germany was the world leader in solar. Then a conservative government came along and slashed subsidies in ways that no one could adapt to (mostly because it was sudden, against everyone's expectations, and without tapering), and now the US of all places has more of a market share, though the bulk of course is Chinese, who bought German tech for cheap at bankruptcy auctions.

All that is certainly annoying, OTOH you gotta admit that keeping walking after shotgunning your own feet several times in a row does mean that you have some rather solid feet.

OleoSaccharum ,

Solar here makes me so pissed, since you know Texas could be a magnificent location, but instead it's such a libertarian shithole lmao

OleoSaccharum , (edited )

Oh, also, insurance prices and shit like that here are a nightmare. Legal is too, obviously. That's why Nike could never just move all their production here even though it would be trivial to teach people to make shoes. The convoluted global supply chain is the whole point.

ammonium ,

None of those companies even make a blip on global chip production though

Neither does TSMC; high-end chips are just a tiny part of the total number of chips produced (albeit an important and lucrative part of the market).

TSMC is alone at the top because it's so damn expensive and the market is not that big; there's basically no place for a competitor. Anyone trying to dethrone them has to have very deep pockets and a good reason not to simply buy from TSMC. The Chinese might be able to pull it off; they have the money and a good reason.

OleoSaccharum ,

Except what Nvidia is doing can be done by numerous other chip design firms; TSMC cannot be replaced.

slappy , to Technology in Don’t learn to code: Nvidia’s founder Jensen Huang advises a different career path

[Thread, post or comment was deleted by the author]

    tsonfeir ,
    @tsonfeir@lemm.ee avatar

    I use “AI” when I work. It’s like having a really smart person who knows a bit about everything available 24/7 with useful responses. Sure, it’s not all right, but it usually leads me down the path to solving my problem a lot faster than I could with “Googling.” Remember Google? What a joke.

    CosmoNova ,

    I think it's less of a really smart person and more of a very knowledgeable person with an inflated ego, so you take everything they say with a grain of salt. Useful nonetheless.

    rikudou ,

    I think a colleague of mine made a great comparison: It's like having access to a thousand junior devs who can reply really fast.

    tsonfeir ,
    @tsonfeir@lemm.ee avatar

    That’s a good comparison

    Wooki ,

    It's hallucinating garbage wrapped in overhyped marketing.

    ___ ,

    This. The technology is here to stay and will literally change the world. In a few years, when the Sora and SD3 models are released and well understood, and desktop GPUs begin offering 24GB of VRAM on midrange cards out of demand, it will be crazier than we can imagine. LLMs are already near human level with enough compute. As tech gets faster and commoditized, everyone becomes an artist and a programmer. Information will no longer be trusted, and digital verification technology will proliferate.

    Invest now.

    That, and nuclear batteries capable of running Pi-like machines for decades; 1 W is on the horizon from BetaVolt.

    Nachorella ,

    I'm not sure why you're being downvoted. I don't think the current technology is going to replace programmers or artists any time soon (speaking as someone who works as an artist and programmer in a field that monitors ai and its uses) but I also acknowledge that my guess is as good as yours.

    I don't think it's going to replace artists because, as impressive as the demos we all see are, inevitably, whenever I've done any thorough testing, every AI model fails at coming up with something new. It's so held back by what it's trained on that to contemplate it replacing artists - who are very capable of imagining new things - seems absurd to me.

    Same with programming - ask for something it doesn't know about and it'll lie and make something up and confidently proclaim it as truth. It can't fact check itself and so I can only see it as a time saving tool for professionals and a really cool way for hobbyists to get results that were otherwise off the table.

    Womble ,

    I can't speak with certainty about generating art; I'm no artist, and my limit of experience there is playing around with Stable Diffusion, but it feels like it's in the same place as LLMs for programming. It's incredibly impressive at first, but once you've used it for a bit the flaws become obvious. It will be a very powerful tool for artists to use, just like LLMs are for programming, and will likely significantly decrease the time needed to produce something, but it's nowhere near replacing a human entirely.

    Nachorella ,

    Yeah, for art it's similar, you can get some really compelling results, but once tasked with creating something a bit too specific it ends up wasting your time more than anything.

    There's definitely uses for it and it's really cool, but I don't think it's as close to replacing professionals as some people think.

    314xel , (edited )
    @314xel@lemmy.world avatar

    Paintings with a unique style will become even more valuable in the future. Generative AI only spews "art" based on previous styles it learned / was trained on. Everything will be even more rehashed than it is today (nod to Everything is a Remix). Having a painting made by an actual human hand on your wall will be more ego-boosting than an AI-generated one.

    Sure, for general digital art (i.e. logos, game character design, etc.), where uniqueness isn't really mandatory, AI is a good, very cheap tool.

    As for the "everyone becomes a programmer" part... naah.

    rikudou ,

    Having a painting made by an actual human hand on your wall will be more ego-boosting

    Nothing really changes, this has always been the case.

    JackFrostNCola ,

    Just being a stickler here but Electronics Engineers, not Electrical. Similar sounding but like the difference between a submarine captain and an airplane captain.
