
chemicalwonka ,
@chemicalwonka@discuss.tchncs.de avatar

of course they will, it is for profit

egeres ,
@egeres@lemmy.world avatar

I can't believe I'm reading this in 2024

horse ,

There is exactly one reason why they do this: So they can charge you $200 to upgrade it to 16GB and in doing so make the listed price of the device look $200 cheaper than it actually is. Or sometimes $400 if it's a model where the base model comes with a 256GB SSD (the upgrade to 512GB, the minimum I'd ever recommend, is also $200).
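A back-of-the-envelope sketch of that pricing effect. Only the $200 upgrade figures come from above; the base price here is purely illustrative, not an actual Apple list price:

```python
# Illustrative: how upgrade pricing makes the headline price look lower.
# The $200 figures are from the comment; the base price is made up.
listed_base = 1599   # hypothetical listed price: 8GB RAM / 256GB SSD
ram_upgrade = 200    # 8GB -> 16GB
ssd_upgrade = 200    # 256GB -> 512GB

real_price = listed_base + ram_upgrade + ssd_upgrade
print(real_price)                # what a usable config actually costs
print(real_price - listed_base)  # the part hidden from the headline price
```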

The prices Apple charges for storage and RAM are plain offensive. And I say that as someone who enjoys using their stuff.

Jesus_666 ,

That's why I dropped them when my mid-2013 MBP got a bit long in the tooth. Mac OS X, I mean OS X, I mean macOS is a nice enough OS but it's not worth the extortionate prices for hardware that's locked down even by ultralight laptop standards. Not even the impressive energy efficiency can save the value proposition for me.

Sometimes I wish Apple hadn't turned all of their notebook lines into MacBook Air variants. The unibody MBP line was amazing.

ebc ,

Sometimes I wish Apple hadn’t turned all of their notebook lines into MacBook Air variants. The unibody MBP line was amazing.

Typing this from an M2 Max MacBook Pro with 32GB, and honestly, this thing puts the "Pro" back in the MBP. It's insanely powerful: I rarely have to wait for it to compile code, transcode video, or run AI stuff. It also does all of that while sipping battery; it's not even breaking a sweat. Yes, it's pretty thin, but it's by no means underpowered. Apple really is onto something with their M* lineup.

But yeah, selling "Pro" laptops with 8GB in 2024 is very stupid.

NostraDavid ,
@NostraDavid@programming.dev avatar

I haven't used 8GB since... 2008 or so? TBF, I'm a power user (as are most people on any Lemmy instance, I presume), but still...

And sure, macOS presumably uses less RAM than Windows, but all the applications don't.

BilboBargains ,

As engineers, we should never insert proprietary interfaces into our designs. We shouldn't obfuscate the design.

The motivation for these toxic practices comes from the business side because it's profitable. These people won't share the profits with you because they are psychopaths. Ultimately we are making more waste when electronics cannot be upgraded, maintained and repaired. It's bad for people and it's bad for the environment.

TheGrandNagus , (edited )

So much stuff in both the hardware and software world really annoys me and makes me think our future is shit the more I think about it.

Things could be so much better. Pretty much everything could be open and standardised, yet it isn't.

Software can be made in a way that isn't user-hostile, but that's not the way of things. Hardware could be repairable and open, without OEMs having to navigate a minefield of IP and patents, much of which shouldn't have been granted in the first place, or users having no ability to repair or upgrade their devices.

It's all so tiresome.

rottingleaf ,

I think Napoleon said something similar to "the army is commanded by me and the sergeants"?

Well, that's not true anymore today. All this connectivity and processing power, however inefficiently they're used, allow the world to be centralized more than it ever could be before. No need to consider what the sergeants think.

(Which also means no Napoleons, because much more average, grey, unskilled, and generally unpleasant and uninteresting people are in charge now.)

It's about power and it happened in the last 15 years.

I think it's a political tendency, very intentional for those making decisions, not a "market failure" and other smartassery. It comes down to elites making laws. I feel they are more similar to Goering than to Hitler all over the world today.

This post may seem nuts, but our daily lives significantly depend on things more complex and centralized in supply chains and expertise than nukes and spaceships.

We don't need desktop computers which can't be fully made in, say, Italy, or at least in a few European countries taken together. Yes, this would mean kinda going back to late 90s at best in terms of computing power per PC, but we waste so much of it on useless things that our devices do less now than then.

We trade a lot of unseen security for comfort.

drmoose ,

Apple has been really stretching their takes lately. Nice to see some fire under their ass though it's not going to matter. Too many ignorant people falling for likeable propaganda.

iAvicenna ,

does that mean people won't be able to use chrome on their macs?

ours ,

One tab only.

mightyfoolish ,

I get that upgrades help the bottom line, but considering that 8GB of RAM chokes the silicon they are allegedly so proud of... it seems like a slap in the face to their own engineers (and to the customer as well, but that is not my point).

Raz ,

Like the upper management and C-suite give a fuck about any of their employees.

anhydrous ,

My X220 and T520 each have 16GB. The designed max was actually "only" 8GB, but it turns out 16 GB actually works. I replaced the RAM modules myself without asking Lenovo for permission. Those models came out in 2011.

jaschen ,

My HP Omen 17" was designed for a maximum of 32GB ram. I'm currently running 64GB on it.

Duamerthrax ,

This was also true for Apple computers before they started soldering the ram in place. I remember going way over spec in my old G4 tower. Hell, I doubt the system would crash if you found larger ram chips and soldered them in.

Klause ,

I doubt the system would crash if you found larger ram chips and soldered them in.

You can't even swap components with official ones from other upgraded models. Everything is tied down with verification codes and shit nowadays. So I doubt you could solder in new ram and get it to work.

Valmond ,

Yeah lol, my ThinkCentre with a 6th-gen Intel had only 8GB (I paid under 100€ for it), so I went shopping on a second-hand site to double that. But the 4, 8, and 16GB DDR4 sticks (SODIMM; there seems to be a flood of used ones) all cost about the same, around 30€ shipping included, so now I've got 24GB.

GlobalMind ,

I also cannot figure out why so many companies are selling them with only a 500GB drive, SSD or HDD.

vinyl ,

So they can charge more for an upgrade. Simple business tactics.

Classy ,

Don't forget cloud services!

jenny_ball ,
@jenny_ball@lemmy.world avatar

i have more ram on my old gpu apple sucks

dustyData ,

A friend has a phone with more ram.

jenny_ball ,
@jenny_ball@lemmy.world avatar

all my phones have more ram since like 2015

KillingTimeItself ,

what a weird title bro, of course they argue in favor of it, they sell the fucking hardware that they created. It'd be a little weird if they argued against it after spending billions designing and manufacturing it.

Regardless, i still can't believe apple thought 8GB minimum was ok, genuinely baffling to me.

Blackmist ,

8GB RAM is what my phone has.

Having that in a laptop shows what they think of people buying their kit. They think you're only buying it so you can type easier on Facebook.

KillingTimeItself ,

TBF 8gb of ram on a phone is actually psychotic. You really shouldn't be doing all that much on a phone lol.

IthronMorn ,

Then what should I be doing on my phone?

KillingTimeItself ,

nothing that requires 8GB of ram lol.

I've played the entirety of java minecraft on an old thinkpad with 4GB of ram. It didn't crash (i dont use swap)

There literally shouldn't be anything capable of using that much memory.

greedytacothief ,

Is this bait? Because like, you could be rendering, simulating, running virtual machines. Lots of stuff that aren't web browsers also eat ram

RippleEffect ,

Web browsers also eat ram.

KillingTimeItself ,

90% of which can be paged in the background, it's not like most people are chronically browsing the web on their phones.

woelkchen ,
@woelkchen@lemmy.world avatar

it’s not like most people are chronically browsing the web on their phones.

Yes, they do.

KillingTimeItself ,

and it's also the worst place to do that. If you're going to be chronically online like me, you should at least give it clear boundaries between something you carry on you at all times, and something that you regularly have access to, like my workstation for instance.

Unless you like being horribly depressed or something.

greedytacothief ,

I was trying to mention things that weren't just web browsers. Since it seemed the comment was about programs that use more ram than they seemingly need to.

Edit: There's like photogrammetry and stuff that happens on phones now!

RippleEffect ,

And games!

KillingTimeItself ,

games are probably a better argument honestly, but even at that point, it's not a really good experience. Unless you buy a gaming phone, which i guess is an option. Regardless the mobile gaming market is actually vile.

KillingTimeItself ,

i suppose photo editing would be one? Maybe? I'm not sure how advanced photo editing would be on mobile, it's not like you're going to load up the entirety of GIMP or something.

As for photogrammetry, i'm not sure that would consume very much ram. It could, but i honestly don't think it would be that significant.

AdrianTheFrog ,
@AdrianTheFrog@lemmy.world avatar

There’s like photogrammetry and stuff that happens on phones now!

No, the photogrammetry apps all use cloud processing. The LIDAR ones don't, but that's only for Apple phones and the actual mesh quality is pretty bad.

KillingTimeItself ,

on a phone? Why the fuck would anyone be running virtual machines on a phone?

dustyData , (edited )

My man, have you been to selfhosted? People are using smartphones for all kinds of crazy stuff. They are basically mini ARM computers, particularly the flagships: they can do many things like editing video and rendering digital drawings, and after their use life ends they can host AdGuard, torrent to a NAS, host Nextcloud. You name it.

pythonoob ,

Something like the Samsung DeX app that basically turns your phone into a mini computer with a keyboard, mouse, and monitor wouldn't be too bad tbh for most people. Take all your shit with you in your pocket and dock it at home or at work or whatever.

KillingTimeItself ,

yeah, i literally selfhost a server, running like 8 different services. I'm quite acclimated to it by now. Using a phone for this kind of thing is the wrong device. A chromebook is going to be a better alternative. You can probably get those cheaper anyway.

A big problem with phones is that they just aren't really designed for that kind of thing, you leave a phone plugged in constantly and it's going to spicy pillow itself. Let alone even trying to do that on something that isn't an android. I cannot imagine the hell that self hosting on an android would be, let alone on an iphone.

I could see a usecase for it as a network relay in the event that you need a hyper portable node or something. GLHF with the dongling if you need those.

Unfortunately, if you already have a server, it's going to be better to just spin up a new task on that server, as the cost of running a new device is going to outweigh the cost of just using an existing one that's already running. Also, you can get stuff like a raspi or Le Potato for pretty cheap too. Not very powerful, but probably more utility, especially given the IO.

dustyData ,

Yeah, god forbids anyone ever does anything suboptimal or worse…for fun 😱

KillingTimeItself , (edited )

i'm not saying that you can't but like, you shouldn't buy a phone with the prospect to turn it into a server. You should sell your old phone. Or use it until it dies. That's probably going to be better in the long run honestly. You use a laptop? A desktop? An SBC even? All of those can be converted into a server with MUCH longer lifespans, and better software support.

Mobile hardware often has a support period of like 2-3 years, and although that's changed recently, the hardware life expectancy is probably more like 5 years at most. Meanwhile, desktop hardware, and laptop hardware in particular, can easily last like 10 years. Even longer if you're ok with running legacy hardware.

My primary mobile laptops are 10 and 12 years old respectively. They're perfectly fine for what i need. I would NOT want to be using a 10 year old phone for that.

If you aren't the type of person buying or owning laptops, you almost certainly do not know what self hosting is.

AdrianTheFrog ,
@AdrianTheFrog@lemmy.world avatar

It sounds a lot more cost effective to get a used mini-pc than a flagship phone for any sort of server stuff.

KillingTimeItself ,

literally this, anything other than a phone is going to be more purpose suited. cheaper, and probably more versatile. You're spending money on a really expensive screen that you are literally not going to be using. You might as well buy something with a shitty screen, or none at all.

AdrianTheFrog ,
@AdrianTheFrog@lemmy.world avatar

I got a ThinkCentre M700 with an i7-6700, 16gb of ram and a 256gb SSD for $70 total. It's really hard to get a phone with anywhere near that value for money.

KillingTimeItself ,

exactly, even if we're talking buying brand new modern desktop hardware. The sheer benefit you gain of having a SATA port, and being able to stuff an 18TB Exos drive in it, for example, will immediately pay for itself compared to what cloud storage would cost, while also not being limited to your internet uplink speeds. You could easily run 10gig if you really wanted to, although realistically, 2.5gig is going to be more apt.

greedytacothief ,

Does the JVM count?

KillingTimeItself ,

thats really funny, but no.

AdrianTheFrog ,
@AdrianTheFrog@lemmy.world avatar

you could be rendering, simulating, running virtual machines

On a phone? I guess you could, although 4gb is probably enough for any video game that any amount of people use.

woelkchen ,
@woelkchen@lemmy.world avatar

People use phone apps for photo and video editing these days. The common TikTok kid out there doesn't use Adobe Premiere on a desktop workstation.

Phone apps often are desktop applications with a specialized GUI these days.

KillingTimeItself ,

i mean yeah, but even then those aren't significant filters, and what makes you think that tiktok isn't running a render farm somewhere in china to collect shit tons of data? They're already collecting the data, might as well provide a rendering service to make the UI nicer, but i don't use tiktok so don't quote me on it.

Those are also all built into tiktok, and im pretty sure tiktok doesn't require 8GB of ram to open.

woelkchen ,
@woelkchen@lemmy.world avatar

i mean yeah, but even then those aren’t significant filters, and what makes you think that tiktok isn’t running a render farm somewhere in china to collect shit tons of data?

Pretty sure my Adobe Premiere comparison made it clear I wasn't talking about the TikTok app itself but 3rd party apps to later upload to online services like TikTok.

Just because you can't think of use cases doesn't mean they don't exist.

KillingTimeItself ,

i mean yeah, you could, but then tiktok doesn't have you on its app, and im pretty sure tiktok has a pretty comprehensive editing tool set, otherwise people wouldn't be making as much edited content on it.

even then, there are still a lot of people that do edit video intended for 9:16 consumption, and they do it on PC. Primarily because it's just a better place to edit things.

IthronMorn ,

What about running a chrooted nix install and using a vnc to connect to it? While web browsing and playing a background video? Just because you don't use your ram doesn't mean others don't. And no, I don't use all my ram, but a little overhead is nice.

KillingTimeItself ,

on a phone? I mean i suppose you could do that, but VNC is not a very slick remote access tool for anything other than, well, remote access. The latency and speed over WIFI would be a significant problem, i suppose you could stream from your phone to your TV, but again, most TVs that exist today are smart TVs so literally a non issue.

my example here was using a computer rather than a phone, to show that even desktop computing tasks, don't really use all that much ram.

IthronMorn ,

Well, then by that logic, since desktop computing tasks don't really use all that ram: we shouldn't need more than 8GB in a desktop ever. Yes, my example was a tad extreme, vnc-ing into your own VM on your phone, but my point was rather phones are becoming capable and replacing traditional computers more and more. A more realistic example is when I was using Samsung Dex the other day I had 80ish chrome tabs open, a video chat, and a terminal ssh'd into my computer fixing it. I liked the overhead of ram I had above me. Was I even close to 12GB? No. But it gave me room if I wanted another background program or had to spin something up quickly without disrupting my flow or lagging out/crashing.

KillingTimeItself ,

Well, then by that logic, since desktop computing tasks don’t really use all that ram: we shouldn’t need more than 8GB in a desktop ever.

if this is the logic we're using, then we shouldn't have phones at all. Since clearly they do nothing more than a computer. Or we shouldn't have desktops/laptops at all. Because clearly they do nothing more than a phone.

I understand that phones are more capable, my point is that they have no reason to be more capable. 99% of what you do on a phone is going to be the same whether you spend 200 dollars on it, or 2000.

Khanzarate ,

Obviously using it as a thin client for this MacBook, duh.

Blackmist ,

Yeah, but if you have plenty of RAM on Android, there's a chance those apps you left in the background will still be running when you go back to them, rather than doing the usual Android thing of just restarting them.

KillingTimeItself ,

yeah i get that, but i often only have like 2 apps open on my android phone. And even if you didn't have enough ram, there's no reason android can't cache old apps to a page file or something. Then you don't need to restart them, just load them from the page file. Given how fast modern phone storage is likely to be, this should be pretty negligible.

macrocephalic ,

My phone was manufactured in 2022, cost under USD250, and has 8gb of ram. New phones generally come with 12gb or more.

mechoman444 ,

I mean, it makes sense. The vast majority of people buying Apple computers are loyalists or people who simply need an internet and word-processing machine.

And if you want to develop on Apple, then you have to pay a massive premium for their higher-end hardware.

AdrianTheFrog ,
@AdrianTheFrog@lemmy.world avatar

Their CPUs are actually really good now, when the apps are actually optimized for them. Especially in single core, they are very competitive with top Intel or AMD chips while being way more power efficient.

For example, in Geekbench 5.1 single-core the M2 Max gets 1967 points (85%) compared to 2311 points from the 7950X3D and 2369 from the 14900K. The M2 Max (12 cores (8 P + 4 E), 12 threads) can draw a maximum of 36 watts, while the 7950X3D (16 cores, 32 threads) can draw around 250 watts, and the 14900K (24 cores (8 P + 16 E), 32 threads) can draw around 350 watts.
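Taking those figures at face value, a rough points-per-watt comparison looks like this. Caveat: those are maximum package power numbers, not single-core draw, so this is only indicative:

```python
# Rough points-per-watt from the Geekbench 5.1 single-core scores and
# max package power figures quoted above (indicative only, since
# single-core runs don't hit max package power).
chips = {
    "M2 Max":  (1967, 36),
    "7950X3D": (2311, 250),
    "14900K":  (2369, 350),
}

for name, (score, watts) in chips.items():
    print(f"{name}: {score / watts:.1f} points per watt")
```

Even with that caveat, the efficiency gap is roughly an order of magnitude.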

Apple's GPUs are definitely lacking though, in terms of performance.

mechoman444 ,

Ya. Their CPUs are really good. Got to give credit where credit is due.

kamen ,

Yeah, sure. Even if what they say about the OS resource usage is true, it's only a fraction of the total usage. A lot of multiplatform software will use the same resources regardless of the OS. Many apps eat RAM for breakfast, doesn't matter if it's content creation or software development. Heck, even smartphones these days have this much RAM or more.

I won't argue, I just won't buy an Apple product in the near future or probably ever at all.

KillingTimeItself ,

buys [insert price] laptop, top of the line, flagship, custom silicon, built ground up to be purpose specific.

Opens final cut pro: crashes

ok...

Retrograde ,
@Retrograde@lemmy.world avatar

Especially paired with Apple's 128GB integrated, non-replaceable storage. Whoops, you installed all of Microsoft Office? Looks like you have no room to save any documents :(

KillingTimeItself ,

ah yes, we can't forget the proprietary non-controller-based drives that use m.2 but aren't actually NVMe drives, they're just flash.

jaschen ,

No way. It isn't NVME?!?!

KillingTimeItself ,

it's NVMe in the sense that it's non-volatile flash, probably even higher quality than most existing NVMe SSDs out there today.

The thing is that it's literally just the flash, on a card with an m.2 pinout that fits into an m.2 slot. It doesn't have a storage controller or any standardized, already-existing method of communication. It's literally a proprietary, non-standard SSD in a standard form factor.

The controller is integrated onto the main silicon die itself; there is no storage controller on the storage itself.

jaschen ,

Yet another reason to never go back to Apple

KillingTimeItself ,

apple, we innovate where no one else does, because for some reason, we like doing that.

jaschen ,

Apple, we innovate where "everyone" else has already done.

Fixed it for you.

KillingTimeItself ,

Apple, we innovate things, sometimes, for reasons.

GlobalMind ,

Same. And I bet you the price will also go up with less ram.
