Transcoding an HDR Blu-ray to H.265 filled it up pretty quickly, and I'm about to start dabbling in game development and 3D modeling.
I've also filled it up pretty quickly learning how fast various data structures are in different situations. At least in Python, you don't really see a difference in speed until you get into the billions of items.
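For anyone curious, this is the kind of quick timeit micro-benchmark I mean (the names and sizes are arbitrary, just a sketch comparing a linear scan against a hash lookup; how much the gap matters depends entirely on size and access pattern):

```python
import timeit

# Build one collection of each kind with identical contents.
n = 1_000_000
data_list = list(range(n))
data_set = set(data_list)

# Worst case for the list: the item we look up is near the end,
# so the linear scan has to walk almost the whole thing.
needle = n - 1

list_time = timeit.timeit(lambda: needle in data_list, number=100)
set_time = timeit.timeit(lambda: needle in data_set, number=100)

print(f"list membership: {list_time:.4f}s for 100 lookups")
print(f"set membership:  {set_time:.4f}s for 100 lookups")
```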
For automations and small apps it's fast enough. That's a fair trade-off for the fast turnaround time.
I'm thinking of learning Go or C, though, because I don't care much for the runtime errors. It's no fun using an application for a while only for a typo in a rarely used function to tank the entire app.
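A contrived sketch of the failure mode I mean (all names hypothetical): Python happily imports this, and the typo only blows up when the rarely used code path finally executes. A compiler like Go's or C's would reject the undefined name before the program ever ran.

```python
def rarely_used(order):
    # 'totl' is a typo for 'total' -- the module still imports fine.
    totl = sum(order)
    return total  # NameError, but only when this line actually runs

def main():
    print("app runs fine for days...")
    # ...until something finally calls the rarely used function:
    rarely_used([1, 2, 3])  # NameError: name 'total' is not defined

if __name__ == "__main__":
    main()
```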
I feel like recently developed games and apps expect the user to have a "modern" amount of RAM, meaning the devs don't give a crap about optimizing RAM usage.
I just took a 2011 Core i5 laptop with 6 GB of RAM, reinstalled it with Linux Mint, and put in a 1 TB SSD. Compared to Ubuntu 23.10 on a 750 GB 5400 RPM drive, the difference was night and day.
I ran with 8 GB of RAM for 7 years because zram would compress my swap into what little RAM I had available, and it actually worked well enough that I didn't feel like upgrading until this year lol.
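If you want to check whether your swap is zram-backed, something like this works on Linux (a minimal sketch that just reads /proc/swaps, nothing exotic):

```python
from pathlib import Path

# /proc/swaps lists active swap devices; zram ones show up as /dev/zram*.
# Skip the header line, then look at the device column.
for line in Path("/proc/swaps").read_text().splitlines()[1:]:
    device = line.split()[0]
    kind = "zram (compressed, lives in RAM)" if "zram" in device else "disk-backed"
    print(f"{device}: {kind}")
```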
"Free" memory is actually usually used for cache. So instead of waiting to get data from the disk, the system can just read it directly from RAM after the first access. The more RAM you have, the more free space you'll have to use for cache. My machine often has over 20GB of RAM used as cache. You can see this with free -m. IIRC both Gnome and KDE's system managers also show that now.
In a similar fashion, I got my son's old netbook. It has 32 GB of flash as its storage medium, and 27 GB were in use by Windows, Office, and Firefox; the user files were negligible. Then it ran into problems because it wanted to download an 8 GB update.
Now it runs Kubuntu, which uses about 4GB with LibreOffice and a load of other things.
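If you want to see how close a machine like that is to full, the Python standard library's shutil.disk_usage gives a quick cross-platform answer (the mount point is the only thing you'd change):

```python
import shutil

# On Windows you'd pass e.g. "C:\\" instead of "/".
usage = shutil.disk_usage("/")
gib = 1024 ** 3
print(f"total: {usage.total / gib:.1f} GiB, "
      f"used: {usage.used / gib:.1f} GiB, "
      f"free: {usage.free / gib:.1f} GiB")
```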