This. MFers will have the most top-spec computer and worry about bloat, while I install random shit for fun on my 320GB hard drive (which is also my boot drive) on a Core 2 Duo with 3 GB of RAM that struggles to run Firefox and Thunar at the same time. Also, Cinnamon is the best-running DE on my machine from my testing; XFCE is laggy af, and I'm not even going to mention KDE, bspwm, or any of the others, since they either lag beyond usability (KDE) or straight up crash my computer into a TTY when I try to launch them (bspwm). One massive note: I'm using software rendering, since the GPU in the Core 2 Duo struggles to even draw the boot screen.
I literally have probably a ton of overlapping software from installing the desktop environments, plus other random software (well, not very random: stuff I used on Windows before) that I don't bother googling removal commands for, since apt installed them all as snaps and I never noticed in my first three months of use. Fuck you, Ubuntu, Xubuntu, and all the other derivatives; this shit makes me not want to use Ubuntu ever again (not like I can; my PC is fucked, no other drive is bootable, and I can't even boot an install USB).
Yeah, I'm actually not very into the tiling window manager thing. I tried bspwm just for the sake of trying one, but I've since lost interest. I'll keep it in mind, though, and maybe come back and try it one day.
To be fair, a Core 2 Duo with 3 GB of RAM is the desktop I had in 2009. At some point, do you think it's time to upgrade? No, wait, I think I actually had a Core 2 Quad...
You also control what's being installed on other distros. In fact, other distros split their packages in a much more modular way, which lets you pick and choose what you need granularly. On Arch, the package count is lower because the maintainers don't split stuff up, so you get all the so-called bloat when you install a regular package.
Or you can't buy one if you're not successful enough or you're in the wrong country. For example, in my country the minimum cost of a 1TB SSD is about $85, and a salary of $2,000 is considered very successful, at the upper limit.
For me it's not about the size, it's about the understanding. I'd really like to understand what everything on my system does and why it's there, and that seems impossible with modern systems. Back in the '90s I needed a secure email relay: it had LILO, a kernel, init, getty, bash, vi, a few shell utils (this was before BusyBox), syslogd, and sendmail. I'm not sure any more since it was a long time ago, but I think I even statically linked everything, so there was no libc. I liked that system.
What I find interesting is that no one asks about the quality of the code, nor do they seem concerned about the dependencies, but they do care about the one package/app/program, of any size, that they see and don't immediately know why it's there.
It's not about storage. It's about complexity coming back to bite you: for example, not knowing what caused a problem because multiple programs are stepping on each other's toes.
For me it was a problem of update frequency and how long updates took. Once I got rid of my Flatpaks and moved to stable Firefox, I update once a week instead of daily, and it takes seconds instead of minutes. Probably also solvable with auto-updates.
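For anyone who'd rather keep the Flatpaks, the "auto-updates" route can be a systemd user timer; a rough sketch (the unit names here are made up, and you'd enable it with `systemctl --user enable --now flatpak-update.timer`):

```
# ~/.config/systemd/user/flatpak-update.service  (hypothetical unit name)
[Unit]
Description=Update all Flatpaks non-interactively

[Service]
Type=oneshot
ExecStart=/usr/bin/flatpak update --noninteractive

# ~/.config/systemd/user/flatpak-update.timer
[Unit]
Description=Weekly Flatpak update

[Timer]
OnCalendar=weekly
Persistent=true

[Install]
WantedBy=timers.target
```

`Persistent=true` means a missed weekly run fires at next boot, so a laptop that was asleep still catches up.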
You realize you don't have to back up the actual "bloated" programs, right? Just their configs and any files those programs generate that you'd like to keep.
That's committing the cardinal sin of cherry-picking your backup contents. You may end up forgetting to include things you didn't know you needed until restore time, and you're creating a backup that is cumbersome to restore. Always remember: you should really be creating a restore strategy, not a backup strategy.
As a general rule I always back up the filesystem wholesale, optionally excluding things I'm 100% sure I don't need, and keep multiple copies (dailies and monthlies going some way back), so I always have a complete reference of what my system looked like at a particular point in time. If push comes to shove, I can always revert to a previous state by wiping the filesystem and copying one of the backups back onto it.
Started playing with Arch this week for the first time. Got a pretty good laugh when I realized I had forgotten to install a DHCP client and had to boot the install media again to add networking.
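For what it's worth, the Arch base install already ships systemd-networkd, so one way out of that hole without pulling in a separate DHCP client is a small `.network` file (the interface glob here is a guess for typical wired names; enable with `systemctl enable systemd-networkd`):

```
# /etc/systemd/network/20-wired.network
[Match]
Name=en*

[Network]
DHCP=yes
```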
I appreciate what they're doing and I'm going to keep poking at it, but my first impression is that philosophy is driving and utility is in the back seat.
So just run archinstall. Personally, as a relative newbie, I found Arch a lot easier to deal with than Fedora and Ubuntu, both of which have had me in dependency hell on previous attempts to switch to Linux. Not only that, but I have a much better idea of what makes up my system.
It's definitely a philosophy, and you have to understand the implications. But I'm not sure utility is in the back seat. It's just that you personally own your own config.
Flatpaks have helped me a lot in reducing bloat and avoiding dependency hell.
That said, there are probably some overlapping dependencies where, if I installed things a different way, I could save some space, but in my opinion it's not worth it.
I'm also using rootless podman + systemd for certain services, but that's been a mixed bag compared with plain old Docker or LXC.
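If anyone wants to try the Quadlet flavor of the podman + systemd combo (Podman >= 4.4), a minimal rootless unit looks roughly like this; the file name, image, and port are made-up examples:

```
# ~/.config/containers/systemd/web.container  (hypothetical name)
[Unit]
Description=Rootless nginx via Quadlet

[Container]
Image=docker.io/library/nginx:latest
PublishPort=8080:80

[Install]
WantedBy=default.target
```

After `systemctl --user daemon-reload`, Quadlet generates a `web.service` you can start and enable like any other user unit, which sidesteps the old hand-rolled `podman generate systemd` units.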
Neofetch is unmaintained, btw; fastfetch is a good replacement, for whoever needs that. I wrote my own tool for getting system info, and I like my terminals to have free space.
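"Wrote my own tool" can be as little as a few lines of POSIX shell; which fields to show is obviously a matter of taste, this is just an example:

```shell
#!/bin/sh
# Minimal "fetch": grab a few fields and print them, nothing else.
HOST=$(uname -n)          # hostname
KERNEL=$(uname -sr)       # OS name + kernel release
SHELL_NAME=${SHELL:-unknown}

printf 'host:   %s\n' "$HOST"
printf 'kernel: %s\n' "$KERNEL"
printf 'shell:  %s\n' "$SHELL_NAME"
```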
But at least you don't need a stupidly long argument to start it (I know neither really has support, but sway rubs it in your face even more than usual), because you can't quite be as choosy with laptops as you can with desktops...