
LightDelaBlue ,
@LightDelaBlue@lemmy.world avatar

FF no !

buddascrayon ,

LoL, and I had been contemplating switching to Firefox on my phone. Fucking nope! Not gonna board a ship that has decided to follow the pack into the ice...

twoshoes ,

I'm using Chrome on my phone, because it's basically part of the operating system, but I did like Fennec. It's a fork of Firefox mobile with a few more privacy features (or so they advertise).

buddascrayon ,

I'm just using Brave (yes, I know but it doesn't annoy ME so I'm fine with it) which is just Chrome without the constraints.

havokdj ,

Chromium is chrome without the constraints, Brave just has a different master holding the keys.

Not saying brave is bad btw, but chromium itself is literally the master branch for all of these different browsers

buddascrayon ,

I did address that. I am fully aware of Brave's faults. But the benefits to me outweigh the negatives. Besides which, there isn't a browser in existence that doesn't collect data on you. Not unless you go ahead and compile your own. So I choose the one that is blatantly not following the rules and allows me some leeway to enjoy the web the way I like. It's also not the only browser on my phone. Chrome is obviously still here but so is Duckduckgo and I do have Firefox on here even though I never actually use it.

kzhe ,

Cromite? Fennec? Iceraven? Lots of mobile browsers without telemetry on Android.

buddascrayon ,

Cromite and Fennec I've seen, and I don't trust their security at all. Better the devil you know and all that. But Iceraven is new to me; I'll have to look into that one. So thanks

kzhe ,

Just remembered Mull. Have you heard of it?

milicent_bystandr ,

To be fair, I'll take the ship at the back of the ice-kamikaze-pack over the one at the front. More time to jump ship when something better comes by.

cikano ,

So instead you're using a browser by companies already in the ice?

buddascrayon ,

Jumping ship means trying to go in a different direction. I'm not gonna upheave my entire online presence just to get onboard with a company that is not only going the same way, but is woefully behind in the race to go that way.

Coreidan ,

This world blows

A_Random_Idiot ,
@A_Random_Idiot@lemmy.world avatar

You were the Chosen One! It was said that you would destroy the Sith, not join them! Bring balance to the browsers, not leave it in darkness!

witx ,

That's a very good way to get me to leave Firefox behind...

ikidd ,
@ikidd@lemmy.world avatar

If the past is any indication, it'll either be off by default or you'll be able to turn it off. So maybe it isn't all the drama that people make it sound like.

jeeva ,

But it's a hellishly expensive thing that seems to not attract enjoyment from current Firefox users, and seems unlikely to bring new users, and (again) seems to be prioritised over other things that could better use the money, like developers, so...

Why.

frezik ,

To go where, though? Lynx? Everything else is Chromium and that's not much better.

MeanEYE ,
@MeanEYE@lemmy.world avatar

Am smelling a Firefox fork. Though if the AI is anything malicious, you can rest assured the Debian folks would declaw it.

rottingleaf ,

To a Gemini reader. Kristall is nice. Lagrange is ... interesting.

witx ,

Librewolf. If all else fails I'll pop my old Emacs config and browse whichever websites I can there

lemmylem ,

Librewolf is a nice fork of Firefox

kttnpunk ,
@kttnpunk@lemmy.world avatar

Okay, well, I'm ready to write an angry email now. Who's with me? Anybody know the best address?

bigMouthCommie ,
@bigMouthCommie@kolektiva.social avatar

they have a mastodon instance, and many official accounts with job titles.

browse mastodon.social

laverabe ,

paywall

laughterlaughter ,

as Firefox is the only browser that can't trace its lineage back to Apple and WebKit

What a slap on Konqueror's face.

neutron ,

And text only browsers too.

nixcamic ,

I don't really see it this way; it's just marketing. Saying "all other browsers descend from big bad corporate Apple" is scary; saying "all other browsers descend from another open source project" is meh.

ulterno ,
@ulterno@lemmy.kde.social avatar

"It's all KHTML", I heard.

laughterlaughter ,

You have a point. But still, they could have added an "ackshually" footnote or something.

ReveredOxygen ,
@ReveredOxygen@sh.itjust.works avatar

Unfortunately, KHTML was discontinued in 2023 (according to Wikipedia)

laughterlaughter ,

That's quite the bummer. But still. Saying that almost all browsers can trace their lineage to Apple and WebKit is technically correct, but it's just a half-truth, as Apple's WebKit was itself once based on KHTML.

barsoap , (edited )

I mean, yes, no, kinda. Konqueror simply accepted a bunch of downstream patches, including a name change.

...more or less. It could for a long time use all three of KHTML, WebKit (fork of KHTML) and QtWebEngine (Blink wrapped for Qt, that is, a fork of WebKit); they recently removed KHTML support because no one was updating it and it hadn't been the default for ages.

If they hadn't implemented multi-engine support in the past they probably would've switched over to "whatever Qt provides" outright, it's KDE after all. Ultimately they're providing a desktop, not a web browser. Back in the day they did decide to roll their own instead of going with Firefox, but it was never a "throw project resources at it" kind of situation; there were simply KDE people who felt like working on it. Web standards were a lot less involved back around the turn of the millennium, and also new and shiny. Back then people thought that HTML 4.01 Strict and XHTML would be something servers actually would start to output instead of the usual tagsoup.

If you're that kind of person right now I'll point you in the direction of Servo. No, Firefox doesn't use it and it's not a Mozilla project any more, Firefox only included (AFAIK) parallel CSS handling, the rest is still old Gecko.

shotgun_crab ,

What a sad day to be alive. I want to believe nothing bad will happen but this is scary

kylexd ,

but its open source so someone can just fork it

Drewelite ,

I've been trying the Arc browser, which has a bunch of AI shoved into it, and... it's actually kinda nice. I think Firefox COULD possibly not fuck this up. Before you downvote me, I too believe that Firefox would be better off focusing on the core browser experience. And I really hope they have a good solution to AI being all cloud-based right now, like having a lightweight local model. This is why I was glad Arc was trying it, not Firefox.

shotgun_crab ,

I agree, AI can be good for a browser, which is why I still have some hope. We just have to wait and see

raspberriesareyummy ,

oh fuck off Mozilla....

Chakravanti ,

I double that. I'll go so far as to manually remove it from my fucking OS before I install it. Seriously, major Fuck You, Mozilla.

gapbetweenus ,

So, what else is out there? Can the guys making the adblocker just make a browser or something?

atyaz ,

Making a browser from scratch would be very difficult. SerenityOS has one, but idk how usable it is.

Besides Firefox, I guess the least evil options would be something webkit-based. There are a few of those.

gapbetweenus ,

For me Chrome just worked better (Mac), but since adblock is mandatory for me, I switched. I'm afraid that if FF implements AI, my experience will get even worse.

laughterlaughter ,

Apple didn't even build a browser from scratch. They took Konqueror's KHTML engine, then built Webkit with it.

We could do the same with Firefox. As a matter of fact, there are already forks out there, like IceWeasel, Librewolf, Pale Moon, etc.

What's going to happen is that one of those will start rising to the top. That's exactly what happened with Firefox. Mozilla had the Netscape suite, open sourced it, then fucked it up. When they fucked it up, I decided to try one of the alternatives, Phoenix in this case. Then Mozilla decided to drop the Netscape suite and adopted Phoenix, which eventually became Mozilla Firebird, and finally Firefox.

I guess the cycle shall soon repeat.

Xeroxchasechase ,

Whyyyy Mozilla? I want to love you, really, but you won't let me.

Wiz ,

You may be in an abusive relationship with your browser. 💔

normalexit ,

I hate that they are laying people off. I do, however, want to use some machine-learning-powered adblock for those harder-to-block ads. Otherwise, I don't feel like every app needs an AI assistant. It's bad for the Internet generally and for the power grid.

laughterlaughter ,

In theory, that sounds amazing.

In practice, it will most likely need to send the contents of your browser to some third-party server. No, thanks.

(Unless it's crowdsourced, like the first person to visit a page gets dinged, but then the next person just downloads the set of rules instead of uploading content.)
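
A toy sketch of that crowdsourced idea: the first visitor runs the expensive, privacy-sensitive classification and uploads only the resulting block rules; everyone after just downloads them. The `rule_cache` server and `classify_page` detector here are hypothetical stand-ins, not any real adblocker's API:

```python
rule_cache = {}  # hypothetical server side: url -> CSS selectors to block

def classify_page(html):
    # Stand-in for an expensive ML ad detector; here just a dumb heuristic.
    return [".ad-banner"] if "ad-banner" in html else []

def get_block_rules(url, html):
    if url not in rule_cache:        # first visitor "gets dinged"
        rule_cache[url] = classify_page(html)
    return rule_cache[url]           # later visitors: cheap download, no upload

first = get_block_rules("example.com", '<div class="ad-banner"></div>')
second = get_block_rules("example.com", "")  # page content never inspected
print(first, second)  # ['.ad-banner'] ['.ad-banner']
```

Note the second caller's page content is never classified, which is the whole privacy win being described.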

abruptly8951 ,

Privacy preserving federated learning is a thing - essentially you train a local model and send the weight updates back to Google rather than the data itself....but also it's early days so who knows what vulnerabilities may exist
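
The core of federated averaging (the kind of scheme being described) fits in a few lines. This is a toy illustration with made-up scalar "data", not any browser's or Google's actual implementation:

```python
def local_update(global_w, client_data, lr=0.1):
    # One gradient step on a least-squares objective, computed on-device.
    # Only this delta leaves the client; the raw data never does.
    mean = sum(client_data) / len(client_data)
    return -lr * (global_w - mean)

def federated_round(global_w, clients):
    # The server averages the clients' deltas and applies the result.
    deltas = [local_update(global_w, data) for data in clients]
    return global_w + sum(deltas) / len(deltas)

# Five clients, each holding private data centered near 1.0.
clients = [[0.9, 1.1, 1.0], [1.2, 0.8], [1.0, 1.0], [0.7, 1.3], [1.1]]
w = 0.0
for _ in range(200):
    w = federated_round(w, clients)
print(round(w, 2))  # converges to the average of the client means (1.02 here)
```

The vulnerabilities mentioned are real concerns: the deltas themselves can leak information about the training data, which is why schemes like this are usually paired with secure aggregation or differential privacy.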

laughterlaughter ,

Oh interesting!

mellowheat ,

An exclusively locally running, open-source LLM might be a good thing though. In my amazing dreams, that's what they're planning to do.

venoft ,
@venoft@lemmy.world avatar

What if you don't have a decent graphics card? Wait 5 minutes for your URL completion to finish?

gentooer ,

Using an LLM is quite fast, especially if it's optimised to run on normal hardware

cley_faye ,

Decent models are huge; an average one requires 8 GB to be kept in memory (better models require something like 40 to 70 GB), and most currently available engines are extremely slow on a CPU and require dedicated hardware (even a relatively powerful GPU needs a few seconds of "thinking" time). It is unlikely that these requirements can easily be squeezed into current computers; more likely, dedicated hardware will be required.

barsoap ,

I don't think any inference engines have actually been optimised to run on CPUs. You're stuck with 32-bit floats, but OTOH that just means you can do gigantic Winograd transformations with the excess precision, needing far fewer fmuladds in total, and CPUs are better at dealing with the memory access patterns that come with transforming the convolution. Most people have at least around 1 TFLOP of compute in their CPU (e.g. a Ryzen 3600 has that much) that's never seeing the light of day. That's about a fifth of what an RX 570 has; a difference, but not an order of magnitude, and you can run SDXL on that class of card (maybe not the 570, dunno about software support, but a 5500 works, despite AMD's best efforts to cripple ROCm).

Also from what I gather they're more or less doing summarybot for your browsing history, that's not a ChatGPT or Llama-style giant model you can talk with.

Also to all those people complaining: There's already AI in firefox, the translation models are about 17MB per language pair, gzipped.

model_tar_gz , (edited )

ONNX Runtime is actually decently well optimized to run on CPUs, even with large models. However, the simple truth is that there's really no escaping that billion+ parameter models need to be quantized and even pruned heavily to fit in memory and not saturate the CPU cache, so inferences/generations don't take forever. That's a reduction in accuracy, so the quality of the generations isn't great.

There is a lot of really interesting research and development being done right now on smart quantization and pruning. Model serving technologies are improving rapidly too—paged attention is a really cool technique (for transformer based models) for effectively leveraging tensor core hardware—I don’t think that’s supported on CPU yet but it’s probably not that far off.

It's a really active field and there's just as much interest in running huge models on huge hardware as there is big models on small hardware. I recently heard of layerwise inference for CPUs: load each layer of the network into the CPU cache on demand. That's typically a bottleneck operation on GPUs, but CPU memory is so bloody fast that it might actually work fine. I haven't played with it myself, or read the paper all that deeply, so I can't really comment beyond it being an interesting idea.

__matthew__ ,

Sorry, but has anyone in this thread actually tried running local LLMs on CPU? You can easily run a 7B model at varying levels of quantization (e.g. 5-bit) and get a generalized prompt-able LLM. Yeah, of course it's going to take ~4 GB of RAM (which is mem-mapped and paged into memory), but you can easily fine-tune smaller, more specific models (like the translation one mentioned above) and have surprising intelligence at a fraction of the resources.

Take, for example, phi-2 which performs as well as 13B param models but with 2.7B params. Yeah, that's still going to take 1.5GB RAM which Firefox wouldn't reasonably ship, but many lighter weight specialized tasks could easily use something like a fine tuned 0.3B model with quantization.
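
The RAM figures traded back and forth in this thread are easy to sanity-check: quantized weights dominate the footprint, so memory ≈ parameters × bits per weight / 8. A quick sketch of that arithmetic (weights only; real engines also need KV cache and runtime overhead):

```python
def model_ram_gb(params, bits_per_weight):
    """Approximate weight memory in GB; ignores KV cache and activations."""
    return params * bits_per_weight / 8 / 1e9

# 7B parameters at 5-bit quantization -> close to the ~4 GB quoted above
print(f"{model_ram_gb(7e9, 5):.1f} GB")    # 4.4 GB

# phi-2's 2.7B parameters at 4-bit -> near the 1.5 GB figure
print(f"{model_ram_gb(2.7e9, 4):.1f} GB")  # 1.4 GB

# a 70B-class model at 8-bit -> the "40 to 70 GB" range mentioned earlier
print(f"{model_ram_gb(70e9, 8):.1f} GB")   # 70.0 GB
```

So the numbers in both comments are mutually consistent; the disagreement is really about speed and quality, not whether the models fit.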

cley_faye ,

Yes, I did. And yes, it is possible. It's terribly slow in comparison, making it less useful. It very quickly devolves into random mumbling or get stuck in weird loops. It also hogs resources that are actually used by other tasks you may be doing.

I mainly test dev AI solutions, and moving from 1B to 7B models made them vastly more pertinent. And moving from CPU implementation (Ryzen 7 3700X) to GPU (RTX 3080 Ti) made them fast enough to be used as quick completion and immediate suggestion without breaking workflow, in addition to freeing resources for IDE, building tools and the actual software being run, while running it on CPU had multi-seconds delay, which made this use case completely useless.

zwaetschgeraeuber ,

You can run a 7B model on CPU really fast, even on a phone.

raspberriesareyummy ,

yeah, CPU vendors will love the increased sales thanks to an even more resource hogging shitty web browser

Etterra ,

Well, it'll be fun trying to find a replacement that doesn't use anything made by Google, like Opera does.

laughterlaughter ,

Librewolf, Pale Moon and Iceweasel.
