
DarkThoughts

@DarkThoughts@kbin.social


DarkThoughts ,

"AI" is funny anyway because you're basically gaslighting them the whole time to have them behave as they're supposed to.

DarkThoughts ,

I tried oobabooga and it basically always crashes when I try to generate anything, no matter which model I try. But honestly, as far as I can tell, all the good models require absurd amounts of VRAM, much more than consumer cards have, so you'd need at least a small GPU server farm to host them reliably yourself. Unless, of course, you're fine with practically nonexistent context sizes.
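A rough back-of-the-envelope sketch of why VRAM runs out so quickly. The numbers are illustrative assumptions, not measurements: weights alone take roughly `parameters × bits-per-weight / 8` bytes, before you even count the context cache or framework overhead.

```python
# Rough VRAM estimate for just the model weights.
# Illustrative only: real usage adds activation memory, the context
# (KV) cache, and framework overhead on top of these figures.

def weight_vram_gb(params_billion: float, bits_per_weight: int) -> float:
    """Approximate GB needed to hold the weights alone."""
    # params_billion * 1e9 params * (bits/8) bytes, expressed in GB
    return params_billion * bits_per_weight / 8

CONSUMER_CARD_GB = 24  # assumed high-end consumer GPU for comparison

for params, bits in [(7, 16), (7, 4), (70, 16), (70, 4)]:
    gb = weight_vram_gb(params, bits)
    fits = "fits" if gb <= CONSUMER_CARD_GB else "does NOT fit"
    print(f"{params}B model at {bits}-bit: ~{gb:.1f} GB -> {fits}")
```

Even aggressively quantized, a 70B-class model (~35 GB at 4-bit) overflows a single 24 GB consumer card, which is why the big models end up on multi-GPU servers.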

DarkThoughts ,

The bots (the actual girlfriends or whatever other characters) aren't the problem. You can find them on chub.ai, for example, or write them yourself fairly easily. The issue is the software, and even more so the hardware. You need something like the mentioned KoboldCpp or oobabooga, and then you also need a trained LLM model, which you can get on huggingface.co and load within Kobold or oobabooga; that's already where it gets complicated. You also need to understand how they work with regard to context sizes, because they need a lot, and I mean A LOT, of VRAM to work properly. Basically, the more VRAM you have, the larger the context size, i.e. the bot's memory, can be. Otherwise you end up with a bot that can only contextualize the last couple of messages.

For paid services like novelai.net, your bots run on big server farms with lots of GPUs that bundle their VRAM and processing power, giving you "decent" context sizes (imo the greatest weak point of LLMs, and one deeply rooted in how they work) at decent speed. NovelAI also supports front-ends like SillyTavern, which is great for local bot management and settings, regardless of whether you self-host or use a paid service (NOT EVERY PAID SERVICE HAS AN API FOR THIS! OpenAI's ChatGPT technically does too, but they don't allow NSFW content and can ban you for it if caught).

There's a bunch of "free" online services too, like janitorai.com, but most of them have slow speeds, and the chat degrades significantly after just a few messages because they have low context sizes. The better / paid models suffer from this degradation too, but more slowly and less noticeably, at least at first. You can use those to get an idea of how LLMs work, though.
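To put "more context needs more VRAM" in numbers: a minimal sketch of the context (KV) cache math, assuming a generic 7B-class transformer config. The layer count, head count, and head dimension below are illustrative assumptions; real models vary, and techniques like grouped-query attention shrink this considerably.

```python
# Every token held in context stores a key and a value vector per layer
# (the "KV cache"), so cache size grows linearly with context length.

def kv_cache_bytes(n_layers: int, n_kv_heads: int, head_dim: int,
                   context_len: int, bytes_per_elem: int = 2) -> int:
    """Bytes of KV cache; the leading 2 covers keys AND values."""
    return 2 * n_layers * n_kv_heads * head_dim * context_len * bytes_per_elem

# Assumed 7B-ish config: 32 layers, 32 KV heads, head dim 128, fp16 (2 bytes)
for ctx in (2048, 4096, 8192):
    gib = kv_cache_bytes(32, 32, 128, ctx) / 2**30
    print(f"context {ctx}: ~{gib:.1f} GiB of cache on top of the weights")
```

Doubling the context doubles the cache, which is why low-VRAM setups are stuck with bots that forget everything after a few messages.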

Edit: Should technically be self-explanatory / common sense, but I would advise against sharing ANY personal information through online chat services that could identify you as a person!

DarkThoughts ,

See my other reply for some basic info & pointers.

DarkThoughts ,

You need a little GPU server farm for proper models & context sizes, though. Single consumer GPUs don't have enough VRAM for that.

DarkThoughts ,

I don't know, I'm not a weeb.

DarkThoughts ,

Yeah, I heard they're also very privacy friendly.

DarkThoughts ,

I was being sarcastic. If you pay for a prostitute you might as well pay for an AI service such as novelai.

DarkThoughts ,

Yes, databases (saved on a hard drive). SillyTavern has Smart Context, but that doesn't seem that easy to install, so I have no idea how well it actually works in practice yet.

DarkThoughts ,

The answers to both of those things depend very heavily on the details. I think focusing on their main products is a good thing, but adding AI sounds like one of those likely terrible decisions. We definitely need privacy-friendly & open-source-based AI though, in all areas, so I hope this is Mozilla pushing for something sensible here.

DarkThoughts ,

There's no way it would be running locally.

DarkThoughts ,

All of those could be terrible, to be honest, because AI is a data-tracking vacuum. An AI adblocker or content filter sounds cool at first, but it would mean it reads and analyzes your data, just like the stuff you do with chatbots. Reading your mails? That's basically what Google has done for years with Gmail; that's why they have such a good spam filter. I agree that a chatbot would be kinda useless though, even if privacy-friendly, which in and of itself would be great, but I just don't see the use. This could simply be outsourced to a website.

DarkThoughts ,

Yes, but what would a local model do for you in this case? Chatbots in browsers are typically used as an alternative / more contextualized search engine. For that you need proper access to an index of search results. Most people will also not have enough computing power to make use of any complex chatbot / larger context sizes.

DarkThoughts ,

If they're local they'd be basically useless due to a lack of computing power and potential lack of indexing for a search engine chatbot, so I doubt it. It would also have to be so polished that it wouldn't require further user knowledge / input, and that's just not a thing with any local LLM I've come across. Mozilla can gladly prove me wrong though. I certainly wouldn't mind if they generally can make the whole process of local LLMs easier and more viable.

DarkThoughts ,

And I replied to that comment, without any mouth foaming.

DarkThoughts ,

I honestly have no idea what you're referring to now. I never asked for any ideas.

DarkThoughts ,

Self hosting and then asking this on a tech community is kind of ironic.

DarkThoughts ,

Seriously, this is data representation gore.

DarkThoughts ,

Yep, this was a well-known scheme. They tried to be this hip and overpriced designer brand that people flock to for the name, with a bunch of proprietary parts that require their own service centers to maintain them. Complete garbage tier imo. One of the nice things about bicycles is that they're so piss easy to maintain by yourself. The majority of it doesn't even require special tools, and it's a fun experience to learn too. Imagine if brands like this became the norm and suddenly you could barely find bikes you could do that with? You'd always have to pay up to some corporate entity instead. Definitely good riddance.

DarkThoughts ,

If they had anti-clickbait policies, then maybe. Until then I look at their video feed in disgust and close the tab. I really don't need a paid YouTube with all its garbage content.

DarkThoughts ,

> They're creators who sometimes use clickbaity thumbnails because that's just essential to surviving the YouTube algorithm, and the Nebula version usually just takes the same title and thumbnail from YouTube.

Oh. So they're often just double dipping with the same content? That's even worse.
