AI is going to start writing entire fake research papers and books written by fake authors, just so it can be cited as a source for a high school kid using it to cheat on a 500 word essay.
I'm skeptical given how confidently many recent AI models make wrong claims. Fact-checking seems like a rather poor use case for current AI models, IMO.
This looks less like the LLM making a claim and more like using an LLM to generate a search query and then read through the results to find anything that might relate to the passage being checked.
It leans into the things LLMs are pretty good at (summarizing natural language; constructing queries according to a given pattern; checking through text for content that matches semantically instead of literally) and links directly to a source instead of leaning on the thing that LLMs only pretend to be good at (synthesizing answers).
While I see this as one of the rare nice uses of AI, if the use is just to fact-check some text found on the web, it could also just fetch it from the site instead of using an AI.
The problem arises when a site uses different wording. Wikipedia's search engine isn't that good, so that problem could make the extension fail enough times to stunt retention.
Obviously Wikipedia is not a definitive or 100% accurate source but this sounds like a genuinely positive use of AI to combat misinformation. The people it really needs to reach likely won't use it but it's still a good idea.
a quick web search uses much less power/resources compared to AI inference
Do you have a source for that? Not that I'm doubting you, just curious. I read once that the internet infrastructure required to support a cellphone uses about the same amount of electricity as an average US home.
Thinking about it, I know that LeGoog has yuge data centers to support its search engine. A simple web search is going to hit their massive distributed DB to return answers in subsecond time. Whereas running an LLM (NOT training one, which is admittedly cuckoo bananas energy intensive) would be executed on a single GPU, albeit a hefty one.
So on one hand you'll have a query hitting multiple (comparatively) lightweight machines to look up results, plus all the networking gear in between. On the other, a beefy single-GPU machine.
(All of this is from the perspective of handling a single request, of course. I'm not suggesting that Wikipedia would run this service on only one machine.)
A simple web search is going to hit their massive distributed DB to return answers in subsecond time.
It's going to hit an index, not the actual data, and it's going to return approximate rather than exact results. Tons of engineering has been done around basic search precisely to get more data locality.
Read a blog post a while back (please don't ask me where) comparing Bing vs. Google when Bing started using ChatGPT, and it basically boiled down to "Google has the tech to do it; they don't roll it out because they don't want to eat the electricity bill, and this is MS spending money to get market share". The cost difference between serving a search and having ChatGPT answer a question was something like 10x. It might not stay that way forever, what with beating models down to work in trinary and such (that's not just massive quantisation but also much easier maths: when all you deal with is -1, 0, 1, convolutions don't need much maths; IIRC you can throw out the multiplication unit and work with nothing but shifts and adds).
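The no-multiplication point can be sketched like this (a toy illustration I made up, not any particular model's kernel): with ternary weights, each weight just selects add, subtract, or skip.

```typescript
// Toy sketch: a dot product over ternary weights in {-1, 0, +1}.
// No multiplication needed: each weight selects add, subtract, or skip.
function ternaryDot(weights: Int8Array, activations: number[]): number {
  let acc = 0;
  for (let i = 0; i < weights.length; i++) {
    const w = weights[i];
    if (w === 1) acc += activations[i];        // +1: accumulate
    else if (w === -1) acc -= activations[i];  // -1: subtract
    // 0: skip the term entirely
  }
  return acc;
}

// ternaryDot(Int8Array.of(1, -1, 0, 1), [2, 3, 5, 7]) === 2 - 3 + 7 === 6
```

The shifts-and-adds remark would then apply to scaling, since any remaining scale factors can be constrained to powers of two.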
I prefer a floating button next to the text than having to set my default search engine to Wikipedia or downloading an addon that only adds it to the context menu, lol. I need to complete my unholy trinity of levitating context buttons
Due to Russell's Teapot, I cannot be thoroughly sure of that, at least in article space. However, that does not mean your claim stands, unless you find a mention.
In principle, the media has no reason to cover this, so no mention should exist.
For now. Like I said, they're gauging interest in it for now, and it's currently basically a prototype. So kinda ironically, to support free knowledge, we've gotta inconvene (the proper verb is apparently "inconvenience"? sorry that's just too weird) and grovel to Google a bit. Someone can probably write a bs marketing passage about Google's auspices to supplant free knowledge through providing accessibility to a supernation of usuremongers.
Firefox conversion by hand doesn't seem like it'll be that hard either, thanks to Manifest v3, which removed Chrome's stupid deviation from the standard of using callbacks instead of promises like everyone else.
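For anyone who hasn't touched extensions: the difference looks roughly like this. The mock below is a stand-in I made up to mimic the shape of something like chrome.tabs.query, not the real API.

```typescript
// Stand-in mimicking the callback shape Chrome's pre-MV3 APIs used
// (this mock is an assumption for illustration, not the real extension API).
type Tab = { id: number; active: boolean };

function queryTabsCallback(
  filter: { active: boolean },
  cb: (tabs: Tab[]) => void
): void {
  const all: Tab[] = [{ id: 1, active: true }, { id: 2, active: false }];
  cb(all.filter(t => t.active === filter.active));
}

// The promise shape that MV3 (and Firefox's browser.*) exposes directly;
// with callback-style Chrome you had to wrap it yourself like this:
function queryTabs(filter: { active: boolean }): Promise<Tab[]> {
  return new Promise(resolve => queryTabsCallback(filter, resolve));
}

async function main(): Promise<void> {
  const active = await queryTabs({ active: true });
  console.log(active.length); // 1: only the active mock tab matches
}
main();
```

With both browsers on promises, porting is mostly swapping the chrome namespace for browser (or polyfilling one as the other).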
"Inconvenience" would be the verb for causing an inconvenience. So in the sentence you're going for, "inconvene" would have to be replaced with the passive "be inconvenienced" ("we've gotta be inconvenienced and grovel to google a bit"). I don't believe we have a separate word for "endure an inconvenience", although it seems like the kind of thing some languages might have a single word for. Stylistically I'd probably restructure the sentence to "we've gotta put up with the inconvenience" rather than just using the passive verb, but yeah.
I think you'd most often see this verb in the stock phrase "Sorry to inconvenience you".