The latest nightly builds of Firefox 139 include an experimental web link preview feature which shows (among other things) an AI-generated summary of what a page is purportedly about before you visit it, saving you time, a click, or the need to ‘hear’ a real human voice.
Hate having to read an article to understand what it’s saying and would rather read what an AI says it (potentially) says instead?
This reads like satire.
Maybe because it is; the article later says “Saltiness aside” 😏
Oh! That makes more sense lol
This is the next step after just reading headlines and getting everything wrong.
Hell yeah! My local news sites always go on and on about stupid stuff before getting to the point, which is almost always “we don’t know”.
If it works around paywalls, it could actually be a useful feature.
It reportedly works entirely on your machine (as it’s meant to be privacy-preserving by default), so it will probably see only the data you can see.
Awesome! Now half the sites on the web contain a tiny drop of information buried under paragraphs and paragraphs of GPT-generated text, and now my browser uses the same LLM to reverse-engineer the original information. What could possibly go wrong when LLMs talk to other LLMs and summarize, bloat, summarize, bloat, and then finally summarize every bit of information for us?
Do we actually still make websites for humans, or is it all just AIs and SEO?
A copy-paste of the Arc browser feature. It’s hit-or-miss; it works about 50% of the time.
This feels like Windows Recall…
I wonder if the preview does a pre-fetch which can be identified as such? As in, I wonder if I’d be able to serve garbage to the AI summarizer, but the regular content to normal visitors. Guess I’ll have to check!
Update: It looks like it sends an
X-Firefox-Ai: 1
header. Cool. I can catch that, and deal with it.
I agree with your sentiment. It’s sad that your (or my) website can’t respond with an
X-This-Website-is-not-AI-Garbage
header to indicate “Hey user, you can actually just open this website, read it, and get the info you need without an AI assistant.” I’m pretty sure Firefox’s prefetch request will also not load ads, and so could be seen as a bad scraper.
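For anyone who wants to do the same as the parent: here’s a minimal sketch of catching that header with a tiny Go net/http middleware. The handler names and the replacement text are made up for illustration, and Nightly’s exact header behaviour may change:

```go
package main

import (
	"fmt"
	"log"
	"net/http"
)

// aiPreviewFilter serves an alternate response to any request carrying
// the X-Firefox-Ai header the Nightly preview reportedly sends, and
// passes everything else through to the real handler unchanged.
func aiPreviewFilter(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		// Nightly reportedly sends "X-Firefox-Ai: 1"; match on presence
		// rather than the exact value in case that changes.
		if r.Header.Get("X-Firefox-Ai") != "" {
			w.Header().Set("Content-Type", "text/plain; charset=utf-8")
			fmt.Fprintln(w, "This site is written for humans. Come read it yourself.")
			return
		}
		next.ServeHTTP(w, r)
	})
}

func main() {
	// Stand-in for your real site handler.
	site := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintln(w, "The actual content, for actual people.")
	})
	log.Fatal(http.ListenAndServe(":8080", aiPreviewFilter(site)))
}
```

The same presence check would be a one-liner in most web servers’ configs, and instead of garbage you could just as well serve a stripped-down text version for the summarizer to chew on.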
Definitely won’t be visiting your website then if you intentionally fuck with people to make their browsing experience worse. I hate site operators who are against the free and open internet.
Pray tell, how am I making anyone’s browsing experience worse? I disallow LLM scrapers and AI agents. Human visitors are welcome. You can visit any of my sites with Firefox, even 139 Nightly, and it will Just Work Fine™. It will show garbage if you try to use an AI summary, but AI summaries are garbage anyway, so nothing of value is lost there.
I’m all for a free and open internet, as long as my visitors act respectfully and don’t try to DDoS me from a thousand IP addresses while training on my work without respecting its license. The LLM scrapers and AI agents respect neither my work nor its license, so they get a nice dose of garbage. Coincidentally, banning them greatly reduces the load on my backend, so legit visitors can actually access what they seek instead of finding it crumbling under scraper load.
LLM scrapers? What are you on about? This feature will fetch the page and summarize it locally. It’s not being used for training LLMs. It’s practically like the user opened your website manually and skimmed the content. If your garbage summary doesn’t work, I’ll just copy your site and paste it into ChatGPT to summarize it for me. Pretty much the equivalent of what this is.
AI summaries are garbage anyway, so nothing of value is lost there.
Your ignorance annoys me. It has value to a lot of people, including me, so it’s not garbage. But if you make it garbage intentionally, then everyone will just believe your website is garbage and not click the link after reading the summary.
This feature will fetch the page and summarize it locally. It’s not being used for training LLMs.
And what do you think the local model is trained on?
It’s practically like the user opened your website manually and skimmed the content
It is not. A human visitor will skim through and pick out the parts they’re interested in. A human visitor has intelligence. An AI model does not. An AI model has absolutely no clue what the user is looking for, and it is entirely possible (and frequent) that it discards the important bits and dreams up some bullshit. Yes, even local ones. Yes, I tried, on my own sites. It was bad.
It has value to a lot of people including me so it’s not garbage.
If it does, please don’t come anywhere near my stuff. I don’t share my work only for an AI to throw away half of it and summarize it badly.
But if you make it garbage intentionally then everyone will just believe your website is garbage and not click the link after reading the summary.
If people who prefer AI summaries stop visiting, I’ll consider that a win. I write for humans, not for bots. If someone doesn’t like my style, or finds me too verbose, then my content is not for them, simple as that. And that’s OK, too! I have no intention of appealing to everyone.
Alternatively, you could make your response more useful by removing the UI to aid the AI. After all, the user should be allowed to choose how they navigate the web.
I am doing exactly that. AI turns my work into garbage, so I serve it garbage in the first place; that way it has less work to do. I am helping AI!
I’m also helping AI-using visitors: they will either stop that practice, or stop visiting my stuff. In either case, we’re both better off.