

That is just stupid. How about a slightly more complex markdown.
What I really want is a P2P archive of all the relevant news articles of the last decades, in markdown like Firefox's "reader view". Plus some super advanced LLM-powered text compression, so you can easily store a copy of 20% of them on your PC and share it P2P.
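The core trick behind any such compression is a shared prior: if sender and receiver already "know" the same background model, you only have to transmit the surprising bits. An LLM would be the strongest version of that prior; as a minimal stand-in sketch (my own illustration, not from the post), here is the same idea using zlib's preset-dictionary feature, where a shared dictionary of common phrases plays the role the model would:

```python
import zlib

# Assumed/illustrative: a tiny shared dictionary stands in for the LLM's
# "world knowledge". Both peers must hold the identical prior bytes.
SHARED_DICT = (b"the economic crisis global news article markdown reader view "
               b"internet information archive P2P torrent protocol")

def compress_with_prior(text: bytes, prior: bytes) -> bytes:
    # zdict seeds the compressor with data both sides already have
    c = zlib.compressobj(level=9, zdict=prior)
    return c.compress(text) + c.flush()

def decompress_with_prior(blob: bytes, prior: bytes) -> bytes:
    d = zlib.decompressobj(zdict=prior)
    return d.decompress(blob)

article = b"news article about the global economic crisis, archived as markdown"
blob = compress_with_prior(article, SHARED_DICT)
assert decompress_with_prior(blob, SHARED_DICT) == article
print(len(article), len(blob))
```

An LLM-based scheme would replace the static dictionary with next-token probabilities feeding an arithmetic coder, which is where the "how far can it be pushed" question gets interesting.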
Much of the information on the internet could vanish within months if we face some global economic crisis.
I was thinking of something like the Gemini protocol but a bit more elaborate, and yeah, I'm not sure how far text compression can be pushed. But I think LLMs could be useful and help reach a critical mass where you can download and store tons of articles.
BitTorrent v2 and other official extensions, like BEP 46 ("Updating Torrents Via DHT Mutable Items"), already allow some ways to do this. Like hosting a YouTube channel and updating it with new videos, without any new network protocol. Well, theoretically, since this isn't yet well supported in torrent clients or libraries.
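The key property of those mutable items (per BEP 44, which BEP 46 builds on) is that the DHT address is derived from the publisher's public key, not the content: target = SHA1(pubkey + salt). So subscribers keep polling one fixed target while the publisher signs new payloads with an increasing sequence number. A minimal sketch of just the addressing step (the placeholder key is for illustration only):

```python
import hashlib

# BEP 44-style addressing: a mutable DHT item lives at SHA1(pubkey + salt).
# The address never changes, so the content behind it can be updated forever.
def mutable_target(pubkey: bytes, salt: bytes = b"") -> bytes:
    assert len(pubkey) == 32  # ed25519 public keys are 32 bytes
    return hashlib.sha1(pubkey + salt).digest()

pubkey = bytes(32)  # placeholder key for illustration, not a real keypair
target = mutable_target(pubkey, b"channel:videos")
print(target.hex())  # 20-byte DHT target
```

A real publisher would also ed25519-sign (seq, salt, value) so other nodes only accept updates with a higher seq from the legitimate key.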
I’ve been thinking about how this would work for a while but it’s kind of frying my brain haha. Like a “P2P version control database” that is truly open source: for articles and blog posts, but also for metadata for manhwa, movies, TV, anime, books, etc. Anybody can download, use, share, edit, or fork it without needing to set up some complex server. Something that can’t be taken down or sold, that someone else can just pick up if it’s abandoned, and where you can easily merge different curated versions and additions.
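The "fork and merge without a server" part gets much easier if entries are content-addressed, git-style: each record is keyed by the hash of its own content, so merging two curated forks is just a set union and identical entries deduplicate automatically. A toy sketch of that idea (my own illustration, with made-up records):

```python
import hashlib
import json

# Content addressing: the key IS the hash of the record, so there is no
# central authority handing out IDs and no server needed to merge forks.
def entry_id(record: dict) -> str:
    blob = json.dumps(record, sort_keys=True).encode()  # canonical form
    return hashlib.sha256(blob).hexdigest()

def add(db: dict, record: dict) -> str:
    key = entry_id(record)
    db[key] = record
    return key

def merge(a: dict, b: dict) -> dict:
    # identical content => identical key, so duplicates collapse on union
    return {**a, **b}

fork1, fork2 = {}, {}
add(fork1, {"title": "Some Manhwa", "year": 2021})
add(fork2, {"title": "Some Manhwa", "year": 2021})    # same entry, other fork
add(fork2, {"title": "An Obscure Film", "year": 1973})
merged = merge(fork1, fork2)
print(len(merged))  # 2 -- the duplicate collapsed
```

Conflicting edits to the same item would still need a policy (keep both, prefer newer signature, etc.), but the storage layer itself stays trivially mergeable.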
You’d basically want a “most popular items of the past X time” set that almost everybody downloads, and then the rest of the database split into more and more exotic or obscure items. So everybody has the popular stuff but also hosts some exotic items so they don’t get lost. And it has to be easy to use and install.
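One way to assign the obscure tail without any coordination (an assumed scheme on my part, not something from the post) is rendezvous hashing: each item goes to the k peers with the highest hash(peer, item) score, so every peer can compute the same assignment locally and every item ends up replicated somewhere deterministic:

```python
import hashlib

# Rendezvous (highest-random-weight) hashing: score each peer against an
# item and keep the top k. Deterministic, serverless, and adding/removing
# a peer only reshuffles the items that peer was involved in.
def score(peer: str, item: str) -> int:
    digest = hashlib.sha256(f"{peer}/{item}".encode()).digest()
    return int.from_bytes(digest, "big")

def holders(item: str, peers: list[str], k: int = 2) -> list[str]:
    return sorted(peers, key=lambda p: score(p, item), reverse=True)[:k]

peers = ["alice", "bob", "carol", "dave"]  # hypothetical peer IDs
print(holders("obscure-1973-film-metadata", peers))
```

Popularity weighting could then just raise k for popular items, which is exactly the "everybody has the hits, a few peers have each rarity" shape described above.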
But the whole database has to be compact and compressed enough that you can still easily host it on a normal HDD. In the current times, with economic and political dangers lurking, this would be a crucial bit of IT infrastructure.