I love RSS. I literally just bought an app on itch.io, and to my surprise the devlog page for it, which lists all the new updates, supports RSS. I love it when that happens. [1]
RSS didn't stick for me until:
1. I decided to quit most social media, so without RSS I would miss stuff I actually care about.
2. I unsubscribed from all news sites. RSS fatigue is a thing. Don't subscribe to sites that make money the more they post. I used to subscribe to Phoronix, the top HN frontpage articles, OSNews, LWN, etc.: bad idea, you don't want to wake up to 50 unread posts per day and get overwhelmed. Now I mostly follow personal blogs, and I have one new post per day to read. Much more manageable and a higher signal-to-noise ratio.
3. https://fetchrss.com/ is genius for everything else that doesn't support RSS. It lets you turn any website into an RSS feed, and the free plan is generous enough for my needs.
I pay for Feedbin, and it's great.
---
1: I wish Firefox still showed an RSS feed icon when a page has one. These days I have to "view-source" and search for feed or atom or rss to tell.
Try https://addons.mozilla.org/en-GB/firefox/addon/feed-preview/
Perfect, thanks!
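In the meantime, the autodiscovery browsers used to do can be scripted: pages advertise their feeds with `<link rel="alternate">` tags. A minimal sketch (assuming the third-party requests and beautifulsoup4 packages; the URL is just an example):

```python
# Feed autodiscovery: look for <link rel="alternate" type="application/rss+xml"> tags,
# which is what browsers used to read to show the feed icon.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

FEED_TYPES = {"application/rss+xml", "application/atom+xml", "application/feed+json"}

def discover_feeds(page_url):
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    feeds = []
    for link in soup.find_all("link", rel="alternate"):
        if link.get("type", "").lower() in FEED_TYPES and link.get("href"):
            feeds.append(urljoin(page_url, link["href"]))  # resolve relative hrefs
    return feeds

if __name__ == "__main__":
    print(discover_feeds("https://example.com/"))  # hypothetical page
```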
I'd suggest giving lenns.io a go (shameless plug). It gives you control over source prioritisation, the number of items per source, and category prioritisation. In the end, you get exactly what you want without being overwhelmed.
This is an app/service that I've built for myself, but it's up for anyone to give it a go and use it.
I think #2 is a great tip. I’ve tried to use my feed reader to segregate by 'frequency' before, but I haven't really given it a full trial—it still feels a bit awkward.
> the top HN frontpage articles
I don't even really understand what the HN feed is. I looked in the FAQ, etc. the other day and couldn't find an explanation. The description ("Links for the intellectually curious, ranked by readers.") is nice PR, but it doesn't tell me what I'm seeing. Is it every post submitted? Every post that made it to the front page? Same, but stayed on the front page for a certain amount of time? Received at least X upvotes? I have no idea...
> I’ve tried to use my feed reader to segregate by 'frequency' before, but I haven't really given it a full trial—it still feels a bit awkward.
I'm in the middle of that myself. I have folders labelled rarely, weekly, frequent and social. Rarely and weekly I tend to read most of it, as they are the folders I open first. I only open frequent once I'm done with the others and I usually scroll through the titles and only read very few articles. Social is for mastodon and bluesky accounts, which I open when I only have 5 minutes to kill and I know I won't have time to finish reading long posts/articles.
I liked Newsblur's approach to this when I was using a firehose (I dropped most of my firehose-like feeds a couple years ago for various reasons including I didn't actually like most of them all that much). Newsblur has Focused versus Disliked and you can "train" all sorts of things to Like or Unlike about an article. You can Like an entire feed, but you can also Like things like specific authors or tags in a feed or words in a headline. Similarly you can use all the same tools to Unlike an article. If an article has more Likes than Unlikes it shows up in a Focused view and if an article has more Unlikes than Likes it shows up only in an "All" view, meaning it disappears from the default Unread view. When you have a limited amount of time you read Focused, when you have more time you read Unread, and if you want to check on spam or topics you dislike you can zoom out to "All" and spot-check feeds for Unliked articles.
Additionally, Newsblur added an automated "Infrequent Site Stories" for things it knows come from feeds that don't update all that frequently. (Which you can use in tandem with Focused view for even less time.)
Point 2 is an important one. I used RSS for years but had to stop using it because I was way too anxious trying to read everything.
I started using it again, but I have a few rules: all feeds refresh only once a week, and any news feed (like Hacker News) that generates too much content is also purged once a week, so I only keep the latest week's articles.
In my mind, my RSS feed is like an old-school weekly magazine. This solves the FOMO of missing something interesting, without me feeling like I need to read something as soon as it's published.
> I used to subscribe to (…) the top HN frontpage articles
https://hnrss.github.io/ lets you subscribe only to posts above a certain number of points, or other metrics.
> These days I have to "view-source" and search for feed or atom or rss to tell.
Doesn’t your feed service/app auto-detect feeds if you just paste the webpage? That’s a common feature.
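For reference, the points threshold mentioned above is just a query parameter on the hnrss URLs. A minimal sketch (assuming the third-party feedparser package; double-check the exact parameter names on hnrss.github.io):

```python
# Subscribe to HN posts only above a points threshold via hnrss.
import feedparser

# e.g. only items that have crossed 150 points (see hnrss.github.io for supported params)
feed = feedparser.parse("https://hnrss.org/newest?points=150")

for entry in feed.entries:
    print(entry.title, "-", entry.link)
```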
> https://hnrss.github.io/ lets you subscribe only to posts above a certain number of points, or other metrics.
Yes, that's the one I used, but I feel it's still too much noise. You don't want a firehose in your RSS feeds.
Just open https://news.ycombinator.com when you want to doomscroll through an almost endless stream of information. RSS doesn't work well with social media, and that's a feature.
"..lets you subscribe only to posts above a certain number of points.."
You prefer to let other people determine what you read?
>You prefer to let other people determine what you read?
There are multiple levels of delegation to the "wisdom of the crowd":
- visiting this HN website by itself is already letting people determine what you read. The stories are submitted by others.
- reading only the front page of just the top-voted 30 stories instead of doom scrolling the additional 1000 is another level of delegation
- inside of each story, only reading a subset of the most upvoted comments is another level
People do all 3 to various degrees because there's limited time to read.
> People do all 3 to various degrees because there's limited time to read.
_By far_, the best posts I have ever seen on this website were often ones that didn't get many comments, or may have even gotten reported.
Considering you define the heuristic, that seems like a daft argument for you to make.
I prefer to also hear other people's opinions about an article as a sort of peer review.
I've recently quit all social media and been wanting to do the same thing. Thank you!
One thing that I did to kickstart my RSS usage again was to revisit each site I was subscribed to and:
- Remove it if it posted more than once a day. I want thoughtful voices, not other people’s aggregation.
- Remove it if it hadn’t posted in the last few years. Some people blog extremely irregularly, but the likelihood is that most blogs that are 5+ years old aren’t coming back.
- Remove it if the overall tone of the blog is too negative.
I then added a bunch of new feeds from people I’m currently actively following on other platforms who are blogging. This was a massive breath of fresh air, that has got me actively engaging with my feed reader for the first time in a few years.
(Related to my second point: I’m not the first person to note this but there’s a real sadness to watching an old and beloved blog nova itself into your feed in a burst of gambling site spam. Better to get out before that happens.)
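If you export your subscriptions as OPML, the "hasn't posted in years" check above can be automated rather than eyeballed. A minimal sketch (assuming the third-party feedparser package and a local subscriptions.opml export; the filename and cutoff are assumptions):

```python
# Flag feeds whose newest entry is older than a cutoff, using an OPML export.
import time
import xml.etree.ElementTree as ET
import feedparser

CUTOFF_YEARS = 5

def feed_urls(opml_path):
    tree = ET.parse(opml_path)
    # OPML stores each subscription as an <outline xmlUrl="..."> element.
    return [o.get("xmlUrl") for o in tree.iter("outline") if o.get("xmlUrl")]

def newest_entry_age_years(url):
    parsed = feedparser.parse(url)
    dates = [e.published_parsed for e in parsed.entries if getattr(e, "published_parsed", None)]
    if not dates:
        return None  # no dated entries; inspect manually
    newest = max(time.mktime(d) for d in dates)
    return (time.time() - newest) / (365.25 * 24 * 3600)

for url in feed_urls("subscriptions.opml"):
    age = newest_entry_age_years(url)
    if age is None:
        print(f"no dated entries, check manually: {url}")
    elif age > CUTOFF_YEARS:
        print(f"probably dead ({age:.1f} years since last post): {url}")
```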
Good tips!
I have two busy feeds (which are my country's equivalent of Reuters or AP) but I keep them in a separate folder and if I wanted, I could exclude them from the main feed. Sources that post way too often can be a burden and it feels like you can never catch up.
> - Remove it if it hadn’t posted in the last few years. Some people blog extremely irregularly, but the likelihood is that most blogs that are 5+ years old aren’t coming back.
This I don't really understand. Following inactive feeds via RSS comes effectively at no cost for you. How does removing them improve the experience?
See my follow-on note at the end of the post, but also it’s just a psychological out-with-the-old-and-in-with-the-new point that marks a change.
It may come at a cost in some SaaS RSS readers (which may allow a limited number of feeds on the free plan, for example).
It can be preventative against spam for when old domains expire/get sold and/or old blog service passwords get hacked.
I think about doing that sort of proactive cleanup sometimes. There's nothing quite as disappointing as seeing an old friend's blog show a new post for the first time in years, only for it to be some spammer who hacked their old password, or an expired-domain squatter who saw the RSS logs and decided to sell advertising on it, or a once-major blog host sold to a Russian oligarch who purged the user database so more Russians could have good usernames (LiveJournal, lol).
Most people complain about the signal-to-noise ratio in news consumption. I believe the issue isn’t the news sources themselves, but rather the lack of a proper RSS application.
A great RSS app should offer a powerful search function. It should support tagging, bookmarking, scoring or point systems, categories, and a "read later" feature, among other things.
You don’t need to eliminate news sources — just use filters and search tools to surface what matters to you.
An ideal RSS reader should also be smart enough to bypass things like Cloudflare and other unnecessary protections that break RSS functionality. Unfortunately, many mobile RSS apps fall short in this regard — and mobile is king these days.
To get something truly useful, you often need to self-host. But most people won’t go that far.
Personally, I self-host my RSS reader. I even built my own client, since I wasn’t aware of KaraKeep (formerly Hoarder) at the time. I’m still using my custom app because it’s now very versatile, and I’m not sure KaraKeep would meet all my needs.
Links:
https://github.com/rumca-js/Django-link-archive - my own project
https://github.com/AboutRSS/ALL-about-RSS - all about RSS
https://rssisawesome.com/
https://rssgizmos.com/
https://github.com/plenaryapp/awesome-rss-feeds
It's strange how all these modern communication methods (blogs, forums, RSS readers) so often fail to have features that were available in Usenet newsreaders 30+ years ago. We had threading, searching, killfiling or scoring, marking posts to save, all pretty common features then. I'm not sure why there isn't more demand for them now.
I agree that most RSS clients lack true power-user features for searching and filtering.
I spent a bit of time proving out an idea to use Bleve indexing to allow scoring each article with weighted keywords but I haven’t had time to work on it lately. I’ll have a look at your links.
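Not Bleve specifically, but the weighted-keyword scoring idea is small enough to sketch. A toy illustration (the keywords, weights, and articles below are made up):

```python
# Score each article against weighted keywords, then read the highest-scoring ones first.
WEIGHTS = {"rss": 3.0, "self-hosted": 2.0, "postgres": 1.5, "crypto": -5.0}  # example weights

def score(text):
    t = text.lower()
    return sum(w for kw, w in WEIGHTS.items() if kw in t)

articles = [
    {"title": "Self-hosted RSS readers compared", "summary": "..."},
    {"title": "Another crypto exchange collapses", "summary": "..."},
]
for a in sorted(articles, key=lambda a: score(a["title"] + " " + a["summary"]), reverse=True):
    print(round(score(a["title"] + " " + a["summary"]), 1), a["title"])
```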
> As Facebook would push for more engagement, some bands would flood their pages with multiple posts per day
The causation is the opposite, and it's the whole problem with chronological feeds, including RSS - chronological feeds incentivise spam-posting; posters compete on quantity to get attention. That's one of the main reasons fb and other sites implemented algorithmic feeds in the first place. If you take away the time component, posters compete on quality instead.
> The story we are sold with algorithmic curation is that it adapts to everyone’s taste and interests, but that’s only true until the interests of the advertisers enter the picture.
Yea, exactly, but as emphasized here: The problem is not curation, the problem is the curator. Feed algorithms are important, they solve real problems. I don't think going back to RSS and chronological feeds is the answer.
I'm thinking of something like "algorithm as a service," which would be aligned with your interests and tuned for your personal goals.
> I'm thinking of something like "algorithm as a service," which would be aligned with your interests and tuned for your personal goals.
RSS is just a protocol. You could make a reader now with any algorithm you want that displays feeds.
In fact, I can’t imagine that no one is using the AI boom to say they will build a decentralized Twitter using RSS plus AI for the algorithm.
>RSS is just a protocol. You could make a reader now with any algorithm you want that displays feeds.
Your proposal to filter on the client side where the RSS reader runs can't do what the gp wants: algorithmic suggestions on the _server_ side.
The issue with an AI algorithm applied to client-side RSS is that it's limited to the closed set of items from the particular feed(s) that RSS happened to download, i.e. whatever websites the user pre-defined in the whitelist.
Example of inherent client-side limitation would be how Youtube works:
- a particular Youtube channel about power tools : can use RSS to get a feed of that channel. Then use further customized client filtering (local AI LLM) to ignore any videos that talk about politics instead of tools.
- the Youtube algorithm of suggested and related videos to discover unknown channels or topics : RSS can't subscribe to this so there's no ability to filter on the client side. Either Youtube itself would have to offer a "Suggested Videos as RSS feed" -- which it doesn't -- or -- a 3rd party SaaS website has to constantly scrape millions of Youtube videos and then offer it as an RSS feed. That's not realistic as Google would ban that 3rd-party scraper but let's pretend it was allowed... getting millions of XML records to filter client-side and throw away 99% of it is not ideal. So you're still back to filtering it on the server side to make the RSS feed manageable.
In the "explore-vs-exploit" framework, the "explore" phase is more efficiently accomplished with server-side algorithms. The "exploit" phase is where RSS can be used.
- "explore" : use https://youtube.com and its server-side algorithms to navigate billions of videos to find new topics and content creators. Then add interesting channel to RSS whitelist.
- "exploit" : use RSS to get updates of a particular channel
> Example of inherent client-side limitation would be how Youtube works:
> ...
I thought about this problem a long time ago but never did anything substantive with it. I guess I'll articulate it here, off-the-cuff:
People used to post a "blogroll" (and sometimes an OPML file) to their personal blogs describing feeds they followed. That was one way to do decentralized recommendations, albeit manually since there was no well-known URL convention for publishing OPML files. If there was a well-known URL convention for publishing OPML files a client could build a recommendation graph. That would be neat but would only provide feed-level recommendation. Article-level recommendation would be cooler.
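As a sketch of what a well-known URL convention could enable: fetch each followed blog's blogroll OPML and count how often each feed shows up across them. Note the /.well-known/blogroll.opml path below is purely an assumption for illustration, not an existing convention, and the blog URLs are placeholders (requests is a third-party package):

```python
# Count how many of the blogs you follow also recommend a given feed,
# assuming each blog published its blogroll at a (hypothetical) well-known path.
from collections import Counter
from urllib.parse import urljoin
import xml.etree.ElementTree as ET
import requests

MY_BLOGS = ["https://alice.example/", "https://bob.example/"]  # placeholder blog URLs
WELL_KNOWN = "/.well-known/blogroll.opml"  # hypothetical convention, not a real standard

recommendations = Counter()
for blog in MY_BLOGS:
    try:
        opml = requests.get(urljoin(blog, WELL_KNOWN), timeout=10)
        opml.raise_for_status()
    except requests.RequestException:
        continue  # blog doesn't publish a blogroll; skip it
    root = ET.fromstring(opml.content)
    recommendations.update(o.get("xmlUrl") for o in root.iter("outline") if o.get("xmlUrl"))

# Feeds recommended by the most blogs you already follow float to the top.
for feed_url, count in recommendations.most_common(10):
    print(count, feed_url)
```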
One of the various federated/decentralized/whatever-Bluesky-is "modern" re-implementations of Twitter/NNTP could be used to drive article-level recommendations. I could see my feed reader emitting machine-readable recommendation messages based on ratings I give while browsing articles. I would consume these recommendations from others, and then could have lots of fun weighting recommendations based on social graph, algorithmic summary of the article body, etc.
GGP does express interest in Algorithm-as-a-Service (AaaS), but I don't see why AaaS or server-side anything would be required to have non-chronological feed algorithms. Client-side is perfectly suitable for the near-universal case where feed servers don't overwhelm the client with spam (in which case you remove the offending server from your feed).
To your points about YouTube-style algorithmic discovery, I do agree that that would require the server to do things like you describe. So I think that there could be both client-side and server-side algorithms. In time, who knows? Maybe even some client-server protocol whereby the two could interact.
>, but I don't see why AaaS or server-side anything would be required to have non-chronological feed algorithms.
You assume gp's idea of a "non-chronological" feed means taking the already-small subset downloaded by RSS and running a client-side algorithm on it to re-order it. I'm not debating this point because this scenario is trivial and probably not what the gp is talking about.
I'm saying gp's idea of "non-chronological" feed (where he emphasized "curation is not the problem") means he wants the huge list of interesting but unknown content filtered down into a smaller manageable list that's curated by some ranking/weights.
The only technically feasible way to do curation/filtering algorithm on the unexplored vastness out on the internet -- trillions of pages and petabytes of content -- is on servers. That's the reasonable motivation for why gp wants Algorithm-as-a-Service. The issue is that the companies wealthy enough to run expensive datacenters to do that curation ... want to serve ads.
Maybe you're right about what they meant. I'll not debate that.
I will say that, for my purposes, I would definitely like an RSS reader that has more dynamic feed presentation. Maybe something that could watch and learn my preferences, taking into account engagement, time of day, and any number of other factors.
What's more, with primarily text-oriented articles, the total number of articles can be extremely high before overwhelming the server or the client. And a sufficiently smart client needn't be shy about discarding articles that the user is unlikely to want to read.
Scour lets you add feeds and topics that you’re interested in and then sorts posts by how similar they are to your interests.
It also works well for feeds that are too noisy to read through manually, like HN Newest.
https://scour.ing (I’m the developer)
Heya, author here.
I don't necessarily agree with the statement "chronological feeds incentivises spam-posting, posters compete on quantity to get attention" - if someone spam-posts, I am very likely to unsubscribe. This would be true both for chronological and algo feeds.
>> I'm thinking of something like "algorithm as a service," which would be aligned with your interests and tuned for your personal goals.
Now that is something I would be interested in. I believe some of the RSS aggregators are trying to offer this too, but mostly the SaaS ones, not self-hostable open-source ones.
It's all subjective. There is no clear quantification of X attention consumed = Y value produced. So saying what the algo does is important is like saying astrology is important. Or HN is important ;) At the end of the day most info produced is just entertainment/placebo. 3-inch chimp brains have upper limits on how much they can consume and how many updates to their existing neural net are possible. Since there is nothing signaling these limits to people, people (both producers and consumers of info) live in their own lala land about what their own limits are or when those limits have been crossed; mostly everyone is hallucinating about the value of info.
The UN report on the Attention Economy says 0.05% of info generated is actually consumed. And that was based on a study 10-15 years ago.
I've patched miniflux with a different sorting algorithm that is less preferential to frequent posters. It did change my experience for the better (though my particular patch is likely not to everyone's taste).
It is a bit strange that RSS readers do not compete on that, and are, generally, not flexible in that respect.
Social media targets engagement, which is not a good target. Even a pure chronological sort is better.
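This isn't the commenter's actual miniflux patch, but one way a sort can be made less preferential to frequent posters is to divide a recency weight by the feed's posting rate. A toy sketch under that assumption:

```python
# Order unread items so a feed posting 30 times a day doesn't drown out one posting monthly:
# score = recency weight / (that feed's posts per day).
import time
from collections import Counter

def rank(items):
    """items: list of dicts with 'feed', 'title' and a unix 'timestamp'."""
    per_day = Counter()
    span_days = max(1.0, (time.time() - min(i["timestamp"] for i in items)) / 86400)
    for i in items:
        per_day[i["feed"]] += 1 / span_days

    def score(i):
        age_days = (time.time() - i["timestamp"]) / 86400
        recency = 1 / (1 + age_days)
        return recency / max(per_day[i["feed"]], 0.1)  # penalize chatty feeds

    return sorted(items, key=score, reverse=True)

if __name__ == "__main__":
    now = time.time()
    demo = [{"feed": "chatty-news", "title": f"post {n}", "timestamp": now - n * 600} for n in range(30)]
    demo.append({"feed": "quiet-blog", "title": "monthly essay", "timestamp": now - 2 * 86400})
    for item in rank(demo)[:5]:
        print(item["feed"], "-", item["title"])  # the quiet blog's essay should rank near the top
```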
> I'm thinking of something like "algorithm as a service," which would be aligned with your interests and tuned for your personal goals.
I thought about this back in 2017 (within the context of LinkedIn signal to noise) [1]. I had hoped for a marketplace/app store for algos. For example:
"What if the filter could classify a given post as 30% advertorial 70% editorial, and what if you could set a threshold for seeing posts of up to 25% advertorial but no more?"
and
"What if the filter could identify that you’d already received 25 permutations of essentially the same thing this month, and handle it accordingly."
I think these are good points, and also a reason why I never understood why people wanted Digg and Reddit to supply them with RSS feeds back in the heyday of RSS.
But that's a bullshit excuse, just like with email the answer to spam posting is the person gets un-followed/unsubbed.
When it's an algorithm, the user is incentivized to produce content in order to increase their chances of getting a hit. Secondarily, the loss of visibility increases the value of advertising on the platform. It's a lose-lose for users: first they are forced to use the platform more for fear of missing something, second they have to post more to get any reach. The platform wins on increased engagement, overall content depth, ad revenue, and the ability to sneak in a whole lot of shit the user never was interested in or followed. Facebook & Instagram now are functionally high-powered spam engines.
Interestingly the FT has an article today about a drop in social media usage ( https://www.ft.com/content/a0724dd9-0346-4df3-80f5-d6572c93a... ) - one chart titled "Social media has become less social" shows a 40% decline since 2014 in people using social media to "share my opinion" and "to keep up with my friends". In many ways, what is being referred to as social media has become anti-social media.
An algorithmic email feed would be useless, as would any sort of instant messenger, yet that's exactly what social media turned into. Twitter/X is teetering in that direction. The chronological feed still works and is great. Anyone who posts a lot and doesn't balance out the noise with signal I just unfollow.
I might be wrong about this one, but one outcome of generative AI might be an engagement cliff. Some users will be very susceptible to viewing fake photos and videos for hours (the ones still heavily using FB likely are), but others may just choose to mentally disengage and view everything they see on FB, IG, Tiktok as fake.
> If you take away the time component, posters compete on quality instead.
That is verifiably false simply by looking at the state of social media. What they compete on is engagement bait, and the biggest of them all is rage.
By your logic, social media would be a panacea of quality posts by now, but it’s going to shit with fast-paced lies. Quick dopamine hits prevail, not “quality”.
> I'm thinking of something like "algorithm as a service," which would be aligned with your interests and tuned for your personal goals.
So, another service dedicated to tracking you and mining your data. I can already smell the enshittification.
I meant quality as in the sense of the quality-vs-quantity dilemma, not objective quality. In other words, posters will start optimizing individual posts instead of optimizing their amount.
[edit] and indeed, this only solves the problem of excessive posting; this is just the beginning.
That's nonsense. If the problem truly was spam then the "algorithm" would be a simple and transparent penalty proportional to the frequency of posts. The goal is not that (it's """engagement""") and the algorithm is not that either (it's a turbo-charged skinner box attacking your mind with the might of ten thousand data centres).
> The causation is the opposite, and it's the whole problem with chronological feeds, including RSS - chronological feeds incentivise spam-posting, posters compete on quantity to get attention.
That doesn't make any sense. Quantity might make you more prominent in a unified facebook feed, but an RSS reader will show it like this:
Sam and Fuzzy (5)
Station V3 (128)
They've always displayed that way. You never see one feed mixed into another feed. This problem can't arise in RSS. There is no such incentive. Quantity is a negative thing; when I see that I've missed 128 posts, I'm just going to say "mark all as read" and forget about them. (In fact, I have 174 unread posts in Volokh Conspiracy A† right now. I will not be reading all of those.)
† Volokh Conspiracy is hosted on Reason. Reason provides an official feed at http://reason.com/volokh/atom.xml . But Volokh Conspiracy also provides an independent feed at http://feeds.feedburner.com/volokh/mainfeed . Some of their posts go into one of those feeds, and the rest go into the other. I can't imagine that they do this on purpose, but it is what they do.
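For a split setup like that, the two feeds can at least be stitched back together on the client. A minimal sketch (assuming the third-party feedparser package) that merges the two URLs above and de-duplicates by link:

```python
# Merge the two Volokh Conspiracy feeds mentioned above and drop duplicates by link.
import feedparser

URLS = [
    "http://reason.com/volokh/atom.xml",
    "http://feeds.feedburner.com/volokh/mainfeed",
]

seen, merged = set(), []
for url in URLS:
    for entry in feedparser.parse(url).entries:
        if entry.link in seen:
            continue
        seen.add(entry.link)
        merged.append(entry)

# Newest first; entries without a parsed date sort last.
merged.sort(key=lambda e: e.get("published_parsed") or (), reverse=True)
for e in merged[:20]:
    print(e.get("published", "?"), e.title)
```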
> They've always displayed that way. You never see one feed mixed into another feed. This problem can't arise in RSS.
All readers I know have the option to display all feeds chronologically, or an entire folder of feeds chronologically. In most, that's the default setting when you open the app/page.
I always use it like that. If I'd want to see all new posts from a single author, I might as well just bookmark their blog.
If you bookmark Dave's blog, you have to check his blog every day to see if there's something new, even if Dave only posts monthly. Or you check less often, and sometimes discover a new post long after the discussion in the comments has come and gone.
If you put Dave's blog in your RSS reader, one day "Dave (1)" shows up in your list of unread sources and you can read his new post immediately, and you didn't need to think about Dave's blog any other day.
I could use the "all articles" feed in my RSS reader (TT-RSS), but I would never do such a thing unless all the blogs I follow had similar posting frequencies that would mesh well together, which they don't. I never use the front page of Reddit for the same reason: the busy subs would drown out the ones that get a post a week.
> All readers I know have the option to display all feeds chronologically, or an entire folder of feeds chronologically. In most, that's the default setting when you open the app/page.
The option might exist. It was certainly not the default in mainstream readers in the past and it still isn't now. I never encountered it in Google Reader (as mainstream as it gets), or in Yoleo (highly niche), or in Thunderbird (also as mainstream as it gets).
Whether a bunch of unused projects make something strange the default doesn't really have an impact on the user experience. This is not something you can expect to encounter when using RSS.
> If I'd want to see all new posts from a single author, I might as well just bookmark their blog.
That approach will fail for two obvious reasons:
1. The bookmark is not sensitive to new posts. When there is no new post, you have to check it anyway. When there are several new posts, you're likely to overlook some of them.
2. Checking one bookmark is easy; checking 72 bookmarks is not.
It was the default view in Google Reader, the "All Items" view.
A mix of all feeds, ordered chronologically, is the default view in tt-rss, miniflux, inoreader, feedly, netnewswire, and all RSS readers I've ever seen.
> the act of selling something (such as a newspaper column or television series) for publication or broadcast to multiple newspapers, periodicals, websites, stations, etc.
>> the syndication of news articles and video footage
> This article provides a simple guide to using RSS to syndicate Web content.
Note that this is a guide to creating an RSS feed from the publisher's perspective. It is not possible for two feeds to be displayed together, or at all, on the publisher's end. How do you interpret the verb syndicate?
Yea, dismiss a whole argument based on your specific experience with your specific reader and your specific taste. Not to mention your argument proves the point - you already got their attention even when you didn't read the post, and even shared the name of the blog here. However the feed is arranged, posters who compete on attention will optimize for it and eventually bubble up. That's why "the algorithms" are complicated in practice; you're always fighting against Goodhart's law.
The key with RSS is curation, otherwise it stops being your "controlled feed". FOMO can make you add noisy feeds that basically put out too much information and dwarf the relevant feeds. In my case, I follow Hacker News and Slashdot, which are OK, but I also thought it was a good idea to add "The Verge | All posts" to my feed reader, and I find myself hitting "Mark all as read" continuously. It's not The Verge's fault of course, it's my lack of strategy.
Agreed in general. Though one useful feature in FreshRSS that I've made heavy use of (no idea if there are equivalents in other readers) is the ability to mark new entries in a feed as read immediately upon receiving them. It seemed counterintuitive when I first saw the option, but I've found it's actually quite useful for me. It allows for a separation between true feeds coming from a person, which I typically want to read and take manual action on, and more shotgun feeds that are continuously putting out stuff that I will choose to dip into when I feel like it (the HN front page or Slashdot being examples), without the ever-increasing counter of unread articles to think about.
I just bought a reseller plan from verpex host for $5/month. Can host unlimited domains and bandwidth with WHM. Access everything through cPanel and ftp. SSH on occasion.
I built my own reader because I didn't want unread items to accumulate. It just shows what was published the last X days.
The result is that there is no need for persistent storage, so it's really easy to host. If you're interested, it's here: https://github.com/lukasknuth/briefly
Allows you to check your feeds from multiple devices. For example I usually read from my phone, but sometimes would like to check my feeds from my desktop.
You could just subscribe to the same feeds from multiple devices/apps, but then you have to manually keep track of what's already been read and that will quickly get out of hand.
I ditched RSS feeds more than 10 years ago, but I'm increasingly wanting to go back to them. Thank you for sharing this blog post, it'll help to get me started.
This is the key imho, adapting stuff to yourself / your needs, when and where the GloboHomoCorp allows you to.
E.g. I don't use Twitter directly due to toxicity and overwhelmingness of the central feed (thank you Nikita), and due to, for me, the biggest issue - how shit it is for reading / following individual user feeds - when you find someone who's really interesting, and you don't want to miss posts.
So I use nitter and bookmark each person's profile I find interesting and I have that in a separate folder. Then at my pace, daily or weekly I read through people's posts and can really keep up like God intended me to.
At first it was less engaging than just having Twitter (as it's less addictive), and I've gone back and forth between deleting and using actual Twitter, but due to recent changes and events I've actually come to a place where, through my bookmarks, I discover new profiles / people / interests / niches at an organic pace that I can only compare to how I used to use RSS or the web in the older times. It's quite cool.
I prefer FreeTube for YouTube since it maintains the good parts of YouTube's interface while giving you something to point a backup program at. At least, when it works. There's currently a major blocking bug the devs are aware of.
I love RSS. Like all the old web tech the user is in control. If I like a page/site I'll look for an RSS to keep up to date with it, if one doesn't exist I'll likely forget about it. I'm not signing up for email updates.
I recently made a little RSS feed reader; it's barebones, lives on my machine, and is powered by Python.
I never could get into any of the RSS reader software; it all seemed very happy to put random things in the feed that I didn't care about. A strict timeline of things I want to read is all I want. If there is nothing new there is nothing new, and I'm okay with that.
Something else I HIGHLY recommend is subscribing to YouTube channels via RSS. You ACTUALLY KNOW WHEN SOMETHING IS NEW. I've gone way down on my YouTube rabbit holes since I only engage with it when there's a new video.
Shout out to Blogtrottr[1], which allows you to subscribe to RSS feeds and have the posts sent to you via email. Great service I've been using for years.
The root problem here is that a communication channel full of noise is not valuable - but on the other hand, if you have a very selective channel, then nobody will subscribe, because subscribing requires repeated good interactions.
I am currently working on a [personal use] MagicMirror replacement. One of the things I like about MagicMirror is the RSS newsfeed at the bottom of the screen, so I have been getting into RSS more recently, and really enjoying it. The only "problem" is trying to narrow down all the great content.
One section is "Hacker News People". When I find someone on HN who writes well, I subscribe to their comments in a RSS feed, so I can read everything they write. Very often they comment on a link I don't see on my main HN page, which is useful.
App http://hnapp.com/ converts names to RSS feeds. Example: `author:nickjj`
Social media is the RSS feed and has been for like 15 years. Short-form posts that link to long-form posts. Social posts that link to the content you've published wherever. The change in recent years is people skipping the self-hosting/POS part of POSSE and posting directly on the social media sites, because they were convinced to do that and the social media sites were discouraging users from travelling off-site etc. We just need to get away from using social media sites as the hosts of our content and back to the POS part.
If someone can please figure out how to integrate a purchase/payment system into a similar protocol we would love you forever :)
I would so love to help my many artist/musician friends get set up direct-to-consumer with digital content, subscriptions etc — and with their own shops, that they can run, in whatever funky style.
Patreon and Spotify already implement subscription-based podcasts, and I am positive they use RSS/Atom under the hood. So the tech is already out there, you just need to turn it into a self-hosted solution.
Indeed, Patreon has private feeds for patrons for exclusive content. That's a decent solution but it's platform-specific, which is both a bad thing (not easily used elsewhere) and good (backwards compatible with good old RSS).
I find this to be misguided tech-nostalgia. What you control this way is the way information is brokered to you. It only controls the information reaching you itself to the extent that is reflected in the delivery method.
This is significant if you're a staunch subscriber to the idea that everything, and I really do mean everything, wrong with social and mass media is the "algorithms" (formerly: capitalism, sensationalism, etc.), but I'm not. I find that to be at most half the story.
In the end, you're consuming something someone else produced for you to consume. That's why it's available. So you're relying on that information to be something you don't find inherently objectionable, or at least be filterable in that regard, which is not a given. We consume arbitrary and natural-language content. The most you can do is feed it through AI to pre-digest it for you, which can and will fail in numerous ways. And this is to say nothing about content that wasn't produced and/or didn't reach you.
The reason older technologies felt better wasn't necessarily just because of them per se, but also because of their cultural context. These are interwoven of course, but I wouldn't necessarily trust that reverting back to old technology is what's going to steer back this ship to a better course. I'm afraid this is a lot more like undropping a mug than it is like applying negation.
Author here. I do not necessarily think algorithmic feeds are the only thing wrong with social media, but it's certainly one of the major problems. More so if the platforms don't even allow me to revert to chronological feeds, or make it really user unfriendly.
Of course the cultural context has changed, but I think your view is quite cynical. I do believe that AI could, in theory, be a good steward and curator of news feeds (think Google News), but I haven't seen an implementation that would be open and customizable enough. I do not like the idea that someone could be manipulating what I'm being presented, or what reaches me and what doesn't.
Could you elaborate on why you think this is misguided tech-nostalgia? Most of your arguments seem to be true regardless of how you discover content (RSS, social media, link aggregators, ...)
I think you almost have a point in that you seem to be advocating for something along the lines of unbiased input (questioning the presented information because it was constructed for presentation, suggesting that an AI could somehow assist, presumably to help ground your information in a wider context etc)
I think what you may be missing is the role of trust. There is much to say about that, but in this instance, a nice thing about RSS is that I can trust the algorithm it uses to generate my feed. It is very simple, and I, myself choose the sources it draws from.
With some other systems, this is not the case.
Thank you for almost granting me the capability of having a point. That is very nice of you.
I am not missing the role of trust. I have instead simply had that trust betrayed countless times by now, so I'm seeking a little more. It would be a great first step, but far from the whole journey. And so I'm wary of people mistaking the latter for the former, intentionally or otherwise.
Betrayal of trust is indeed serious, and a hard lesson for many of us. Consider also that progress is made one step at a time, over a long time. While a desire for sudden, wholesale changes is understandable, it may be counterproductive. YMMV
That is not what I'm advocating for, nor are incremental steps something I'm advocating against.
What I'm advocating for is for people to not lose sight of the prize. And what I'm advocating against is misleading claims, which is what I consider the title and the proclaimed motivation of the post to be.
I see now - your issue is with the "controlled feeds of information" part? I am not claiming they are "feeds of controlled information" (which is how you seem to be interpreting it). Of course, all the sources you subscribe to will have their own biases and issues, but you do not lose agency over what you select for consumption. That is the control I am seeking and what I like about RSS.
If you want to discover personal, human-written blogs with valid RSS feeds, check out the directory I'm building: https://minifeed.net/blogs
(it's also a reader of sorts, and has related discovery and full-text search across those feeds and posts, but the page I linked is just a big list of blogs with some recent posts and RSS links).
I use RSS a lot, but it is not without its own difficulties.
My collection of feeds is naturally geared to my own interests and world views. As a result I do find I miss out on some things I should pay attention to. To counter this I include a fact checking site which brings stories I would otherwise miss to my attention. Not ideal, but it works.
This is very much true, and one of the downsides of RSS is that you need to make effort to discover new sources, or make sure that what you're consuming is at least somewhat balanced. However you have no guarantees of the latter when you use algorithmic feeds.
Something atypical about my setup is that I wired up miniflux webhooks to n8n and gotify so that when I click "save" on an entry, I get a notification for it. It's a rudimentary way to setup a "read it later" list.
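A similar "saved entry → push notification" bridge can be built without n8n. A minimal sketch (assuming the Flask and requests packages, a Gotify application token, and that the webhook body is JSON carrying an entry title and URL; the field names and URLs below are assumptions, so adapt them to whatever your reader actually posts):

```python
# Tiny webhook receiver: whatever the feed reader POSTs here gets forwarded to Gotify.
import requests
from flask import Flask, request

GOTIFY_URL = "https://gotify.example.net"   # placeholder Gotify server
GOTIFY_TOKEN = "app-token-goes-here"        # placeholder application token

app = Flask(__name__)

@app.post("/webhook")
def webhook():
    payload = request.get_json(silent=True) or {}
    # Assumed field names; inspect a real payload from your reader and adjust.
    title = payload.get("title", "Saved article")
    url = payload.get("url", "")
    requests.post(
        f"{GOTIFY_URL}/message",
        params={"token": GOTIFY_TOKEN},
        json={"title": title, "message": url, "priority": 5},
        timeout=10,
    )
    return "", 204

if __name__ == "__main__":
    app.run(port=8008)
```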
I love miniflux. Switched from inoreader when they started imposing more and more limits and changing the UI.
With miniflux I have no limits on unread items/feeds. I can set the check frequency to what I want, and I actually don't need the app as the web UI is just brilliant!
I built NewsBlur and it's always interesting to me to hear people talk about the UI as a retro feel, but when I look at the competitors, I feel they aren't dense enough with information. Minimalism has gutted our ability to process more than a bit of text at once and I think that's tragic.
Great post! Indeed, social media platforms optimize for engagement and ad revenue, not user needs.
Feeds are a user right, not a publisher favor. In that spirit: I recently built RSSible - a tiny tool that lets you turn any webpage into an RSS feed via CSS selectors. I've built this for myself; already using it for HN, Product Hunt, tldr.tech, r/science, IMDb latest shows, RubyOnRemote, and many more.
It's still early, but if anyone here is curious to try or test, I'd love feedback. (You can see live demos on the site)
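Not RSSible itself, but the selector-to-feed idea fits in a short script. A toy sketch (assuming the third-party requests and beautifulsoup4 packages; the page URL and CSS selector are placeholders):

```python
# Toy version of the selector-to-feed idea: scrape items off a page with a CSS selector
# and emit a minimal RSS 2.0 document.
from urllib.parse import urljoin
from xml.sax.saxutils import escape
import requests
from bs4 import BeautifulSoup

PAGE = "https://example.com/blog"          # placeholder page
ITEM_SELECTOR = "article h2 a"             # placeholder selector for links to posts

html = requests.get(PAGE, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

items = []
for a in soup.select(ITEM_SELECTOR)[:20]:
    title = a.get_text(strip=True)
    href = urljoin(PAGE, a.get("href", ""))  # resolve relative links
    items.append(f"<item><title>{escape(title)}</title><link>{escape(href)}</link></item>")

rss = (
    '<?xml version="1.0" encoding="UTF-8"?>'
    '<rss version="2.0"><channel>'
    f"<title>Scraped feed for {escape(PAGE)}</title><link>{escape(PAGE)}</link>"
    "<description>Generated from CSS selectors</description>"
    f"{''.join(items)}"
    "</channel></rss>"
)
print(rss)
```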
Reeder Classic. Not a subscription, looks good, works well. But it’s only available for Apple operating systems, and the developer is not responsive to bug reports (fortunately, there aren’t many and they are tiny).
News Explorer is a decent alternative (but same OS restriction). That developer is responsive.
NetNewsWire is a great open-source client on iOS, and good for the Mac too.
Though I feed it via tiny tiny rss with the freshrss api plugin so I can have saner filtering and curation before it gets to the reader.
The 'awesome tiny tiny rss' docker image is a pretty decent setup for getting it running if you have any interest, though personally I ended up rolling my own static binary using frankenphp, somewhat based off their setup. The core dev of tiny tiny rss is, let's just say, a bit opinionated.
I don't use docker, so I stuck with an ancient version of TT-RSS for years. But last week I couldn't get it to play nice with my new FreeBSD system and its updated PHP, so I installed from the git repo with surprisingly little trouble.
I may need to talk to my hosting provider :) Thanks for pointing this out. The site is indeed statically generated (Hugo), so this should not be happening.
As do JSON feeds. But in general “RSS” has become kind of the shorthand for all feeds of this type. Most feed readers support all formats, so the distinction doesn’t really matter that much.
I love RSS. I literally just bought an app on itch.io, and to my surprise the devlog page for it, which lists all the new updates, supports RSS. I love when it happens. [1]
RSS didn't stick for me until:
1. I decided to quit most social media, so without RSS I would miss stuff I actually care about.
2. I unsubscribed to all news sites. RSS fatigue is a thing. Don't subscribe to sites that make money the more they post. I used to subscribe to Phoronix, the top HN frontpage articles, OSNews, LWN, etc.: bad idea, you don't want to wake up to 50 unread posts per day and get overwhelmed. Now I mostly follow personal blogs, and I have one new post per day to read. Much more manageable and higher signal-to-noise ratio.
3. https://fetchrss.com/ is genius for everything else that doesn't support RSS. It allows to turn any website into an RSS feed, and the free plan is generous enough for my needs.
I pay for Feedbin, and it's great.
---
1: I wish Firefox still showed an RSS feed icon when a page has one. These days I have to "view-source" and search for feed or atom or rss to tell.
I'd suggest giving a go to lenns.io (shameless plug). It gives you source prioritisation control + number of items per source control + category prioritisation. In the end, you get exactly what you want without being overwhelmed.
This is an app/service that I've built for myself, but it's up for anyone go give it a go and use it.
Try https://addons.mozilla.org/en-GB/firefox/addon/feed-preview/
Perfect, thanks!
I think #2 is a great tip. I’ve tried to use my feed reader to segregate by 'frequency' before, but I haven't really given it a full trial—it still feels a bit awkward.
> the top HN frontpage articles
I don't even really understand what the HN feed is. I looked in the FAQ, etc. the other day and couldn't find an explanation. The description ("Links for the intellectually curious, ranked by readers.") is nice PR, but it doesn't tell me what I'm seeing. Is it every post submitted? Every post that made it to the front page? Same, but stayed on the front page for a certain amount of time? Received at least X upvotes? I have no idea...
> I’ve tried to use my feed reader to segregate by 'frequency' before, but I haven't really given it a full trial—it still feels a bit awkward.
I'm in the middle of that myself. I have folders labelled rarely, weekly, frequent and social. Rarely and weekly I tend to read most of it, as they are the folders I open first. I only open frequent once I'm done with the others and I usually scroll through the titles and only read very few articles. Social is for mastodon and bluesky accounts, which I open when I only have 5 minutes to kill and I know I won't have time to finish reading long posts/articles.
I liked Newsblur's approach to this when I was using a firehose (I dropped most of my firehose-like feeds a couple years ago for various reasons including I didn't actually like most of them all that much). Newsblur has Focused versus Disliked and you can "train" all sorts of things to Like or Unlike about an article. You can Like an entire feed, but you can also Like things like specific authors or tags in a feed or words in a headline. Similarly you can use all the same tools to Unlike an article. If an article has more Likes than Unlikes it shows up in a Focused view and if an article has more Unlikes than Likes it shows up only in an "All" view, meaning it disappears from the default Unread view. When you have a limited amount of time you read Focused, when you have more time you read Unread, and if you want to check on spam or topics you dislike you can zoom out to "All" and spot-check feeds for Unliked articles.
Additionally, Newsblur added an automated "Infrequent Site Stories" for things it knows come from feeds that don't update all that frequently. (Which you can use in tandem with Focused view for even less time.)
The point 2 is an important one, I used RSS for years but had to stop using it as I was way too anxious trying to read everything.
I started using again, but I have a few rules: all the feeds only refresh once week; and any news feed (like hackers news) that generates too much content is purged also once a week, so I only have the latest one week articles.
In my mind, my RSS feed for me is like an old school weekly magazine. This solve the FOMO feeling of missing something interesting, but I don’t feel like I need to read something as soon as is published.
> I used to subscribe to (…) the top HN frontpage articles
https://hnrss.github.io/ lets you subscribe only to posts above a certain number of points, or other metrics.
> These days I have to "view-source" and search for feed or atom or rss to tell.
Doesn’t your feed service/app auto-detect feeds if you just paste the webpage? That’s a common feature.
> https://hnrss.github.io/ lets you subscribe only to posts above a certain number of points, or other metrics.
Yes, that's the one I used, but I feel it's still too much noise. You don't want a firehose in your RSS feeds.
Just open https://news.ycombinator.com when you want to doomscroll through an almost endless stream of information. RSS doesn't work well with social media, and that's a feature.
"..lets you subscribe only to posts above a certain number of points.."
You prefer to let other people determine what you read?
>You prefer to let other people determine what you read?
There are multiple levels of delegation to the "wisdom of the crowd":
- visiting this HN website by itself is already letting people determine what you read. The stories are submitted by others.
- reading only the front page of just the top-voted 30 stories instead of doom scrolling the additional 1000 is another level of delegation
- inside of each story, only reading a subset of the most upvoted comments is another level
People do all 3 to various degrees because there's limited time to read.
> People do all 3 to various degrees because there's limited time to read.
_By far_, the best posts I have ever seen to this website were often ones that didn't get many comments, or may have even gotten reported.
Considering you define the heuristic seems like a daft argument for you to make
I prefer to also hear other people’s opinion about an article as a sort of a peer review.
I’ve recently quit all social media and been wanting to do the same thing. Thank you!
One thing that I did to kickstart my RSS usage again was to revisit each site I was subscribed to and:
- Remove it if it posted more than once a day. I want thoughtful voices, not other people’s aggregation.
- Remove it if it hadn’t posted in the last few years. Some people blog extremely irregularly, but the likelihood is that most blogs that are 5+ years old aren’t coming back.
- Remove it if the overall tone of the blog is too negative.
I then added a bunch of new feeds from people I’m currently actively following on other platforms who are blogging. This was a massive breath of fresh air, that has got me actively engaging with my feed reader for the first time in a few years.
(Related to my second point: I’m not the first person to note this but there’s a real sadness to watching an old and beloved blog nova itself into your feed in a burst of gambling site spam. Better to get out before that happens.)
Good tips! I have two busy feeds (which are my country's equivalent of Reuters or AP) but I keep them in a separate folder and if I wanted, I could exclude them from the main feed. Sources that post way too often can be a burden and it feels like you can never catch up.
> - Remove it if it hadn’t posted in the last few years. Some people blog extremely irregularly, but the likelihood is that most blogs that are 5+ years old aren’t coming back.
This I don't really understand. Following inactive feeds via RSS comes effectively at no cost for you. How does removing them improve the experience?
See my follow-on note at the end of the post, but also it’s just a psychological out-with-the-old-and-in-with-the-new point that marks a change.
It may come at a cost in some SaaS RSS readers (which may allow a limited number of feeds on the free plan, for example).
It can be preventative against spam for when old domains expire/get sold and/or old blog service passwords get hacked.
I think about doing that sort of proactive cleanup sometimes. There's nothing quite as disappointing as seeing an old friend's blog show a new post for the first time in years only for it to be some spammer that just hacked their old password or some expired domain squatter saw RSS logs and decided to sell advertising on it or a once major blog host was sold to a Russian oligarch who purged the user database so more Russians could have good usernames (LiveJournal, lol).
Most people complain about the signal-to-noise ratio in news consumption. I believe the issue isn’t the news sources themselves, but rather the lack of a proper RSS application.
A great RSS app should offer a powerful search function. It should support tagging, bookmarking, scoring or point systems, categories, and a "read later" feature, among other things.
You don’t need to eliminate news sources — just use filters and search tools to surface what matters to you.
An ideal RSS reader should also be smart enough to bypass things like Cloudflare and other unnecessary protections that break RSS functionality. Unfortunately, many mobile RSS apps fall short in this regard — and mobile is king these days.
To get something truly useful, you often need to self-host. But most people won’t go that far.
Personally, I self-host my RSS reader. I even built my own client, since I wasn’t aware of KaraKeep (formerly Hoarder) at the time. I’m still using my custom app because it’s now very versatile, and I’m not sure KaraKeep would meet all my needs.
Links:
https://github.com/rumca-js/Django-link-archive - my own project
https://github.com/AboutRSS/ALL-about-RSS - all about RSS
https://rssisawesome.com/
https://rssgizmos.com/
https://github.com/plenaryapp/awesome-rss-feeds
It's strange how all these modern communication methods (blogs, forums, RSS readers) so often fail to have features that were available in Usenet newsreaders 30+ years ago. We had threading, searching, killfiling or scoring, marking posts to save, all pretty common features then. I'm not sure why there isn't more demand for them now.
I agree that most rss clients lack true power user features for searching and filtering.
I spent a bit of time proving out an idea to use Bleve indexing to allow scoring each article with weighted keywords but I haven’t had time to work on it lately. I’ll have a look at your links.
> As Facebook would push for more engagement, some bands would flood their pages with multiple posts per day
The causation is opposite, and it's the whole problem with chronological feeds, including RSS - chronological feeds incentivises spam-posting, posters compete on quantity to get attention. That's one of the main reasons fb and other sites implemented algorithmic feeds in the first place. If you take away the time component, posters compete on quality instead.
> The story we are sold with algorithmic curation is that it adapts to everyone’s taste and interests, but that’s only true until the interests of the advertisers enter the picture.
Yea, exactly, but as emphasized here: The problem is not curation, the problem is the curator. Feed algorithms are important, they solve real problems. I don't think going back to RSS and chronolgical feed is the answer.
I'm thinking of something like "algorithm as a service," which would be aligned with your interests and tuned for your personal goals.
> I'm thinking of something like "algorithm as a service," which would be aligned with your interests and tuned for your personal goals.
RSS is just a protocol. You could make a reader now with any algorithm you want that displays feeds. In fact, I can’t imagine that no one is using the AI boom to say they will build a decentralized Twitter using rss plus ai for the algorithm
>RSS is just a protocol. You could make a reader now with any algorithm you want that displays feeds.
Your proposal to filter on the client side where the RSS reader runs can't do what the gp wants: algorithmic suggestions on the _server_ side.
The issue of an AI algorithm applied to client-side-RSS is it's limited to the closed set items of the particular feed(s) that RSS happened to download of whatever websites the user pre-defined in the white-list.
Example of inherent client-side limitation would be how Youtube works:
- a particular Youtube channel about power tools : can use RSS to get a feed of that channel. Then use further customized client filtering (local AI LLM) to ignore any videos that talk about politics instead of tools.
- the Youtube algorithm of suggested and related videos to discover unknown channels or topics : RSS can't subscribe to this so there's no ability to filter on the client side. Either Youtube itself would have to offer a "Suggested Videos as RSS feed" -- which it doesn't -- or -- a 3rd party SaaS website has to constantly scrape millions of Youtube videos and then offer it as a RSS feed. That's not realistic as Google would ban that 3rd-party scraper but let's pretend it was allowed... getting millions of XML records to filter it client-side to throw away 99% of it is not ideal. So you're still back to filtering it on the server side to make the RSS feed managable.
In the "explore-vs-exploit" framework, the "explore" phase is more efficiently accomplished with server-side algorithms. The "exploit" phase is where RSS can be used.
- "explore" : use https://youtube.com and its server-side algorithms to navigate billions of videos to find new topics and content creators. Then add interesting channel to RSS whitelist.
- "exploit" : use RSS to get updates of a particular channel
> Example of inherent client-side limitation would be how Youtube works: > ...
I thought about this problem a long time ago but never did anything substantive with it. I guess I'll articulate it here, off-the-cuff:
People used to post a "blogroll" (and sometimes an OPML file) to their personal blogs describing feeds they followed. That was one way to do decentralized recommendations, albeit manually since there was no well-known URL convention for publishing OPML files. If there was a well-known URL convention for publishing OPML files a client could build a recommendation graph. That would be neat but would only provide feed-level recommendation. Article-level recommendation would be cooler.
One of the various federated/decentralized/whatever-Bluesky-is "modern" re-implementations of Twitter/NNTP could be used to drive article-level recommendations. I could see my feed reader emitting machine-readable recommendation messages based on ratings I give while browsing articles. I would consume these recommendations from others, and then could have lots of fun weighting recommendations based on social graph, algorithmic summary of the article body, etc.
GGP does express interest in Algorithm-as-a-Service (AaaS), but I don't see why AaaS or server-side anything would be required to have non-chronological feed algorithms. Client-side is perfectly suitable for the near-univeral case where feed servers don't overwhelm the client with spam (in which case you remove the offending server from your feed).
To your points about YouTube-style algorithmic discovery, I do agree that that would require the server to do things like you describe. So I think that there could be both client-side and server-side algorithms. In time, who knows? Maybe even some client-server protocol whereby the two could interact.
>, but I don't see why AaaS or server-side anything would be required to have non-chronological feed algorithms.
You assume gp's idea of "non-chronological" feed means taking the already-small-subset-downloaed-by-RSS and running a client-side algorithm on it to re-order it. I'm not debating this point because this scenario is trivial and probably not what the gp is talking about.
I'm saying gp's idea of "non-chronological" feed (where he emphasized "curation is not the problem") means he wants the huge list of interesting but unknown content filtered down into a smaller manageable list that's curated by some ranking/weights.
The only technically feasible way to do curation/filtering algorithm on the unexplored vastness out on the internet -- trillions of pages and petabytes of content -- is on servers. That's the reasonable motivation for why gp wants Algorithm-as-a-Service. The issue is that the companies wealthy enough to run expensive datacenters to do that curation ... want to serve ads.
Maybe you're right about what they meant. I'll not debate that.
I will say that, for my purposes, I would definitely like an RSS reader that has more dynamic feed presentation. Maybe something that could watch and learn my preferences, taking into account engagement, time of day, and any number of other factors.
What's more, with primarily text-oriented articles, the total number of articles can be extremely high before overwhelming the server or the client. And a sufficiently smart client needn't be shy about discarding articles that the user is unlikely to want to read.
Scour lets you add feeds and topics that you’re interested in and then sorts posts by how similar they are to your interests.
It also works well for feeds that are too noisy to read through manually, like HN Newest.
https://scour.ing (I’m the developer)
Heya, author here.
I don't necessarily agree with the statement "chronological feeds incentivises spam-posting, posters compete on quantity to get attention" - if someone spam-posts, I am very likely to unsubscribe. This would be true both for chronological and algo feeds.
>> I'm thinking of something like "algorithm as a service," which would be aligned with your interests and tuned for your personal goals.
Now that is something I would be interested in. I believe some of the RSS aggregators are trying to offer this too, but mostly the SaaS ones, not self-hostable open-source ones.
It's all subjective. There is no clear quantification of X attention consumed = Y value produced. So saying what the algo does is important is like saying astrology is important. Or HN is important ;) At the end of the day, most info produced is just entertainment/placebo. 3-inch chimp brains have upper limits on how much they can consume and how many updates to their existing neural net are possible. Since there is nothing signaling these limits to people, people (both producers and consumers of info) live in their own lala land about what their own limits are or when those limits have been crossed; mostly everyone is hallucinating about the value of info.
The UN report on the Attention Economy says 0.05% of info generated is actually consumed. And that was based on a study 10-15 years ago.
I've patched miniflux with a different sorting algorithm that is less preferential to frequent posters. It did change my experience for the better (though my particular patch is likely not to everyone's taste).
It is a bit strange that RSS readers do not compete on that, and are, generally, not flexible in that respect.
Social media targets engagement, which is not a good target. Even a pure chronological sort is better.
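My patch isn't exactly this, but a minimal sketch of that kind of ordering looks something like: weight each unread entry by recency, then divide by how many entries its feed has published recently, so prolific feeds can't crowd out quiet ones.

```python
# Minimal sketch of a "less preferential to frequent posters" ordering
# (not my actual miniflux patch): score each unread entry by recency
# divided by how many entries its feed published in the current batch.
from collections import Counter
from datetime import datetime, timezone

def rank_entries(entries):
    """entries: iterable of dicts with 'feed', 'title', 'published' (aware datetime)."""
    now = datetime.now(timezone.utc)
    per_feed = Counter(e["feed"] for e in entries)

    def score(entry):
        age_hours = max((now - entry["published"]).total_seconds() / 3600, 1.0)
        recency = 1.0 / age_hours
        return recency / per_feed[entry["feed"]]  # penalty grows with feed volume

    return sorted(entries, key=score, reverse=True)
```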
> I'm thinking of something like "algorithm as a service," which would be aligned with your interests and tuned for your personal goals.
I thought about this back in 2017 (within the context of LinkedIn signal to noise) [1]. I had hoped for a marketplace/app store for algos. For example:
"What if the filter could classify a given post as 30% advertorial 70% editorial, and what if you could set a threshold for seeing posts of up to 25% advertorial but no more?"
and
"What if the filter could identify that you’d already received 25 permutations of essentially the same thing this month, and handle it accordingly."
[1] https://blog.eutopian.io/building-a-better-linkedin/
I think these are good points, and also a reason why I never understood why people wanted Digg and Reddit to supply them with RSS feeds back in the heyday of RSS.
But that's a bullshit excuse; just like with email, the answer to spam posting is that the person gets unfollowed/unsubscribed.
When it's an algorithm, the user is incentivized to produce more content in order to increase their chances of getting a hit. Secondarily, the loss of visibility increases the value of advertising on the platform. It's a lose-lose for users: first they are forced to use the platform more for fear of missing something, second they have to post more to get any reach. The platform wins on increased engagement, overall content depth, ad revenue, and the ability to sneak in a whole lot of shit the user never was interested in or followed. Facebook and Instagram are now functionally high-powered spam engines.
Interestingly, the FT has an article today about a drop in social media usage ( https://www.ft.com/content/a0724dd9-0346-4df3-80f5-d6572c93a... ) - one chart titled "Social media has become less social" shows a 40% decline since 2014 in people using social media to "share my opinion" and "to keep up with my friends". In many ways, what is being referred to as social media has become anti-social media.
An algorithmic email feed would be useless, as would any sort of instant messenger, yet that's exactly what social media turned into. Twitter/X is teetering in that direction. The chronological feed still works and is great. Anyone who posts a lot and doesn't balance out the noise with signal, I just unfollow.
I might be wrong about this one, but one outcome of generative AI might be an engagement cliff. Some users will be very susceptible to viewing fake photos and videos for hours (the ones still heavily using FB likely are), but others may just choose to mentally disengage and view everything they see on FB, IG, Tiktok as fake.
What's an algorithm?
> If you take away the time component, posters compete on quality instead.
That is verifiably false simply by looking at the state of social media. What they compete on is engagement bait, and the biggest of them all is rage.
By your logic, social media would be a panacea of quality posts by now, but it’s going to shit with fast-paced lies. Quick dopamine hits prevail, not “quality”.
> I'm thinking of something like "algorithm as a service," which would be aligned with your interests and tuned for your personal goals.
So, another service dedicated to tracking you and mining your data. I can already smell the enshittification.
I meant quality in the sense of the quality-vs-quantity dilemma, not objective quality. In other words, posters will start optimizing individual posts instead of optimizing their amount.
[edit] and indeed, this only solves the problem of excessive posting; this is just the beginning.
That's nonsense. If the problem truly was spam then the "algorithm" would be a simple and transparent penalty proportional to the frequency of posts. The goal is not that (it's """engagement""") and the algorithm is not that either (it's a turbo-charged skinner box attacking your mind with the might of ten thousand data centres).
This is not the ONLY problem, it's just the first problem.
> The causation is opposite, and it's the whole problem with chronological feeds, including RSS - chronological feeds incentivises spam-posting, posters compete on quantity to get attention.
That doesn't make any sense. Quantity might make you more prominent in a unified facebook feed, but an RSS reader will show it like this:
They've always displayed that way. You never see one feed mixed into another feed. This problem can't arise in RSS. There is no such incentive. Quantity is a negative thing; when I see that I've missed 128 posts, I'm just going to say "mark all as read" and forget about them. (In fact, I have 174 unread posts in Volokh Conspiracy A† right now. I will not be reading all of those.)

† Volokh Conspiracy is hosted on Reason. Reason provides an official feed at http://reason.com/volokh/atom.xml . But Volokh Conspiracy also provides an independent feed at http://feeds.feedburner.com/volokh/mainfeed . Some of their posts go into one of those feeds, and the rest go into the other. I can't imagine that they do this on purpose, but it is what they do.
> They've always displayed that way. You never see one feed mixed into another feed. This problem can't arise in RSS.
All readers I know have the option to display all feeds chronologically, or an entire folder of feeds chronologically. In most, that's the default setting when you open the app/page.
I always use it like that. If I only wanted to see new posts from a single author, I might as well just bookmark their blog.
If you bookmark Dave's blog, you have to check his blog every day to see if there's something new, even if Dave only posts monthly. Or you check less often, and sometimes discover a new post long after the discussion in the comments has come and gone.
If you put Dave's blog in your RSS reader, one day "Dave (1)" shows up in your list of unread sources and you can read his new post immediately, and you didn't need to think about Dave's blog any other day.
I could use the "all articles" feed in my RSS reader (TT-RSS), but I would never do such a thing unless all the blogs I follow had similar posting frequencies that would mesh well together, which they don't. I never use the front page of Reddit for the same reason: the busy subs would drown out the ones that get a post a week.
> All readers I know have the option to display all feeds chronologically, or an entire folder of feeds chronologically. In most, that's the default setting when you open the app/page.
The option might exist. It was certainly not the default in mainstream readers in the past and it still isn't now. I never encountered it in Google Reader (as mainstream as it gets), or in Yoleo (highly niche), or in Thunderbird (also as mainstream as it gets).
Whether a bunch of unused projects make something strange the default doesn't really have an impact on the user experience. This is not something you can expect to encounter when using RSS.
> If I'd want to see all new posts from a single author, I might as well just bookmark their blog.
That approach will fail for two obvious reasons:
1. The bookmark is not sensitive to new posts. When there is no new post, you have to check it anyway. When there are several new posts, you're likely to overlook some of them.
2. Checking one bookmark is easy; checking 72 bookmarks is not.
> I never encountered it in Google Reader
It was the default view in Google Reader, the "All Items" view.
A mix of all feeds, ordered chronologically, is the default view in tt-rss, miniflux, inoreader, feedly, netnewswire, and all RSS readers I've ever seen.
It's also what "syndication" means.
For your weirdest claim:
> syndication noun
> syn·di·ca·tion
> the act of selling something (such as a newspaper column or television series) for publication or broadcast to multiple newspapers, periodicals, websites, stations, etc.
>> the syndication of news articles and video footage
( https://www.merriam-webster.com/dictionary/syndication )
The "syndication" in RSS refers to distributing the same content to many different readers.
Here's MDN: https://devdoc.net/web/developer.mozilla.org/en-US/docs/RSS/...
> This article provides a simple guide to using RSS to syndicate Web content.
Note that this is a guide to creating an RSS feed from the publisher's perspective. It is not possible for two feeds to be displayed together, or at all, on the publisher's end. How do you interpret the verb syndicate?
Yea, dismiss a whole argument based on your specific experience with your specific reader and your specific taste. Not to mention your argument proves the point - you already got their attention even though you didn't read the posts, and you even shared the name of the blog here. However the feed is arranged, posters who compete for attention will optimize for it and eventually bubble up. That's why "the algorithms" are complicated in practice; you're always fighting against Goodhart's law.
I think this post is the best place to "promote" my open-source pet project
It converts any dynamic website into an RSS feed
It's self-hosted and stateless
https://github.com/Egor3f/rssalchemy
It's not under active development at the moment, but I'll review any pull requests, so it's not abandoned
The demo page isn't working right now, but if there's some activity, I'll bring it back up
Also shout out to https://kill-the-newsletter.com/ for converting email subscriptions to RSS feeds
I don't use it personally but have heard good things! Thanks for mentioning it.
The key with RSS is curation, otherwise it stops being your "controlled feed". FOMO can make you add noisy feeds that put out so much information they dwarf the relevant feeds. In my case, I follow Hacker News and Slashdot, which are OK, but I also thought it was a good idea to add "The Verge | All posts" to my feed reader, and I find myself hitting "Mark all as read" continuously. It's not The Verge's fault of course, it's my lack of strategy.
Agreed in general. Though one useful feature in FreshRSS that I've made heavy use of (no idea if there are equivalents in other readers) is the ability to mark new entries in a feed as read immediately upon receiving them. It seemed counterintuitive when I first saw the option, but I've found it quite useful: it separates the true feeds from a person, which I typically want to read and act on, from the shotgun feeds that continuously put out stuff I only dip into when I feel like it (the HN front page or Slashdot, for example), without the ever-increasing counter of unread articles to think about.
This is a great feature. And I found I can create a similar rule with Inoreader, thanks!
I'm self-hosting FreshRSS [1] with Docker on a Hetzner VM. Fast, clutter-free, has everything I need.
[1] https://freshrss.org/index.html
Edit: typo
I just bought a reseller plan from Verpex hosting for $5/month. It can host unlimited domains and bandwidth with WHM. I access everything through cPanel and FTP, with SSH on occasion.
Installed FreshRSS in one minute with Softaculous.
I built my own reader because I didn't want unread items to accumulate. It just shows what was published in the last X days.
The result is that there is no need for persistent storage, so it's really easy to host. If you're interested, it's here: https://github.com/lukasknuth/briefly
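The actual project isn't this code, but the stateless idea fits in a few lines; a rough Python sketch using the feedparser library (the feed list is a placeholder):

```python
# Minimal sketch of the stateless approach: fetch the feeds on every run
# and show only entries from the last X days, so nothing needs to be stored.
import time
import feedparser  # pip install feedparser

FEEDS = ["https://example.com/feed.xml"]  # placeholder: your subscriptions
DAYS = 3

def recent_entries(feeds=FEEDS, days=DAYS):
    cutoff = time.time() - days * 86400
    items = []
    for url in feeds:
        for entry in feedparser.parse(url).entries:
            published = entry.get("published_parsed") or entry.get("updated_parsed")
            if published and time.mktime(published) >= cutoff:
                items.append((time.mktime(published), entry.get("title", ""), entry.get("link", "")))
    # newest first
    return [(title, link) for _, title, link in sorted(items, reverse=True)]

if __name__ == "__main__":
    for title, link in recent_entries():
        print(f"{title}\n  {link}")
```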
What's the purpose of hosting a whole reader instead of using a local app? Or is it something else/more than a reader?
Allows you to check your feeds from multiple devices. For example I usually read from my phone, but sometimes would like to check my feeds from my desktop.
You could just subscribe to the same feeds from multiple devices/apps, but then you have to manually keep track of what's already been read and that will quickly get out of hand.
Runs 24/7. Also, I already host other things on the server, so there is no added cost.
freshrss is excellent.
Is anyone hooking up RSS feeds to some cheap LLMs to do personalized content algorithms? Seems like a useful open source project
Not personalized yet, but categorized:
https://outerweb.org/explore
LLMs are helping with classification right now but in the backlog is letting them take a list of favorites and creating recommendations.
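Nothing fancy is needed to prototype the recommendation part; here's a rough sketch (not what runs in production) of "rank posts by similarity to a list of favorites/interests", using local sentence embeddings instead of a hosted LLM:

```python
# Sketch only: embed stated interests and post titles, then rank posts by
# cosine similarity to the best-matching interest. Cheap and runs locally.
from sentence_transformers import SentenceTransformer, util  # pip install sentence-transformers

model = SentenceTransformer("all-MiniLM-L6-v2")

def rank_by_interest(interests: list[str], post_titles: list[str]) -> list[tuple[float, str]]:
    interest_vecs = model.encode(interests)
    post_vecs = model.encode(post_titles)
    # similarity of each post to its best-matching interest
    scores = util.cos_sim(post_vecs, interest_vecs).max(dim=1).values
    return sorted(zip(scores.tolist(), post_titles), reverse=True)
```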
I ditched RSS feeds more than 10 years ago, but I'm increasingly wanting to go back to them. Thank you for sharing this blog post, it'll help to get me started.
The best thing about modern RSS is that you can retire entire accounts.
I don't have YouTube or Twitter accounts, I use RSS to subscribe to channels/users/tags.
This is the key imho, adapting stuff to yourself / your needs, when and where the GloboHomoCorp allows you to.
E.g. I don't use Twitter directly due to toxicity and overwhelmingness of the central feed (thank you Nikita), and due to, for me, the biggest issue - how shit it is for reading / following individual user feeds - when you find someone who's really interesting, and you don't want to miss posts.
So I use nitter and bookmark each person's profile I find interesting and I have that in a separate folder. Then at my pace, daily or weekly I read through people's posts and can really keep up like God intended me to.
At first it was less engaging than just having Twitter (as it's less addictive), and I've gone back and forth between deleting and using actual Twitter, but due to recent changes and events I've come to a place where, through my bookmarks, I discover new profiles / people / interests / niches at an organic pace that I can only compare to how I used to use RSS or the web in older times. It's quite cool.
Shhh - they'll hear you!
Surely it will be deleted as soon as some manager remembers it exists.
The day YouTube removes RSS feeds is the day I stop watching entirely.
I prefer FreeTube for YouTube since it maintains the good parts of YouTube's interface while giving you something to point a backup program at. At least, when it works. There's currently a major blocking bug the devs are aware of.
A simple option is to use your Thunderbird mail client to read RSS feeds. Check https://support.mozilla.org/en-US/kb/how-subscribe-news-feed...
I love RSS. Like all the old web tech the user is in control. If I like a page/site I'll look for an RSS to keep up to date with it, if one doesn't exist I'll likely forget about it. I'm not signing up for email updates.
I recently made a little RSS feed reader; it's barebones, lives on my machine, and is powered by Python.
I never could get into any of the RSS reader software; it all seemed very happy to put random things in the feed that I didn't care about. A strict timeline of things I want to read is all I want. If there's nothing new, there's nothing new, and I'm okay with that.
Thanks for the write up and read.
Something else I HIGHLY recommend is subscribing to YouTube channels via RSS. You ACTUALLY KNOW WHEN SOMETHING IS NEW. I've cut way down on my YouTube rabbit holes since I only engage with it when there's a new video.
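For anyone who hasn't set this up: YouTube publishes a plain Atom feed per channel at a stable URL, no API key needed. A tiny sketch (the channel ID below is a placeholder):

```python
# YouTube exposes an Atom feed per channel; the channel ID (starts with
# "UC") can be found in the channel page's source or URL.
import feedparser  # pip install feedparser

def youtube_channel_feed_url(channel_id: str) -> str:
    return f"https://www.youtube.com/feeds/videos.xml?channel_id={channel_id}"

# placeholder channel ID -- substitute a real one
feed = feedparser.parse(youtube_channel_feed_url("UCxxxxxxxxxxxxxxxxxxxxxx"))
for entry in feed.entries:
    print(entry.get("title", ""), entry.get("link", ""))
```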
Shout out to Blogtrottr[1], which allows you to subscribe to RSS feeds and have the posts sent to you via email. Great service I've been using for years.
[1] https://blogtrottr.com
The root problem here is that a communication channel full of noise is not valuable; but on the other hand, if you have a very selective channel, then nobody will subscribe, because subscribing requires repeated good interactions.
I am currently working on a [personal use] MagicMirror replacement. One of the things I like about MagicMirror is the RSS newsfeed at the bottom of the screen, so I have been getting into RSS more recently, and really enjoying it. The only "problem" is trying to narrow down all the great content.
I've paid for Feedly for years and use it daily.
One section is "Hacker News People". When I find someone on HN who writes well, I subscribe to their comments in a RSS feed, so I can read everything they write. Very often they comment on a link I don't see on my main HN page, which is useful.
The app http://hnapp.com/ turns searches into RSS feeds. Example: `author:nickjj`
This is a fantastic idea. Do you mind sharing a copy of your RSS comment feed?
Social media is the RSS feed and has been for like 15 years. Short-form posts that link to long-form posts. Social posts that link to the content you've published wherever. The change in recent years is people skipping the self-hosting/POS part of POSSE and posting directly on the social media sites, because they were convinced to do that and the social media sites were discouraging users from travelling off-site, etc. We just need to get away from using social media sites as the hosts of our content and back to the POS part.
If someone can please figure out how to integrate a purchase/payment system into a similar protocol we would love you forever :)
I would so love to help my many artist/musician friends get set up direct-to-consumer with digital content, subscriptions etc — and with their own shops, that they can run, in whatever funky style.
Patreon and Spotify already implement subscription-based podcasts, and I am positive they use RSS/Atom under the hood. So the tech is already out there, you just need to turn it into a self-hosted solution.
Indeed, Patreon has private feeds for patrons for exclusive content. That's a decent solution but it's platform-specific, which is both a bad thing (not easily used elsewhere) and good (backwards compatible with good old RSS).
I find this to be misguided tech-nostalgia. What you control this way is the way information is brokered to you. It only controls the information reaching you itself to the extent that is reflected in the delivery method.
This is significant if you're a staunch subscriber to the idea that everything, and I really do mean everything, wrong with social and mass media is the "algorithms" (formerly: capitalism, sensationalism, etc.), but I'm not. I find that to be at most half the story.
In the end, you're consuming something someone else produced for you to consume. That's why it's available. So you're relying on that information to be something you don't find inherently objectionable, or at least be filterable in that regard, which is not a given. We consume arbitrary and natural language content. The most you can do is feed it through AI to pre-digest it for you, which can and will fail in numerous ways. And this is to say nothing about content that wasn't produced and/or didn't reach you.
The reason older technologies felt better wasn't necessarily just because of them per se, but also because of their cultural context. These are interwoven of course, but I wouldn't necessarily trust that reverting back to old technology is what's going to steer back this ship to a better course. I'm afraid this is a lot more like undropping a mug than it is like applying negation.
Author here. I do not necessarily think algorithmic feeds are the only thing wrong with social media, but they're certainly one of the major problems. More so if the platforms don't even allow me to revert to chronological feeds, or make doing so really user-unfriendly.
Of course the cultural context has changed, but I think your view is quite cynical. I do believe that AI could, in theory, be a good steward and curator of news feeds (think Google News), but I haven't seen an implementation that would be open and customizable enough. I do not like the idea that someone could be manipulating what I'm being presented, or what reaches me and what doesn't.
Could you elaborate on why you think this is misguided tech-nostalgia? Most of your arguments seem to be true regardless of how you discover content (RSS, social media, link aggregators, ...)
I think you almost have a point, in that you seem to be advocating for something along the lines of unbiased input (questioning the presented information because it was constructed for presentation, suggesting that an AI could somehow assist, presumably to help ground your information in a wider context, etc.).
I think what you may be missing is the role of trust. There is much to say about that, but in this instance, a nice thing about RSS is that I can trust the algorithm it uses to generate my feed. It is very simple, and I, myself choose the sources it draws from. With some other systems, this is not the case.
> I think you almost have a point
Thank you for almost granting me the capability of having a point. That is very nice of you.
I am not missing the role of trust. I have instead simply had that trust betrayed countless times by now, so I'm seeking a little more. It would be a great first step, but far from the whole journey. And so I'm wary of people mistaking the latter for the former, intentionally or otherwise.
Betrayal of trust is indeed serious, and a hard lesson for many of us. Consider also that progress is made one step at a time, over a long time. While a desire for sudden, wholesale changes is understandable, it may be counterproductive. YMMV
That is not what I'm advocating for, nor are incremental steps something I'm advocating against.
What I'm advocating for is for people to not lose sight of the prize. And what I'm advocating against is misleading claims, which is what I consider the title and the proclaimed motivation of the post to be.
I see now - your issue is with the "controlled feeds of information" part? I am not claiming they are "feeds of controlled information" (which is how you seem to be interpreting it). Of course, all the sources you subscribe to will have their own biases and issues, but you do not lose agency over what you select for consumption. That is the control I am seeking and what I like about RSS.
If you want to discover personal, human-written blogs with valid RSS feeds, check out the directory I'm building: https://minifeed.net/blogs
(it's also a reader of sorts, and has related discovery and full-text search across those feeds and posts, but the page I linked is just a big list of blogs with some recent posts and RSS links).
Cool project! I use https://ooh.directory but will check your directory out too.
Love this! Something I've been working on that is a bit similar
https://blognerd.app/?qry=AI+research+llms&type=sites&conten...
you can search for blogs with feeds, and find blogs by semantic similarity
I use RSS a lot, but it is not without its own difficulties.
My collection of feeds is naturally geared to my own interests and world views. As a result I do find I miss out on some things I should pay attention to. To counter this I include a fact checking site which brings stories I would otherwise miss to my attention. Not ideal, but it works.
This is very much true, and one of the downsides of RSS is that you need to make effort to discover new sources, or make sure that what you're consuming is at least somewhat balanced. However you have no guarantees of the latter when you use algorithmic feeds.
Any recommendations for a self-hosted RSS reader with a good companion android app (or at least a decent mobile website)?
Miniflux is lovely, and the mobile website is decent https://miniflux.app/
Another +1 for miniflux.
Something atypical about my setup is that I wired up miniflux webhooks to n8n and gotify so that when I click "save" on an entry, I get a notification for it. It's a rudimentary way to setup a "read it later" list.
Big fan of miniflux with this dark theme https://github.com/catppuccin/miniflux
I love miniflux. Switched from inoreader when they started imposing more and more limits and changing the UI.
With Miniflux I have no limits on unread items/feeds. I can set the check frequency to whatever I want, and I actually don't need an app, as the web UI is just brilliant!
One more vote for miniflux, which I run on a "server" (old laptop with linux...) at my house and access from anywhere thanks to cloudflare tunnels.
We must be related, I'm doing the exact same thing.
Self-hosting freshrss and using CapyReader android app is my go-to solution for this
https://github.com/FreshRSS/FreshRSS
https://github.com/jocmp/capyreader
There are a couple of mobile apps for Miniflux, but even on its own it is reasonably mobile-friendly. I use Miniflutt on my phone.
I was using NewsBlur for a long while up until recently. Supposedly they have a self-hosted version, but I never tried it.
The UI is an acquired taste, I think; it has a very retro feel to it. I was mainly using it because it lets you make nested folders of feeds.
They also have an Android app, and it's available in the F-Droid store as well, if you are into that kind of thing.
I built NewsBlur and it's always interesting to me to hear people talk about the UI as a retro feel, but when I look at the competitors, I feel they aren't dense enough with information. Minimalism has gutted our ability to process more than a bit of text at once and I think that's tragic.
Great post! Indeed, social media platforms optimize for engagement and ad revenue, not user needs.
Feeds are a user right, not a publisher favor. In that spirit: I recently built RSSible - a tiny tool that lets you turn any webpage into an RSS feed via CSS selectors. I've built this for myself; already using it for HN, Product Hunt, tldr.tech, r/science, IMDb latest shows, RubyOnRemote, and many more.
It's still early, but if anyone here is curious to try or test, I'd love feedback. (You can see live demos on the site)
RSSible: https://rssible.hadid.dev/
Github: https://github.com/mhadidg/rssible
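The core technique is simple; here's a rough sketch of the idea (not the actual RSSible code, and the selectors in the usage note are only approximate): fetch a page, pick out items with CSS selectors, and emit a minimal RSS 2.0 document.

```python
# Sketch only: turn a page into RSS via CSS selectors.
import requests                   # pip install requests
from bs4 import BeautifulSoup     # pip install beautifulsoup4
from xml.sax.saxutils import escape

def page_to_rss(page_url: str, item_selector: str, title_selector: str) -> str:
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    items = []
    for node in soup.select(item_selector):
        title = node.select_one(title_selector)
        link = node.select_one("a[href]")
        if title and link:
            items.append(
                f"<item><title>{escape(title.get_text(strip=True))}</title>"
                f"<link>{escape(link['href'])}</link></item>"
            )
    return (
        '<?xml version="1.0"?>'
        f'<rss version="2.0"><channel><title>{escape(page_url)}</title>'
        f'<link>{escape(page_url)}</link>{"".join(items)}</channel></rss>'
    )

# usage (selectors approximate): page_to_rss("https://news.ycombinator.com/", ".athing", ".titleline")
```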
Impossible for me to read in dark mode.
Never meant to support dark mode. Looks like water.css (the CSS lib I'm using) supports it by default. My custom CSS is breaking things.
I'll try to force light mode initially and will definitely fix the dark mode later. Thanks a lot for flagging this!
It should look better now. I fixed the input text color.
I miss RSS too much, I have decided to start using it again today. For those who are in it, what is your favorite client?
Reeder Classic. Not a subscription, looks good, works well. But it’s only available for Apple operating systems, and the developer is not responsive to bug reports (fortunately, there aren’t many and they are tiny).
News Explorer is a decent alternative (but same OS restriction). That developer is responsive.
NetNewsWire, great open source client on ios, good for mac too.
Though I feed it via tiny tiny rss with the freshrss api plugin so I can have saner filtering and curation before it gets to the reader.
The 'awesome tiny tiny rss' Docker image is a pretty decent setup for getting it running if you have any interest, though personally I ended up rolling my own static binary using FrankenPHP, somewhat based off their setup. The core dev of Tiny Tiny RSS is a bit opinionated, let's just say.
I don't use docker, so I stuck with an ancient version of TT-RSS for years. But last week I couldn't get it to play nice with my new FreeBSD system and its updated PHP, so I installed from the git repo with surprisingly little trouble.
This app made me fall in love with RSS after years of trying: https://feeeed.nateparrott.com/
NetNewsWire is a great RSS client and one of the best designed apps, both on iOS and OS X.
Page seems to be hugged to death. Here is an archived version: https://web.archive.org/web/20251003062648/https://blog.burk...
I may need to talk to my hosting provider :) Thanks for pointing this out. The site is indeed statically generated (Hugo), so this should not be happening.
This looks like a static page. How much load do you need for a server hosting a static page to go down?
Reminder that Atom exists:
* https://en.wikipedia.org/wiki/Atom_(web_standard)
As do JSON feeds. But in general “RSS” has become kind of the shorthand for all feeds of this type. Most feed readers support all formats, so the distinction doesn’t really matter that much.
NetNewsWire on macOS