"Plex added a paid license for remote streaming, a feature that was previously free. And then Plex decided to also sell personal data — I sure love self-hosted software spying on me."
How is it "self-hosted" if it's "remote streaming?" And if you're hosting it, you can throttle any outgoing traffic you want. Right?
The only other examples are Mattermost and MinIO... which I don't know much about, but again: Aren't you in control of your own host?
This article is lame. How about focusing on back-ends that pretend to support self-hosting but make it difficult by perpetuating massive gaps in its documentation (looking at you, Supabase)?
> How is it "self-hosted" if it's "remote streaming?" And if you're hosting it, you can throttle any outgoing traffic you want. Right?
You host the plex service with your media library. Plex allows you to stream without opening up your firewall to others. Not sure now it works exactly because I never hosted it myself.
> Plex allows you to stream without opening up your firewall to others.
It relies on their hosted services/infrastructure. I avoid Plex for that reason. I just host my media with nginx + indexing enabled. Wireguard for creating the tunnel between the server-client and Kodi as the frontend to view the media (you can add an indexed http server as a media source).
Works great, no transcoding like Plex, but that's less of an issue nowadays when hardware accelerated decoders are common for h264 & h265.
> It relies on their hosted services/infrastructure.
Only if you want it to. Your local Plex server is always available on port 32400 - which can be opened up for others as well. But using Plex’s authentication is more convenient, of course.
Im confused. There are two different streaming things on Plex. They support streaming inside the plex app of content from the usual streaming services, much like Apple TV or your TV’s built in media manager. They also support streaming your collection across the internet to wherever you are. Which is now behind a paywall?
I don't use Plex anymore, but not long before I cancelled my account they starting charging to access someone's library that had been shared with you if the sharing party did not have Plex pass, or something to that effect.
> This article is lame. How about focusing on back-ends that pretend to support self-hosting but make it difficult by perpetuating massive gaps in its documentation (looking at you, Supabase)?
that's one way of enshittifying, but what the article talks about is nonetheless very important.
People rely on projects being open source (or rather: _hosted on github_) as some sort of mark of freedom from shitty features and burdensome monetization.
As the examples illustrate, the pattern of capturing users with a good offering and then subsequently squeezing them for money can very easily be done by open source software with free licenses. The reason for that is that source code being available is not, alone, enough to ensure not getting captured by adversarial interests.
What you ALSO need is people wanting to put in the work to create a parallel fork to continuously keep the enshittification at bay. Someone who rolls a distribution with a massive amount of ever-decaying patches, increasingly large amounts of workarounds, etc. Or, alternatively, a "final release" style fork that enters maintenance mode and only ever backports security vulnerability fixes. Either of those is a huge amount of work and it's not even sure that people will find that fork on their own rather than just assume "things are like that now".
Given that the code's originating corporation can and will eagerly throw whole teams of people at disabling such efforts, the counter-efforts would require the same amount of free labor to be successful - or even larger, given that it's easy to wreck things for the code's originator but it's difficult to fix them for the restoration crew.
This pattern, repeated in many projects over the decades since GPL2 and MIT were produced, displays that merely being free and open source does not create a complete anti enshittification measure for the end user. What is actually necessary is a societal measure, a safety web made up of developers dedicated to conservation of important software, who would be capable of correcting any stupid decisions made by pointy-haired managers. There are some small projects like this (eg Apache, and many more) but they are not all-encompassing and many projects that are important to people are without such a safety net.
So for this reason, eg when people are upset that mattermost limits the messages to 10000, their real quarrel isn't really even with the scorpion, who is known to sting, it is with the lack of there being a social safety net for this particular software. Their efforts would be well spent on rapidly building such a safety network to quickly force the corporation's hand into increasingly more desperate measures, accelerating their endgame and self-induced implosion. Then, after the corpo's greed inevitably makes them eat themselves in full, the software can enter the normal space of FOSS development rather than forever remain this corporate slave-product that is pact-bound to a Delaware LLC by a chain of corporate greed.
Only once any free fork's competition backed by VCs burning their money on a ceremonial heap has been removed can the free version of the software become the central source for all users and therefore become successful, rather than continuously play catch up with a throng of H-2B holders.
Maybe I'm missing something here: the great thing about self-hosting is that you choose if and when you update your back-end software. What's stopping self-hosting admins from simply staying on a known good version and forking that if they so desire?
Where are you seeing that? From what I can tell, the 10k message limit applies to "Mattermost Entry":
> Mattermost Entry gives small, forward-leaning teams a free self-hosted Intelligent Mission Environment to get started on improving their mission-critical secure collaborative workflows. Entry has all features of Enterprise Advanced with the following server-wide limitations and omissions:
I don't really like that "enshittified" is being used here. You could argue that Plex, MinIO or Mattermost is being enshittified, but definitely not self hosting as a whole.
Enshittification also usually implies that switching to an alternative is difficult (usually because creating a competing service is near impossible because you'd have to get users on it). That flaw doesn't really apply to self hosting like it does with centralized social media. You can just switch to Jellyfin or Garage or Zulip. Migration might be a pain, but it's doable.
You can't as easily stop using LinkedIn or GitHub or Facebook, etc.
Same. I have been using Plex for 15 years. For my personal use case, it has not changed, ever. I don't encounter any "enshittification". For my purposes it continues to be exactly what I want, just as it always was.
in the last 5-10yrs...letsencrypt made ssl much easier..and its possible to host on small,cheap arm devices...
yes no more dyndns free accounts... but u can still use afraid or do cf tunnels maybe?
and in some cases nowadays u can get away with
docker-compose up
and some of those things like minio and mattermost are complaints about the free tier or complaints about self hosting? i can't tell
indeed the easiest "self hosting" ever was when ngrok happened.. u could get ur port listening on the internet without a sign up... by just running a single binary without a flag...
Nowadays for self hosted DNS the solution I use is Pihole + Tailscale (for the Pihole DNS anywhere) if I could figure it out in one afternoon it is pretty idiot proof.
It doesn't. There's seemingly no connection between the handful of examples of self-hosting software actually getting worse, and the earlier point about hardware costs.
On-premming your Internet services just seems like an exercise in self-flagellation.
Unless you have a heavy-duty pipe to your prem you're just risking all kinds of headaches, and you're going to have to put your stuff behind Cloudflare anyway and if you're doing that why not use a VPS?
It's just not practical for someone to run a little blog or app that way.
It's not that much headache, and this isn't necessarily about public-facing sites and apps.
Take file storage: Some folks find Google Drive and similar services unpalatable because they can and will scan your content. Setting up Nextcloud or even just using file sharing built into a consumer router is pretty easy.
You don't need to rely on Cloudflare, either. Some routers come with VPN functionality or can have it added.
The self-hosting most people talk about when they talk about self-hosting is very practical.
People are going to start doing this a lot more as agents improve. Most people only need a very small fraction of the features of SaaS, and that fraction is slightly different for everyone, so the economics of companies trying to use features to chase users is bad. Even worse, if you're on SaaS you can't modify the code, which will be crippling, so the whole SaaS model is cooked.
I think co-management is going to be the next paradigm.
Typical whining from this corner of the Internet. Maximalism around being owed any promised feature and ongoing open source development for life isn't compatible with a healthy and consumer-appealing self-hosting market.
I don't understand this fairly sparse "article."
"Plex added a paid license for remote streaming, a feature that was previously free. And then Plex decided to also sell personal data — I sure love self-hosted software spying on me."
How is it "self-hosted" if it's "remote streaming?" And if you're hosting it, you can throttle any outgoing traffic you want. Right?
The only other examples are Mattermost and MinIO... which I don't know much about, but again: Aren't you in control of your own host?
This article is lame. How about focusing on back-ends that pretend to support self-hosting but make it difficult by perpetuating massive gaps in its documentation (looking at you, Supabase)?
> How is it "self-hosted" if it's "remote streaming?" And if you're hosting it, you can throttle any outgoing traffic you want. Right?
You host the Plex service with your media library. Plex allows you to stream without opening up your firewall to others. Not sure how it works exactly, because I've never hosted it myself.
> Plex allows you to stream without opening up your firewall to others.
It relies on their hosted services/infrastructure. I avoid Plex for that reason. I just host my media with nginx + directory indexing enabled, WireGuard for the tunnel between server and client, and Kodi as the frontend to view the media (you can add an indexed HTTP server as a media source).
Works great. There's no transcoding like Plex has, but that's less of an issue nowadays when hardware-accelerated decoders are common for H.264 & H.265.
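As a sketch, the nginx-with-indexing setup described above is roughly one server block; the port, hostname, and media path below are placeholders, not from the comment:

```nginx
# Minimal nginx server block that serves a media directory
# with auto-generated listings that Kodi can browse over the tunnel.
server {
    listen 8080;
    server_name media.example.internal;  # placeholder hostname

    location / {
        root /srv/media;   # placeholder path to your library
        autoindex on;      # generate directory index pages
    }
}
```

In Kodi, the indexed server can then be added as an HTTP(S) media source.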
> It relies on their hosted services/infrastructure.
Only if you want it to. Your local Plex server is always available on port 32400 - which can be opened up for others as well. But using Plex’s authentication is more convenient, of course.
Do you have any recommendations for decoders? I've been using a fire stick for a bit but I wouldn't mind a better alternative.
I'm confused. There are two different streaming things on Plex. They support streaming content from the usual streaming services inside the Plex app, much like Apple TV or your TV's built-in media manager. They also support streaming your own collection across the internet to wherever you are. Which one is now behind a paywall?
I don't use Plex anymore, but not long before I cancelled my account they started charging to access someone's library that had been shared with you if the sharing party did not have Plex Pass, or something to that effect.
> This article is lame. How about focusing on back-ends that pretend to support self-hosting but make it difficult by perpetuating massive gaps in its documentation (looking at you, Supabase)?
that's one way of enshittifying, but what the article talks about is nonetheless very important.
People rely on projects being open source (or rather: _hosted on github_) as some sort of mark of freedom from shitty features and burdensome monetization.
As the examples illustrate, the pattern of capturing users with a good offering and then squeezing them for money can very easily be done by open source software with free licenses. The reason is that source code being available is not, alone, enough to keep users from being captured by adversarial interests.
What you ALSO need is people willing to put in the work to create a parallel fork that continuously keeps the enshittification at bay: someone who rolls a distribution with a massive amount of ever-decaying patches, increasingly large piles of workarounds, etc. Or, alternatively, a "final release"-style fork that enters maintenance mode and only ever backports security fixes. Either of those is a huge amount of work, and it's not even certain that people will find that fork on their own rather than just assume "things are like that now".
Given that the code's originating corporation can and will eagerly throw whole teams of people at disabling such efforts, the counter-efforts would require the same amount of free labor to be successful - or even larger, given that it's easy to wreck things for the code's originator but it's difficult to fix them for the restoration crew.
This pattern, repeated in many projects over the decades since GPL2 and MIT were produced, shows that merely being free and open source is not a complete anti-enshittification measure for the end user. What is actually necessary is a societal measure: a safety net made up of developers dedicated to the conservation of important software, capable of correcting any stupid decisions made by pointy-haired managers. There are some projects like this (e.g. Apache, and many more), but they are not all-encompassing, and many projects that are important to people are without such a safety net.
So for this reason, when people are upset that Mattermost limits messages to 10,000, their real quarrel isn't really with the scorpion, who is known to sting; it is with the lack of a social safety net for this particular software. Their efforts would be well spent on rapidly building such a safety network, to quickly force the corporation's hand into increasingly desperate measures, accelerating their endgame and self-induced implosion. Then, after the corpo's greed inevitably makes them eat themselves in full, the software can enter the normal space of FOSS development rather than forever remain a corporate slave-product pact-bound to a Delaware LLC by a chain of corporate greed.
Only once the free fork's competition, backed by VCs burning their money on a ceremonial heap, has been removed can the free version of the software become the central source for all users, and therefore successful, rather than continuously playing catch-up with a throng of H-2B visa holders.
Maybe I'm missing something here: the great thing about self-hosting is that you choose if and when you update your back-end software. What's stopping self-hosting admins from simply staying on a known good version and forking that if they so desire?
Security updates are often what's stopping them.
You also realistically can't fork things unless multiple people do, and they all stay interested in the fork.
> you choose if and when you update your back-end software
That's what we say it's about. But it's really about open source devs being our slaves forever. Get to work, Mattermost! (whip crack)
Did you read the GitHub issue? These guys are paying customers.
Where are you seeing that? From what I can tell, the 10k message limit applies to "Mattermost Entry":
> Mattermost Entry gives small, forward-leaning teams a free self-hosted Intelligent Mission Environment to get started on improving their mission-critical secure collaborative workflows. Entry has all features of Enterprise Advanced with the following server-wide limitations and omissions:
https://docs.mattermost.com/product-overview/editions-and-of...
If so that is indeed shitty. I thought they were crippling the free tier.
If you’re self-hosting, do you need 128GB of ram?
I suspect you don’t. I suspect a couple of beelinks could run your whole business (minus the GPU needs).
I self-host and generally put 64GB of RAM in servers (DDR3, thankfully). Certain arrangements of Docker-based services simply chew up a lot of RAM.
I run quite a few services with a used Dell Wyse 5070 thin client PC from 2018 with 4GB of ram.
> I suspect you don’t
...today.
If you're self-hosting, do you need 640K of ram?
And you can upgrade in future to match your actual needs instead of wasting money trying to front load costs for no benefit.
You can buy a “lightly used” Dell Optiplex with 8GB of RAM for like $40, which will cover all your self-hosting needs today.
I don't really like that "enshittified" is being used here. You could argue that Plex, MinIO or Mattermost is being enshittified, but definitely not self hosting as a whole.
Enshittification also usually implies that switching to an alternative is difficult (usually because creating a competing service is near impossible because you'd have to get users on it). That flaw doesn't really apply to self hosting like it does with centralized social media. You can just switch to Jellyfin or Garage or Zulip. Migration might be a pain, but it's doable.
You can't as easily stop using LinkedIn or GitHub or Facebook, etc.
Same. I have been using Plex for 15 years. For my personal use case, it has not changed, ever. I don't encounter any "enshittification". For my purposes it continues to be exactly what I want, just as it always was.
In the last 5-10 years, Let's Encrypt made SSL much easier, and it's possible to host on small, cheap ARM devices.
Yes, no more free DynDNS accounts... but you can still use afraid.org, or maybe Cloudflare Tunnels?
And in some cases nowadays you can get away with just:
docker-compose up
And for some of those things, like MinIO and Mattermost: are those complaints about the free tier, or complaints about self-hosting? I can't tell.
Indeed, the easiest "self-hosting" ever was when ngrok happened: you could get your port listening on the internet without a sign-up, just by running a single binary without a flag.
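As an illustration of the docker-compose route mentioned above, here is what it might look like for Jellyfin (the Plex alternative other commenters bring up); the host paths are placeholders:

```yaml
# docker-compose.yml for a single self-hosted media server.
# Start it with: docker-compose up -d
services:
  jellyfin:
    image: jellyfin/jellyfin:latest   # official Jellyfin image
    ports:
      - "8096:8096"                   # Jellyfin's default web UI port
    volumes:
      - ./config:/config              # placeholder host paths
      - ./media:/media:ro             # library mounted read-only
    restart: unless-stopped
```

One file, one command, and the server survives reboots via the restart policy.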
Mattermost is infamous crippleware, and if you pay, they charge more than Slack for a worse product. Use Zulip.
Nowadays, for self-hosted DNS, the solution I use is Pi-hole + Tailscale (for the Pi-hole DNS anywhere). If I could figure it out in one afternoon, it's pretty idiot-proof.
2/3 of this article is about DRAM prices. How is that "enshittification" of self-hosting?
Maybe the remaining 1/3 answers the question.
It doesn't. There's seemingly no connection between the handful of examples of self-hosting software actually getting worse, and the earlier point about hardware costs.
This is a year-in-review article. A scattering of topics is the point.
I suppose writing an article title is hard. The article could be about a few different related things: the hardware side and the software side of it.
That’s about all I’ll say though, not my article.
Time marches forward, but instead of progress we go backwards. Expect to write your own software on limited resources like it's 1990 again.
I hate to tell them but everything is being enshittified.
On-premming your Internet services just seems like an exercise in self-flagellation.
Unless you have a heavy-duty pipe to your prem, you're just risking all kinds of headaches. And you're going to have to put your stuff behind Cloudflare anyway, and if you're doing that, why not use a VPS?
It's just not practical for someone to run a little blog or app that way.
It's not that much headache, and this isn't necessarily about public-facing sites and apps.
Take file storage: Some folks find Google Drive and similar services unpalatable because they can and will scan your content. Setting up Nextcloud or even just using file sharing built into a consumer router is pretty easy.
You don't need to rely on Cloudflare, either. Some routers come with VPN functionality or can have it added.
The self-hosting most people talk about when they talk about self-hosting is very practical.
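The VPN approach mentioned above (WireGuard comes up elsewhere in the thread) can be as small as one config file per device; all keys, addresses, and the hostname below are placeholders:

```ini
; /etc/wireguard/wg0.conf on the client device (placeholder values).
[Interface]
PrivateKey = <client-private-key>
Address = 10.0.0.2/32

[Peer]
PublicKey = <server-public-key>
Endpoint = home.example.net:51820   ; your router/server's public address
AllowedIPs = 10.0.0.0/24            ; only route traffic for the home subnet
```

Bring the tunnel up with `wg-quick up wg0` and your home services are reachable without exposing them to the open internet.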
Some of us have a LAN for our offices and TBs of data.
I don’t think you understand what on-premises means.
I don't fully understand the complaints about enshittification of open source permissively licensed software.
If the source code is available for you to fork, modify, and maintain as you see fit, what's the complaining really about?
People are going to start doing this a lot more as agents improve. Most people only need a very small fraction of the features of SaaS, and that fraction is slightly different for everyone, so the economics of companies using features to chase users are bad. Even worse, if you're on SaaS you can't modify the code, which will be crippling, so the whole SaaS model is cooked.
I think co-management is going to be the next paradigm.
What's co-management?
Managed services that you have some ability to modify, to customize or add functionality.
Typical whining from this corner of the Internet. Maximalism around being owed any promised feature and ongoing open source development for life isn't compatible with a healthy and consumer-appealing self-hosting market.