Docker is the antithesis of "bloat".
Docker is horrid for duplication. Unless you use a filesystem with good deduplication, Docker can hurt your storage a lot, and even then dedup can still fail due to already deduplicated extents.
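(By "filesystem with good deduplication" I mean something like btrfs plus an offline deduper. A rough sketch, assuming Docker's overlay2 storage lives on btrfs and duperemove is installed; stop the daemon first to be safe:)

```sh
# Stop Docker so layer files aren't changing underneath the deduper.
sudo systemctl stop docker

# duperemove: -d actually submits the dedupe requests, -r recurses.
# It finds identical extents across layer directories and shares them.
sudo duperemove -dr /var/lib/docker/overlay2

sudo systemctl start docker
```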
WTF? You obviously don't understand Docker at all.
- Docker images provide the absolute minimum environment necessary to run an application. Containers don't have reserved resources, so only what is actually used is consumed. A VM has far more overhead, since a whole machine plus a complete OS are emulated.
- There is not much to deduplicate because there is no redundant storage going on. Only the bare OS essentials plus the app are stored. Some base images (e.g. Alpine Linux) are under 10 MB.
- If your containers themselves are "big", you are doing Docker wrong: you are storing data inside the container instead of externally in volumes or on the host filesystem (see the sketch after this list). That data would be lost with the next image pull.
- No idea what "fail due to already deduplicated extents" is supposed to mean. That does not even make sense.
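To make the volume point concrete, here is a minimal sketch (the image name, script, and paths are all made up for illustration):

```sh
# Tiny image: ~7 MB Alpine base plus a single app layer.
cat > Dockerfile <<'EOF'
FROM alpine:3.19
COPY app.sh /usr/local/bin/app.sh
CMD ["sh", "/usr/local/bin/app.sh"]
EOF
docker build -t myapp .

# Keep the data OUTSIDE the container: bind-mount a host directory.
# Everything under /srv/myapp-data survives pulls and container re-creation.
docker run -d --name myapp -v /srv/myapp-data:/data myapp
```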
As someone who is wasting gigabytes upon gigabytes of storage on Docker: get out of here with your stupid bullshit.
Docker layers are not shared properly across containers if the images aren't set up right, which is the case for a significant number of images.
The entire point of Docker is that you can easily deploy those specific images, even the ones that aren't set up right. You can't just say "well, don't use those images, build your own", because that completely defeats the point of Docker.
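That mismatch is easy to reproduce. A hedged sketch (the package choice is arbitrary): two images installing the exact same package from different bases share nothing, so the package is stored twice:

```sh
printf 'FROM debian:12\nRUN apt-get update && apt-get install -y ffmpeg\n' > Dockerfile.a
printf 'FROM ubuntu:24.04\nRUN apt-get update && apt-get install -y ffmpeg\n' > Dockerfile.b
docker build -f Dockerfile.a -t app-a .
docker build -f Dockerfile.b -t app-b .

# Compare layer digests: no digest in common means full duplication on disk.
docker inspect -f '{{.RootFS.Layers}}' app-a
docker inspect -f '{{.RootFS.Layers}}' app-b
```

Layers are only shared when the digests match, which requires the same base and the same build steps in the same order.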
I had nearly 80 gigabytes of duplicated garbage across my home system alone after setting up things like surveillance, my NAS, Nitter, etc.
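Anyone can check their own numbers with stock Docker commands:

```sh
docker system df      # totals for images, containers, volumes, build cache
docker system df -v   # per-image breakdown, including shared vs. unique size
docker image prune -a # reclaim images not referenced by any container (prompts first)
```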
Don't come here telling me I don't understand Docker when you are the one who has no idea.
What do you even mean by "duplicated garbage"? Again, if your containers are too big, you are doing it wrong.