this post was submitted on 22 Nov 2023
6 points (100.0% liked)

Self-Hosted Main


Like the title says, I'm new to the self-hosting world. 😀 While I was researching, I found that many people dissuaded me from self-hosting an email server: just too complicated and hard to manage. What other services do you think we should leave to the currently available providers on the market, and why? 🙂 Thank you

[–] TBT_TBT@alien.top 1 points 1 year ago (1 children)

Docker is the antithesis of „bloat“.

[–] Drwankingstein@alien.top 1 points 1 year ago (1 children)

Docker is horrid for duplication. Unless you use a filesystem with good deduplication, Docker can hurt your storage a lot. And even then, dedup can still fail on extents that have already been deduplicated.

[–] TBT_TBT@alien.top 1 points 1 year ago (1 children)

WTF? You obviously don't understand Docker at all.

- Docker and Docker images provide the absolute minimum environment necessary to run an application. Containers don't reserve resources, so only what is actually used is consumed. A VM has far more overhead, since a whole computer plus a complete OS are emulated.

- There is not much to deduplicate, because there is no redundant storage going on. Only the bare OS essentials plus the app are stored. Some base OS images (e.g. Alpine Linux) are under 10 MB in size.

- If containers themselves are "big", you are doing Docker wrong: you are storing data inside the container instead of externally, in volumes or on the host filesystem. With the next container pull, that data would be lost.

- No idea what "just not work often due to due to already deduplicated extent stuff" is supposed to mean. That does not even make sense.
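The volumes point above can be sketched with a minimal docker-compose file (the service and volume names are hypothetical; any image works the same way):

```yaml
services:
  app:
    image: alpine:3.19      # tiny base image, a few MB
    volumes:
      - appdata:/data       # persistent data lives in the named volume,
                            # not in the container's writable layer
volumes:
  appdata:                  # survives image pulls and container recreation
```

With this layout, pulling a new image and recreating the container leaves everything under `/data` untouched, and the container itself stays small and disposable.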

[–] Drwankingstein@alien.top 1 points 1 year ago (1 children)

As someone who is wasting gigabytes upon gigabytes of data on Docker, get out of here with your stupid bullshit.

Docker layers across containers are not deduplicated properly if the images aren't set up right, which is the case for a significant number of images.
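Rough sketch of the storage math behind this complaint (the image names, digests, and sizes below are invented for illustration): Docker's store is content-addressed, so layers with identical digests are kept once, but near-identical layers built from a different base or in a different order get different digests and are each stored in full.

```python
# Hypothetical layer inventory: (image, layer_digest, size_mb).
# Same digest -> stored once; similar content under a different
# digest -> stored again in full.
layers = [
    ("nas-app", "sha256:aaa", 80),  # shared base layer
    ("nitter",  "sha256:aaa", 80),  # same digest -> deduplicated
    ("cctv",    "sha256:bbb", 82),  # rebuilt base, new digest -> duplicated
]

naive_total = sum(size for _, _, size in layers)
actually_stored = sum({d: s for _, d, s in layers}.values())

print(f"sum of image layers: {naive_total} MB")
print(f"actually on disk:    {actually_stored} MB")
```

`docker system df -v` shows the real per-image and per-volume breakdown on a live host, if you want to check how much of your own usage is shared versus duplicated.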

The entire point of Docker is that you can easily deploy these prebuilt images, including the ones that aren't set up right. One cannot just say, "Well, don't use those images, do something else, build your own," because that completely defeats the point of Docker.

I had nearly 80 gigabytes of duplicated garbage across my home system alone, after setting up things like surveillance, my NAS, Nitter, etc.

Don't come here telling me I don't understand Docker when you are the one who has no idea.

[–] TBT_TBT@alien.top 1 points 1 year ago

What do you even mean by „duplicated stuff“? Again, if your containers are too big, you are doing it wrong.