399ddf95

joined 1 year ago
[–] 399ddf95@alien.top 1 points 1 year ago

> leave the site "exposed" to the internet without having a bunch of bots scanning it all the time

This is not a thing. Everything exposed to the public internet will be scanned constantly forever.

You can reduce your attack surface by limiting the software/ports you expose, or by having another service/computer act as a proxy. This is what Cloudflare does as their main business.

You can get 99% of the benefit you seek by just signing up for the free tier of Cloudflare and putting them in front of your webserver. You can then configure the firewall on your webserver to accept connections only from Cloudflare's published IP ranges, and let Cloudflare deal with the bots.
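As a sketch, the firewall side might look like this with ufw. The two CIDR blocks shown are examples from Cloudflare's published list at the time of writing; always fetch the current list from https://www.cloudflare.com/ips-v4/ rather than hardcoding it.

```shell
# Allow HTTPS only from Cloudflare's IPv4 ranges (partial example list),
# then block port 443 for everyone else. Rules are evaluated in order.
for cidr in 173.245.48.0/20 103.21.244.0/22; do
    sudo ufw allow proto tcp from "$cidr" to any port 443
done
sudo ufw deny 443/tcp
```

The same idea works with iptables or nftables; the point is that the origin server never talks directly to random scanners, only to the proxy.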

Another approach would be to store data on AWS S3 or similar with a time-limited URL with a short expiration date, that you provide only to approved parties. You wouldn't even need to run a public server to do this, and access will automatically be denied after the time period you specify. (This might make more sense if you're distributing big files, versus displaying a screen or two full of information.)

You could also do a web search for "port knocking", or configure client-side TLS certificates, which can be difficult to manage but would let you restrict access to your server entirely. (Still vulnerable to DDoS, though.)
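The idea behind port knocking can be sketched in a few lines: the server watches which ports each client probes, and only opens the real service port to a client that hits a secret sequence of closed ports in the right order. Real implementations (e.g. knockd) do this by watching firewall logs or raw packets; the sequence and class below are purely illustrative.

```python
from collections import deque

KNOCK_SEQUENCE = [7000, 8000, 9000]  # hypothetical secret sequence

class KnockTracker:
    """Track the last few ports each client hit; grant access on the right sequence."""

    def __init__(self, sequence: list[int]) -> None:
        self.sequence = sequence
        self.history: dict[str, deque] = {}

    def knock(self, client_ip: str, port: int) -> bool:
        """Record a probe; return True when this client completes the sequence."""
        h = self.history.setdefault(client_ip, deque(maxlen=len(self.sequence)))
        h.append(port)
        # True means: open the real port (e.g. via a firewall rule) for this IP.
        return list(h) == self.sequence
```

It's security through obscurity on its own, which is why it's usually combined with normal authentication rather than used as a replacement for it.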

[–] 399ddf95@alien.top 1 points 1 year ago

I used the $0 tier of a Slack workspace for this at a small business, and it worked OK. The Slack clients use an enormous amount of memory, though.