this post was submitted on 17 Sep 2023
Chat


Several months ago Beehaw received a report about CSAM (i.e. Child Sexual Abuse Material). As an admin, I had to investigate this in order to verify and take the next steps. This was the first time in my life that I had ever seen images such as these. Not to go into great detail, but the images were of a very young child performing sexual acts with an adult.

The explicit nature of these images, the gut-wrenching shock and horror, the disgust and helplessness were very overwhelming to me. Those images are burnt into my mind and I would love to get rid of them but I don’t know how or if it is possible. Maybe time will take them out of my mind.

In my strong opinion, Beehaw must seek a platform where NO ONE will ever have to see these types of images. A software platform that makes it nearly impossible for Beehaw to host, in any way, CSAM.

If the other admins want to give their opinions about this, then I am all ears.

I, simply, cannot move forward with the Beehaw project unless this is one of our top priorities when choosing where we are going to go.

[–] Intelligence_Gap 53 points 1 year ago (4 children)

I’m not sure that’s possible with images being allowed. If Google, Facebook, Instagram, and YouTube all struggle with it I think it will be an issue anywhere images are allowed. Maybe there’s an opening for an AI to handle the task these days but any dataset for something like that could obviously be incredibly problematic

[–] thanevim@kbin.social 39 points 1 year ago (1 children)

Yeah, the key problem here is that any open forum of any considerable popularity, since the dawn of the Internet, has had to deal with shit like CSAM. You don't see it elsewhere because of moderators doing the very job OP does. It's just that now, OP, you're in that position. Some people can, and have decided to, deal with moderating the horrors. It may very well not be something you, OP, can do.

[–] d3Xt3r 22 points 1 year ago* (last edited 1 year ago)

The thing is though, with traditional forums you get a LOT of controls for filtering out the kind of users who post such content. For instance, most forums won't even let you post until you complete an interactive tutorial first (reading the rules and replying to a bot indicating you've understood them etc).

And then, you can have various levels of restrictions, eg, someone with less than 100 posts, or an account less than a month old may not be able to post any links or images etc. Also, you can have a trust system on some forums, where a mod can mark your account as trusted or verified, granting you further rights. You can even make it so that a manual moderator approval is required, before image posting rights are granted. In this instance, a mod would review your posting history and ensure that your posts genuinely contributed to the community and you're unlikely to be a troll/karma farmer account etc.
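As a rough illustration, the kind of threshold checks described above (post count, account age, a manual trust flag from a moderator) could be sketched like this; the names and thresholds here are hypothetical, not taken from any particular forum software:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Account:
    created_at: datetime
    post_count: int = 0
    trusted: bool = False  # set manually by a moderator after reviewing history

def may_post_images(account: Account, now: datetime,
                    min_posts: int = 100,
                    min_age: timedelta = timedelta(days=30)) -> bool:
    """New accounts must earn image-posting rights; a moderator's
    'trusted' flag bypasses the automatic thresholds."""
    if account.trusted:
        return True
    old_enough = now - account.created_at >= min_age
    active_enough = account.post_count >= min_posts
    return old_enough and active_enough
```

The same check can gate link posting, DMs, or anything else a forum wants to keep away from throwaway accounts.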

So, short of accounts getting compromised/hacked, it's very difficult to have this sort of stuff happen on a traditional forum.

I used to be a mod on a couple of popular forums back in the day, and I even ran my own community for a few years (using Invision Power Board), and never once have I had to deal with such content.

The fact is Lemmy is woefully inadequate in its current state to deal with such content, and there are definitely better options out there. My heart goes out to @Chris and the staff for having to deal with this stuff, and I really hope that this drives the Beehaw team to move away from Lemmy ASAP.

In the meantime, I reckon some drastic actions would need to be taken, such as disabling new user registrations and stopping all federation completely, until the new community is ready.

[–] bermuda 43 points 1 year ago (1 children)

I'd be fine with not hosting images entirely. I don't think people come to beehaw primarily to look at pictures

[–] liv 42 points 1 year ago* (last edited 1 year ago) (1 children)

I just want to say, I am so so so sorry you had to see that.

I accidentally saw some CSAM in the 1990s and you are right, it is burnt into your mind. It's the real limit case of "what has been seen cannot be unseen" - all I could do was learn to avoid accessing those memories.

If you can access counselling for this, that might be a good option. Vicarious trauma is a real phenomenon.

[–] remington 22 points 1 year ago (1 children)

If you can access counselling for this, that might be a good option. Vicarious trauma is a real phenomenon.

Thank you for the advice. I'm not sure that I'll need counseling but I'm open to it if need be. Time will tell.

[–] loops 7 points 1 year ago

Be sure to keep tabs on yourself, sometimes these things can really sneak up on you.

[–] furrowsofar 39 points 1 year ago* (last edited 1 year ago) (1 children)

People keep talking about going to another platform. Personally I think a better idea would be to develop Lemmy to deal with these issues. This must be a fediverse-wide problem, so some discussion with other admins and the developers is probably the way to go on many of these things. Moreover, you work with https://opencollective.com/; can they help? Beyond this, especially for CSAM, there must be large funding agencies where one could get a grant to put some real professional programming into this problem. Perhaps we could raise funds ourselves to help with this too.

So frankly I would like to see Beehaw solve the issues with Lemmy, rather than just move to some other platform that will have its own issues. The exception may be if the Beehaw people think that being a safe space creates too big a target, and that you have to leave the Threadiverse to be safe. That to me seems like letting the haters win. It is exactly what they want. My vote will always be to solve the Threadiverse's issues rather than run away.

Just my feeling. There may be more short term practical issues that take precedence and frankly it is all up to you guys where you want to take this project.

[–] snowe@programming.dev 11 points 1 year ago (9 children)

The solution is to use an already existing software product that solves this, like CloudFlare’s CSAM Detection. I know people on the fediverse hate big companies, but they’ve solved this problem already numerous times before. They’re the only ones allowed access to CSAM hashes, lemmy devs and platforms will never get access to the hashes (for good reason).

[–] PreparaTusNalgasPorque@kbin.social 35 points 1 year ago (1 children)

I'm sure those repugnant assholes do it "for the lulz" and if they want to mess with you they'll do it anywhere.

There's this study that says playing Tetris helps ease recently acquired trauma https://www.ox.ac.uk/news/2017-03-28-tetris-used-prevent-post-traumatic-stress-symptoms

And db0, the admin of the eponymous dbzer0 instance, created an interesting script to get rid of CSAM without having to review it manually; take a look -> https://github.com/db0/lemmy-safety

[–] renard_roux 11 points 1 year ago* (last edited 1 year ago)

Just tagging @admin in case they don't see this ❤️

Edit: aaand I did it wrong 🙄 @admin@beehaw.org 👈 Better?

[–] lerba 34 points 1 year ago (1 children)

This post seems highly reactive to me. I'm sorry to hear of you being exposed to such disturbing material, but I fail to see a true connection between that happening and using Lemmy as the platform. I absolutely agree that nobody should have to experience what you did, but I disagree with the platform change proposition.

[–] potterman28wxcv 7 points 1 year ago

I don't know of any software platform where that would not happen.

Even with a text-only platform people can still post URLs to unsafe content.

I think OP is referring to some kind of automated scanner, but I'm not sure there are publicly available ones. I guess using them would come at a cost, either computational or monetary. And even so, there can be false positives, so you would probably still have to check reports manually anyway.

[–] Kolanaki@yiffit.net 33 points 1 year ago* (last edited 1 year ago)

Sadly, the only 100% way to never have that kind of material ever touch your servers is to not allow image uploads from the public. Whether it's on Lemmy or another social site, or something you control entirely on your own. Maybe sooner than we think, AI could deal with the moderation of it so a human never has to witness that filth, but it's not quite there yet.

[–] AndreTelevise 25 points 1 year ago* (last edited 1 year ago)

Lemm.ee, another instance I am in, isn't hosting images anymore or letting people upload images directly due to this issue. When your platform is supposed to be 100% open source and decentralized, there are bound to be issues like this, and they should be dealt with, even if proprietary tech is necessary for it. I'm sorry to hear about this.

[–] Kangie@lemmy.srcfiles.zip 24 points 1 year ago (1 children)

A software platform that makes it nearly impossible for Beehaw to host, in any way, CSAM.

I hate to say it, but you'll need to find a text-only platform. Allowing any image uploads opens the door to things like this.

Besides that, if your concern is that no moderator should be exposed to anything like that, well on a text-only site you might have to deal with disguised spam links to gore, scam, etc. You'll still have to click on links to effectively moderate.

Maybe you should consider if this is a position that you want to put yourself in again. It sounds like this may just not be for you.

[–] Chobbes 6 points 1 year ago* (last edited 1 year ago)

This was my immediate thought as well. It's unfortunate, but there will probably always be people who abuse online platforms like this. It's totally okay if you're not up to the task of moderating disturbing content like that; it sounds like it can be a really brutal job. I don't know what the moderation tools on Lemmy are like, but maybe there's a way to flag different kinds of moderation concerns for different moderators, so not everybody has to be exposed to this kind of stuff if they're not comfortable with it. And maybe there could also be a system where, if users flag a post, it gets automatically marked as NSFW and its images are hidden by default, so moderators and other users don't have to be exposed to it without warning (though of course such a system could potentially be abused as well). But beyond that I'm not sure what else you can do, aside from maybe limiting federation.

[–] Penguincoder 24 points 1 year ago (1 children)

Does that mean a platform that does not allow any images to be uploaded? Or a platform that has better access control and remediation controls?

[–] remington 13 points 1 year ago (3 children)

I'd be willing to consider either, and would love your particular feedback on this as well.

[–] furrowsofar 15 points 1 year ago* (last edited 1 year ago)

By the way, I have always been surprised that Beehaw did host images, given the extra cost (images are large and costly in both storage and bandwidth), the added security and attack-vector possibilities, IP issues, CSAM issues, etc.

[–] furrowsofar 8 points 1 year ago (6 children)

Also, I do not think this is a Lemmy specific issue. It is an image availability, and scale issue. Federation of course increases the scale a lot too.

[–] furrowsofar 5 points 1 year ago* (last edited 1 year ago)

I think if a platform has image capabilities, this is to be expected. I guess the only exception is if there are filters that can be used, but this seems unlikely. So I think it is an image vs. no-image decision. The other problem with images is that they can be attack vectors from a security point of view. Any complex file format can be an attack vector, as interpreters of complex file formats often have bugs.

Can you imagine that the large platforms have whole teams of people who have to look at this stuff all day and filter it out? Not sure how that works, but it is probably the reality. Notice R$ never hosted images.

[–] storksforlegs 23 points 1 year ago (2 children)

As others have suggested, I think temporarily suspending images until you guys can settle on a safe alternative to lemmy is a good idea.

I'm sorry you had to see something like this. I hope you are able to seek out some counselling ASAP and talk to someone about it. Even something like https://www.7cups.com/ might be helpful.

[–] mojo@lemm.ee 22 points 1 year ago

As long as you can post links or upload images, there is an avenue for CSAM to be spammed. Beehaw should probably start with a whitelist and slowly expand. Refuse to federate with anyone that has open registration.

[–] forestG 21 points 1 year ago* (last edited 1 year ago)

I don't think there is a way to have both the option to host images and have zero risk of getting such image uploads. You either completely disable image hosting, or you mitigate the risk by the way image uploads are handled. Even if you completely disable the image uploads, someone might still link to such content. The way I see this there are two different aspects. One is the legal danger you place yourself when you open your instance to host images uploaded by users. The other is the obvious (and not so obvious) and undeniable harmful effects contact with such material has for most of us. The second, is pretty impossible to guarantee 100% on the internet. The first you can achieve by simply not allowing image uploads (and I guess de-federating with other instances to avoid content replication).

The thing is, hosting an instance of a technology that allows for better moderation (i.e. allowing certain kinds of content, such as images, only after a user reaches a certain threshold of activity) actually helps in a less obvious manner. CSAM is not only illegal to host on the server side; it's also illegal, with serious consequences, for the people who actually upload it. The more activity history you have on a potential uploader, the easier it becomes to actually track them. Requiring more time for an account before allowing it to post images makes concealing the uploader's identity harder and raises their potential risk, to the extent that it becomes very difficult to go through the whole process only to cause problems for the community.

Let me also state this clearly: I don't have an issue with disabling image uploads here, or changing the default setting of instance federation to a more limiting one. Or both. I don't mind linked images to external sites.

I am sorry you had to see such content. No, it doesn't seem to go away. At least it hasn't for me, after almost 2 decades :-/

[–] apis 20 points 1 year ago (2 children)

So, so sorry you had to see that, and thank you for protecting the rest of us from seeing it.

On traditional forums, you'd have a lot of control over the posting of images.

If you don't wish to block images entirely, you could block new members from uploading images, or even from sharing links. You could set things up so they'd have to earn the right to post by being active for a randomised amount of time, and have made a randomised number of posts/comments. You could add manual review to that, so that once a member has ostensibly been around long enough and participated enough, admin look at their activity pattern as well as their words to assess if they should be taken off probation or not... Members who have been inactive for a while could have image posting abilities revoked and be put through a similar probation if they return. You could totally block all members from sharing images & links via DM, and admin email accounts could be set to reject images.

It is probably possible to obtain the means to reject images which could contain any sexual content (checked against a database of sexual material which does not involve minors), and you could probably also reject images which could contain children and which might not be wholesome (checked against a database of normal images of children).

Aside from the topic in hand, a forum might decide to block all images of children, because children aren't really in a position to consent to their images being shared online. That gets tricky when it comes to late teens & early 20s, but if you've successfully filtered out infants, young children, pre-teens & early teens as well as all sexual content, it is very unlikely that images of teenagers being abused would get through.

Insisting that images are not uploaded directly, but via links to image hosting sites, might give admin an extra layer of protection, as the hosting sites have their own anti-CSAM mechanisms. You'd probably want to whitelist permitted sites. You might also want a slight delay between the posting of an image link and the image appearing on Beehaw - this would allow time for the image hosting site to find & remove any problem images before they could appear on Beehaw (though I'd imagine these things are pretty damn fast by now).
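A host allowlist like the one described above is straightforward to sketch. This is a minimal illustration, assuming a hypothetical set of permitted hosts (the actual list would be an admin decision):

```python
from urllib.parse import urlparse

# Hypothetical allowlist; a real deployment would pick hosts whose
# own anti-CSAM scanning the admins are willing to rely on.
ALLOWED_IMAGE_HOSTS = {"imgur.com", "i.imgur.com", "files.catbox.moe"}

def is_allowed_image_link(url: str) -> bool:
    """Accept only https links whose host is exactly on the allowlist."""
    parts = urlparse(url)
    if parts.scheme != "https":
        return False
    return parts.hostname in ALLOWED_IMAGE_HOSTS
```

Exact-match on the hostname (rather than substring matching) matters here, or `evil.example/imgur.com` style tricks slip through.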

You could also insist that members who wish to post images or links to images can only do so if they have their VPN and other privacy preserving methods disabled. Most members wouldn't be super-enthused about this, until they've developed trust in the admin of the site, but anyone hoping to share images of children being abused or other illegal content will just go elsewhere.

Admin would probably need to be able to receive images of screenshots from members trying to report technical issues, but those should be relatively easy to whitelist with a bot of some sort? Or maybe there's some nifty plugin for this?

Really though, blocking all images is going to be your best bet. I like the idea of just having the Beehaw bee drawings. You could possibly let us have access to a selection of avatars to pick, or have a little draw plugin so members can draw their own. On that note, those collaborative drawing plugin things can be a fun addition to a site... If someone is very keen for others to see a particular image, they can explain how to find it, or they can organise to connect with each other off Beehaw.

[–] jarfil 11 points 1 year ago* (last edited 1 year ago) (1 children)

block new members from uploading images

I've tried those methods something like 10 years ago. It didn't work; people would pose as decent users, then suddenly switch to posting shit when allowed. I'm thinking nowadays, with the use of ChatGPT and similar, those methods would fail even more.

Modern filtering methods for images may be fine(-ish), but won't stop NSFL and text based stuff.

Blocking VPN access, to a site intended as a safe space, seems contradictory.

anyone hoping to share [...] illegal content will just go elsewhere

Like someone else's free WiFi. Wardriving is still a thing.

draw plugin so members can draw their own

That can be easily abused, either manually or through a bot. Reddit has the right idea there, where they have an avatar generator with pre-approved elements. Too bad they're pretty stifling (and sell the interesting ones as NFTs).

[–] apis 4 points 1 year ago (5 children)

Yup, as it gets ever easier to overwhelm systems, there are no good solutions to the matter, aside from keeping it text only + Beehaw's own drawings.

[–] storksforlegs 8 points 1 year ago* (last edited 1 year ago)

I second everything you said here

[–] jarfil 19 points 1 year ago* (last edited 1 year ago)

Those images are burnt into my mind and I would love to get rid of them but I don’t know how or if it is possible

I'm very sorry this happened to you, and I wish I could offer you some advice... but that's the main reason I stopped hosting open community stuff many years ago. I thought I was hardened enough, but nope; between the spam, the "shock imagery" (NSFL gore, CSAM), the doxxing, and toxic users in general.... even having some ads was far from making it all worthwhile. There is a reason why "the big ones" like Facebook or Google churn through 3rd world mods who can't take it for more than a few months before getting burnt out.

I wish I could tell you that you'll eventually forget what you've seen... but I still remember stuff from 30 years ago. Also don't want to scare you, but it's not limited to images... some "fanfiction" with text imagery is evil shit that I still can't forget either.

Nowadays, you can find automated CSAM identification services, like the one run by Microsoft, so if you integrated one of those, you could err on the side of caution and block any image it marks as even suspicious. This may or may not work in your jurisdiction, with some requiring you to "preserve the proof" and submit it to the authorities (plus different jurisdictions having different definitions of what is and what isn't breaking the law, and laws against swamping the authorities with false positives... so you basically can't win). This will also do nothing for the NSFL or text-based imagery.

A way to "shield yourself" from all of this as an admin, is to go to an encrypted platform where you can't even see what's getting posted, so you never run the risk of seeing that kind of content... but then you end up with zero moderation tools, pushing all the burden onto your users, so not suitable for a safe space.

Honestly, I don't think there is an effective solution for this yet. It's been a great time ~~abusing the good will of the admins and mods~~ staying on Beehaw, but if you can't find a reasonable compromise... oh well.

[–] newtraditionalists 17 points 1 year ago

Beehaw is such a special effort. I am so regretful that people have to be subjected to the darkest parts of humanity in order to protect the beehaw project. I don't need images. If that is the necessary course of action, then so be it.

Most importantly, I am so sorry to you as one human to another. I'm sorry you saw that. I'm sorry humans are hurting each other like that. And I'm sorry that your good faith efforts have been taken advantage of.

[–] pemmykins 16 points 1 year ago (1 children)

I mentioned this in discord a while back, but there are image-matching databases for known instances of CSAM that you can apply for access to, as an admin of a forum or social media site. If you had access, you could scan each image uploaded or linked to in a post or comment, and compare to the database for matches. I think that mastodon is adding some hooks for this kind of checking during the upload phase, but I’m not sure what the status is with Lemmy.
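The matching step itself is conceptually simple: compute a perceptual hash of each upload and compare it against a list of known hashes by Hamming distance. This is only a toy sketch using a difference-hash over a grayscale pixel grid; real systems such as PhotoDNA use far more robust algorithms, and the actual hash lists are restricted to vetted organizations:

```python
def dhash_bits(pixels):
    """Toy difference hash: for each row of grayscale values, record
    whether brightness increases between adjacent pixels."""
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if right > left else 0)
    return bits

def hamming(h1, h2):
    """Count differing bit positions between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

def matches_blocklist(pixels, blocklist, max_distance=2):
    """Flag an upload if its hash is near any known hash; the small
    distance tolerance catches re-encoded or slightly altered copies."""
    h = dhash_bits(pixels)
    return any(hamming(h, known) <= max_distance for known in blocklist)
```

The tolerance is the important design choice: exact cryptographic hashes break on a single recompressed pixel, which is why these systems use perceptual hashes with a distance threshold instead.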

I’m happy to help facilitate a solution like this, as it’s something I also care about. Feel free to find me on discord if you want to talk.

Also, as others have said - I’m sorry you had to go through that. The same thing happened to me many years ago and it definitely affected me for a long time.

[–] ShellMonkey@lemmy.socdojo.com 8 points 1 year ago

There are some automated options out there at the frontend already. One that looks simple if not absolute is putting the site through cloudflare with their csam engine. It'll even do some of the dirty work reporting to the appropriate agency and putting a legal bar up on the link until someone can delete it.

[–] Kajo 15 points 1 year ago* (last edited 1 year ago) (1 children)

First of all, I'm so sorry that you have been exposed to such horrors. I hope you can handle that, or find help to.

I don't have a solution, I'd just like to share some thoughts.

  1. Some people suggested that AIs could detect this kind of content. I would be reluctant to use such tools, because lots of AI projects exploit unprotected workers in poor countries for data labeling.

  2. A zero-image policy could be an effective solution, but it would badly impact @dyi@beehaw.org, @creative@beehaw.org and @greenspace@beehaw.org.

  3. Correct me if I'm wrong, but on the fediverse, when a picture is posted on an instance, it is duplicated on all federated instances? If I'm right, it means that even if Beehaw found a way to totally avoid CSAM posting, you could still end up with duplicated CSAM on your server (with consequences for your mental health, and possibly legal risks for owning such pictures)?

[–] jarfil 6 points 1 year ago (1 children)

correct me if I'm wrong, but on the fediverse, when a picture is posted on an instance, it is duplicated on all federated instances?

Kind of. It duplicates on all instances that subscribe to the community where it was posted. Behind the scenes, Lemmy makes each community a "user" that boosts everything posted to that community. That content is only pushed to instances where at least one user has subscribed to that community/"user", and then any included images get cached. So if nobody subscribes to a federated instance's community, none of the content gets duplicated.

The biggest problem right now are users with "burner accounts" who exploit instances with free-for-all registrations, to push content to communities that have subscribers from as many different instances as possible... possibly "lurker" accounts created by the same attacker just to subscribe to the remote community they're attacking and have the content show in the default "All" feed of all instances.

There are some possible countermeasures for that:

  • Defederate from any instance with "free for all" registrations
  • Remove "lurker" accounts who only subscribe to non-local communities, particularly if they're the only subscriber for those communities
  • Limit the "All" feed, definitely DO NOT show it as the default for anonymous users (like on the web). Ideally, admins should be able to choose what to show in there, even from their own instance.
  • Run some image ID, AI, or other filtering on the content
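Lemmy doesn't expose these as code hooks as far as I know, but the first two countermeasures could be sketched as simple admin-side checks; the instance names and field names here are hypothetical:

```python
from dataclasses import dataclass

# Hypothetical list of instances known to allow unreviewed signups.
OPEN_REGISTRATION_INSTANCES = {"freeforall.example"}

@dataclass
class LocalUser:
    home_instance: str
    local_subs: int    # subscriptions to this instance's own communities
    remote_subs: int   # subscriptions to federated communities

def should_defederate(instance: str) -> bool:
    """First countermeasure: drop instances with free-for-all registration."""
    return instance in OPEN_REGISTRATION_INSTANCES

def looks_like_lurker_account(user: LocalUser) -> bool:
    """Second countermeasure: flag accounts that only subscribe to
    non-local communities, likely created to pull remote content
    into this instance's 'All' feed."""
    return user.local_subs == 0 and user.remote_subs > 0
```

Flagged accounts would still want human review before removal, since plenty of legitimate users also subscribe mostly to remote communities.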
[–] Kajo 4 points 1 year ago* (last edited 1 year ago)

Thank you so much for all these explanations! I didn't know the communities/users were so important in the system.

I thought that a duplicate of each post on an instance was automatically sent to all federated instances, and I wondered how the servers didn't get overloaded by the global activity.

[–] GunnarRunnar 14 points 1 year ago

I don't know what kind of platform would make it impossible to host CSAM. Text only, without linking?

Or is the problem here that the post was available to the wider audience, so others besides the moderation team were able to see it? Lemmy doesn't do restricted posting? Jesus, this platform is in its infancy then.

[–] Phantaminum@lemmy.zip 14 points 1 year ago

Hey, I'm not a user from this instance, but I feel worried about you, OP.

Perhaps take a break. Even if you can't erase those memories, as some people have said, please take care of yourself. If it's not out of reach, please go to therapy; if that's too expensive, there is free counselling available in a lot of countries.

[–] Gaywallet 13 points 1 year ago (1 children)

A few observations/thoughts.

  • There's an awful lot of posts basically saying "this is a part of the job of moderation" and I don't think that's a particularly empathetic or useful observation. I've been on the internet and moderating for long enough to have been exposed to a lot of this, but this is not an inevitability. It's an outcome of the system we've designed, of regulation and law that we have, and of not prioritizing this as a problem strongly enough. Being dismissive of an emotional experience and trauma isn't particularly helpful.
  • I'm not technical enough to explain this, but there are technical and legal issues with CSAM and the Lemmy platform that we've run into. For one, there are no automated scanning tools for this kind of content. My understanding is that even implementing or creating said tools would be difficult because of the way pict-rs and Rust store images in the first place. You cannot turn off image federation, at all. At best, you can clear the content, but doing so may violate CSAM laws depending on the country and its reporting requirements. Someone on the technical side can explain better than I can.
  • This isn't a thread to discuss who's to blame for CSAM. Please cease all discussions fighting about religion in the comments. I will be removing these comments.
[–] Penguincoder 6 points 1 year ago* (last edited 1 year ago)

You cannot turn off image federation, at all.

This is correct for the Lemmy codebase, but a fix is a work in progress by the pict-rs dev and in upstream Lemmy itself.

For now, Beehaw users can go to their settings via the website and uncheck "Show images" if they're so inclined. This should prevent all images in posts and comments from loading automatically for you. This does not translate to other instances, front-ends, or apps; just the main website. EDIT: Because of caching, you'll need to press Ctrl+F5 after saving this setting to see it take effect.

[–] nlm 13 points 1 year ago (1 children)

Sorry to hear that mate! That's one of the biggest reasons I've never wanted to move towards IT forensics even though I think I'd enjoy the actual work. But having to regularly sift through the absolute worst humanity has to offer sounds awful.

Hope the immediate pain of it settles as soon as possible!

This might not be what people want, but since Beehaw is going to leave Lemmy anyway, couldn't you just completely defederate and run as an isolated instance? Then you'd have control over what gets published without having to deal with federated nastiness.

[–] Flax_vert@feddit.uk 5 points 1 year ago (1 children)

Or federate with the nicer communities on whitelist basis or have admins apply for federation

[–] kobold 9 points 1 year ago

i am so sorry you had to see that kind of thing.

[–] noctisatrae 8 points 1 year ago* (last edited 1 year ago)

I can’t imagine how terrible you must feel. You should get some external help… I hear that hypnosis is used for traumatic experiences.

I need to think about what you said. I really want to see the community blossom. We should maybe make an IRC community?

EDIT: I think it’s called EMDR

[–] violetsareblue 7 points 1 year ago

I’m sorry this happened to you. I think you should disable images on beehaw. Not worth compromising your mental health and it’s too big a job to moderate that level of stuff - which you will have to do if images can be hosted.

I’m really sorry again that someone traumatized you and others with csam. Some people are beyond effed up - I’m really angry hearing about this.

[–] gaytswiftfan 7 points 1 year ago* (last edited 1 year ago)

I'm sorry you had to witness that. I grew up on forums and message boards in the mid-2000s and onward, and I've still not forgotten a lot of the shock images and other vile things people would post on forums.

You essentially sacrificed a part of your innocence to aid the community, and I find that incredibly selfless and worthy of respect, although again I am truly sorry it ever had to come to that.

I know it's not really my place as a stranger to give unsolicited advice but if you find yourself struggling, there is a form of therapy called EMDR that is supposedly very successful with getting the brain to fully digest traumatic events.

[–] baggins 6 points 1 year ago

You have my vote.

[–] Rentlar 5 points 1 year ago

I am very sorry you had to go through such a terrible experience.

It is my sincerest hope that you will be able to find a workable solution to this problem, from Lemmy or elsewhere.

I am (and have been) okay with admins taking any action necessary to accomplish the goals of the Beehaw project. So removing image hosting, implementing lemmy-safety, restricting federation severely, do whatever you need.

And please, also do whatever you need to care for yourself, including if it means needing to take a break from the site.
