Respectfully, I think you're overreading the meaning of the Mavrix Photos case. That case involved the most popular LiveJournal community, moderated by a team led by a literal employee, where the mods reviewed user submissions before posting and approved only about one-third of the submitted content. Human review was required for anything to be posted at all, and everything went through a moderation team that was arguably controlled by LiveJournal. And even then, the appellate court sent the case back to the trial court for a jury to decide whether that procedure counts as content being posted at the direction of a user rather than at the direction of the company. It also made clear that some pre-posting review by the company's agents or employees would still be OK, such as manual screening for pornography, spam, and the like.
And after this year's Supreme Court decision in Twitter v. Taamneh, which reversed the Ninth Circuit's ruling that Twitter and similar companies could be liable for user activity on their services, it's pretty clear that having paid or employed moderators doesn't, by itself, make a service liable for what it fails to stop on its platform. Liability arises only when an employee actually does the thing that gives rise to liability (e.g., posts infringing material themselves).
So no, I disagree with your analysis that paying or compensating moderators creates a risk of liability, especially after the most recent Supreme Court cases involving Twitter and Google, which call the Mavrix reasoning into question.