Bullshit. This assumes the people training LLMs are the same ones building the datasets. Once a dataset is created, it can be used to train multiple models, meaning that there's no further impact on API usage.
Yeah, what the fuck? I've reloaded and I'm somewhere other than where I wrote the comment.
There could be a Lemmy instance that imported the Reddit content from one of the archives like https://the-eye.eu/redarcs/
You don't need to write anything in the post body on Lemmy. No need to put "title" there.
This is a bug that has existed for a while. You can go to GitHub and upvote the relevant issue.
Me too, in the web interface, not the server. It probably needs a GitHub issue, or some votes if there already is one.
Oh man, it's going to be so much fun having to group similar communities when there are hundreds of instances!