How many of these books will just be total garbage nonsense, just so they can fulfill a prearranged quota?
Now the LLMs are filled with a good amount of nonsense.
A place for majestic STEMLORD peacocking, as well as memes about the realities of working in a lab.
Rules
This is a science community. We use the Dawkins definition of meme.
Just use the LLM to make the books that the LLM then uses. What could go wrong?
Someone's probably already coined the term, but I'm going to call it LLM inbreeding.
In computer science, garbage in, garbage out (GIGO) is the concept that flawed, biased or poor quality ("garbage") information or input produces a result or output of similar ("garbage") quality. The adage points to the need to improve data quality in, for example, programming.
There was some research article applying this 70s computer science concept to LLMs. It was published in Nature and hit major news outlets. Basically they further trained GPT on its output for a couple generations, until the model degraded terribly. Sounded obvious to me, but seeing it happen on the www is painful nonetheless...
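The degradation loop described above can be sketched as a toy simulation. (This is an illustrative sketch only, not the paper's actual setup — the study trained real language models, while here a "model" is just a token-frequency distribution over a made-up vocabulary.) Each generation's training corpus is sampled from the previous generation's model, so a rare token whose frequency ever hits zero can never be generated again:

```python
import random
from collections import Counter

random.seed(42)

# Toy "language model": a distribution over a small vocabulary with a
# Zipf-like long tail of rare tokens.
vocab = list(range(100))
dist = [1.0 / (rank + 1) for rank in vocab]

def fit(corpus):
    """'Train' the next generation by fitting token frequencies."""
    counts = Counter(corpus)
    return [counts[token] for token in vocab]

alive_history = []
for generation in range(10):
    # Each generation's training data is sampled from the previous
    # generation's output -- the "LLM inbreeding" loop.
    corpus = random.choices(vocab, weights=dist, k=2000)
    dist = fit(corpus)
    alive_history.append(sum(1 for w in dist if w > 0))

print(alive_history)  # distinct tokens still generated, per generation
```

Because a zero-frequency token is never sampled again, the count of surviving tokens can only fall over generations; the rare tail disappears first, which is the flavor of collapse the article reports.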
It's quite similar to another situation known as data incest.
It can only go right because corporations must be punished for trying to replace people with machines.
And they expect you to do this for free?
Do they not have to pay for the privilege? Or is this not referring to academic publishing? (It’s not super clear, but context indicates academic?)
If it is that makes it even worse. Academic publishers need to be abolished.
Nah, they get “Exposure”!
/s
Honestly sometimes I feel like I'm the only one on Lemmy who likes AI
AI as a technology is fascinating and can be extremely useful, especially in places like the medical field. AI as a product in its current state is nothing more than dystopian plagiarism.
The company I work for recently rolled out Copilot, and it's been a mixed bag of reactions. The less savvy users were at first blown away by the demonstration but then got exasperated when it didn't work as they thought (one of them uploaded an Excel file, asked for some analysis it couldn't do, and came to me to complain about it). But for me and my team it has worked great. I've been uploading some of my Python and SQL scripts and asking for refactoring and added comments, or uploading my SQL script along with some example I found on Stack Overflow and asking it to apply the example's method to my script.
I tell everyone that if you don't know shit, the AI isn't going to help a lot, but if you have at least the basics, it will help you.
I like AI. But I'm not sure I like the way we use it if it's only to meet shareholders' expectations or to be a tool for greedy people. What is your opinion concerning the way we seem to use AI in academic research?
Found the black and white only guy.
What's the academic terminology for "go pound sand"?