CapedStanker (4 points, 1 year ago)

Just explain to the students that it's like a calculator except the answers might be wrong.

Gaywallet (3 points, 1 year ago)

AI is not a tool that is going to disappear. As with any tool, it's worth finding out where it can be useful in your life or career and putting it to use there. And like any other tool, it might not turn out to be particularly useful for you - many of us learn how to use a calculator in school yet hardly ever use one, just as we may learn to use a hammer yet rarely find a reason to. With something like AI, however, I can see a lot of potential uses ahead, as well as a slowly expanding set of tools built on AI as a foundation. Recognizing this early and teaching children where and when it's appropriate to use AI is as valuable as teaching them basic computer skills.

Thinking of AI on a single dimension and using that to argue about its relevance in the classroom is a major oversight. AI is meant to be combined with your own thinking, in the same way that a calculator is. Telling AI to write a paper for you may result in a factual paper when it deals with something as well documented as the MLK example, but in many cases, as the first professor points out, it will result in something full of errors and biases. Rather than thinking about this in black and white and advocating explicitly for or against something like ChatGPT, the professor should be advocating for its use as something more akin to a partner - someone you work with to get to a finished product. ChatGPT can certainly refine the words you give it so they read more easily or follow a more coherent thought process, and it can, at least in some circumstances, point you towards additional resources that might bolster whatever you are writing.
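To make the "partner" framing concrete, here is a minimal sketch of that refine-rather-than-write workflow. It assumes the official `openai` Python client (v1+) and an API key in the environment; the model name, prompts, and draft text are purely illustrative, not a prescription:

```python
# Refine-don't-write: the draft, argument, and sources are the student's;
# the model is only asked to improve the wording.
# Assumes: `pip install openai` (v1+) and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY automatically

draft = """My own rough paragraph, written first, in my own words.
I want the phrasing tightened, not new claims added."""

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # illustrative choice; any chat model works
    messages=[
        {
            "role": "system",
            "content": (
                "You are an editor. Improve the clarity and flow of the "
                "user's draft. Do not add facts, claims, or citations."
            ),
        },
        {"role": "user", "content": draft},
    ],
)

print(response.choices[0].message.content)  # a revision to review, not to paste blindly
```

The point of the system prompt is that it confines the model to wording, so the thinking, claims, and citations stay yours - which also limits how much damage a hallucination can do.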

The article brings up AI-assisted plagiarism, which I think is an important concept. AI itself sits in a weird space right now, in that it is trained on the words and minds of others without appropriately citing them. It also has a real problem with hallucinations, which makes it particularly unreliable when you're looking for truth. But the idea that using a tool robs you of your ability to think at a higher level is just as misplaced as math teachers believing that calculators prevent people from conceptually understanding what is going on - it's simply incorrect. While the research does not yet exist on whether using AI alongside human thinking is harmful, I suspect we will see a similar outcome. Learning to use tools to accomplish a task does not make us dumber; it lets us do things with greater efficiency and skill. Talking to any tradesman who has learned to use a whole range of tools should be more than enough to convince you that they know what they are doing and that their tools are not holding back their knowledge in any meaningful way.

Given all of the above, we should still be cautious. AI is neither omniscient nor perfect: it makes things up, it's prone to bias, and it tempts us to let it automate tasks completely. We need to understand how and when it's appropriate to employ. Where AI becomes harmful is when we take our own thinking out of the equation or don't understand what it actually brings to the table. Employing AI to accomplish tasks without human oversight or intervention can be extremely problematic - letting AI make decisions about child welfare, for example, is not a good idea. But that is a far cry from using AI to help us put our own thoughts about a subject into the form of an essay. AI needs to be a thought partner to be most useful, and to make the most of this valuable tool we need to find a way to incorporate it into our schools and lives alongside a basic understanding of how these tools work.

frog (3 points, 1 year ago)

A "thought partner" is actually the only time I've found ChatGPT to be legitimately useful. I neither want nor need it to write anything for me (writing something myself is part of my process), and when I've asked it for things in the past, the answers have been very "meh". But a couple weeks ago I asked it to generate a writing prompt for me, because I'd been looking for prompts online and just constantly going "no, no, no, not that one either, nope, these all suck, I want something with X, Y, and Z". So I asked ChatGPT to create a writing prompt with X, Y, and Z, and while the responses I got were still pretty derivative... a couple days later I suddenly went "oh, but if I changed it like this, it would actually be interesting". So I ended up with an idea I actually liked.

This isn't asking ChatGPT to write a story for me. I'm not going to use any of the actual words it generated (you know, except in the vaguest sense that obviously we both used a lot of common words without which it's impossible to write, like "the"). But as a "thought partner" who basically echoed my own ideas back to me in a way that sparked a thought process that eventually gave me something useful? Yeah, it kind of works.