this post was submitted on 30 Mar 2025
103 points (100.0% liked)

Technology

28 readers
24 users here now

Which posts fit here?

Anything that is at least tangentially connected to the technology, social media platforms, informational technologies and tech policy.


Rules

1. English onlyTitle and associated content has to be in English.
2. Use original linkPost URL should be the original link to the article (even if paywalled) and archived copies left in the body. It allows avoiding duplicate posts when cross-posting.
3. Respectful communicationAll communication has to be respectful of differing opinions, viewpoints, and experiences.
4. InclusivityEveryone is welcome here regardless of age, body size, visible or invisible disability, ethnicity, sex characteristics, gender identity and expression, education, socio-economic status, nationality, personal appearance, race, caste, color, religion, or sexual identity and orientation.
5. Ad hominem attacksAny kind of personal attacks are expressly forbidden. If you can't argue your position without attacking a person's character, you already lost the argument.
6. Off-topic tangentsStay on topic. Keep it relevant.
7. Instance rules may applyIf something is not covered by community rules, but are against lemmy.zip instance rules, they will be enforced.


Companion communities

!globalnews@lemmy.zip
!interestingshare@lemmy.zip


Icon attribution | Banner attribution


If someone is interested in moderating this community, message @brikox@lemmy.zip.

founded 1 year ago
MODERATORS
top 18 comments
sorted by: hot top controversial new old
[–] lvxferre@mander.xyz 33 points 4 days ago (1 children)

The root of the problem is way, way older than AI. It's a mix of

  • humans being naturally lazy, typically not developing skills or knowledge unless we're clearly getting something out of it
  • we have a thoooousand tools enabling us to do stuff without skill/knowledge
  • our education systems do not value self-improvement enough to promote the development of those skills and knowledge

So it's a lot like you not remembering phone numbers by heart because you can check them in your contact list, you know?

And, yes, text generators do play a role on that. But when it comes to critical thinking, it's a death of a thousand cuts.

[–] N0body@lemmy.dbzer0.com 11 points 4 days ago (1 children)

Exactly. Just the latest VC technological advancement to exacerbate existing problems. The lack of critical thinking is why the far right has room to breathe, let alone brainwash entire populations.

The sad part is that it’s likely all by design. Turn everyone into sheep then line them up for slaughter.

[–] lvxferre@mander.xyz 6 points 4 days ago

Dunno if it's by design, "bug turned into feature", or simply neglect. In any case, the result is the same, though - masses that are easy to manipulate, composed of dysfunctional individuals.

The lack of critical thinking is why the far right has room to breathe

100% this. People often say "you're not immune to propaganda", and that's true - complete immunity is impossible. However, critical thinking does raise your resistance, as it makes you less eager to swallow bullshit.

[–] Mac@mander.xyz 24 points 4 days ago* (last edited 4 days ago)

"System designed around teaching students to memorize shocked to learn students are not learning to think. More at 6."

[–] Shaper@lemm.ee 17 points 4 days ago (2 children)

Sure! Here's an analysis of how AI may impact critical thinking in students: A lot of students have been known to use AI to write their essays and homework, which may have a negative impact in their learning process, since they are not using their own skills to think about their assignments. This has been reported several times in the media, specially because sometimes students forget to erase the first lines of the AI answers, which are typically directed to the user, and make it easier to detect that the answer was produced by AI.

But don't worry, if you need anything else, I'm here for you!

[–] Wanpieserino@lemm.ee 2 points 3 days ago (1 children)

Ah, the age-old debate of AI in education—where the line between 'assistance' and 'assignment' gets blurrier than a chalkboard after a day of lectures! While it's true that AI can sometimes be the 'ghostwriter' for essays, let's not forget that it can also be a fantastic tutor, offering instant feedback and endless patience. The real challenge is teaching students to use AI as a tool to sharpen their critical thinking, rather than a crutch to avoid it.

Imagine if calculators had never been allowed in math class because they 'did the work for you.' We'd still be stuck on long division while the world moved on to algebra! The key is balance—using AI to enhance learning, not replace it. And as for those telltale AI intro lines, well, consider them a modern-day 'cheat sheet' detector—a gentle nudge to remind students that original thought is still the gold standard.

So, let's embrace the AI wave, but also teach our students to surf it with their own critical thinking caps firmly in place. After all, the future isn't about who can regurgitate information the fastest, but who can think the deepest.

[–] Shaper@lemm.ee 1 points 3 days ago (1 children)

AI is not like calculators. Calculators are simple, it's purpose is clear and it's easy to asses the extent to which it fulfills it, they are also open source, if anything because they are easy to reverse engineer. AI is a closed source product meant to be commodified or served as a service for a profit by private companies. They are monumental proyects built on tons of energy and patented material unduly acknowledged, nobody knows how they really work and there's neither public funding for research nor open source ecosystems to provide alternatives. Their owners dont care about kids and their skills, they care for money. So the problem is not that we are ignoring the ai wave, the problem is that the wave is being steered by a private actor over whom we as a society have no control. Even if you wanted to teach kids to use ai intelligently, nothing garantees you actually can, since its owner may declare banktupcy or just change it without saying and you will have a new problem to deal with. So yeah fuck ai. I'm a robotics teacher in middle school and I do teach ai btw, I just dont encourage it's use. I just teach how it works and how to use it as a better search engine. This is more so because I have to rather than because I want to.

[–] Wanpieserino@lemm.ee 1 points 3 days ago

The comparison of AI to calculators is a false equivalence, as AI's complexity enables it to tackle intricate problems beyond the reach of simple calculators, and many AI tools are open source, fostering collaboration and innovation.

Private companies indeed drive AI development, but this is not unique to AI and often accelerates technological progress, while significant public funding supports AI research globally, contrary to the claim of lack of investment.

Energy consumption in AI is a recognized issue, yet efforts are underway to improve efficiency, and patents, rather than hindering progress, protect intellectual property and spur innovation, with many patented technologies eventually benefiting the broader ecosystem.

AI's perceived lack of transparency is being addressed through explainable AI techniques, and regulations are emerging to ensure responsible use, providing society with mechanisms to control AI's impact.

Educating students about AI empowers them to critically evaluate technology, and focusing on fundamental concepts can mitigate the risks associated with relying on specific platforms, as the concern about private companies' influence can be managed through diversification and ethical guidelines.

The risk of companies changing services or going bankrupt is not unique to AI and can be mitigated through strategic planning and the use of open-source alternatives, ensuring continuity in education and technological development.

[–] possiblylinux127@lemmy.zip 4 points 4 days ago (1 children)

I think students tend to turn to AI when there workload is to much

[–] sin_free_for_00_days@sopuli.xyz 8 points 4 days ago (2 children)

Students will take whatever shortcut they can to be done with it as easily as possible.

[–] fluffykittycat@slrpnk.net 6 points 4 days ago

Honestly school sucks I don't blame them

load more comments (1 replies)
[–] mbtrhcs@feddit.org 9 points 4 days ago

I've literally done my own study on this with CS students and found a similar result. Students who reported using AI regularly couldn't recognize when it wasn't giving them any useful output

[–] arsCynic 5 points 3 days ago (1 children)

The average human's critical thinking skill was already low before AI, so...

[–] Geodad@lemm.ee 3 points 3 days ago

That's largely thanks to religion.

[–] Bob_Robertson_IX@discuss.tchncs.de 2 points 4 days ago (1 children)

It is impacting their critical thinking because the teachers aren't teaching the kids how to use AI!

My kids came home talking about an interview she did with one of her heroes. She knew all kinds of facts, including what type of dogs her hero has. I had to explain to my 8 year old that an AI doesn't know very much, but it will never tell you that it doesn't know something.

I don't mind that my kid's school uses AI for learning, but I am pissed at how they are using it the exact wrong way. It should go side by side with learning critical thinking.

[–] 0x0@lemmy.dbzer0.com 2 points 4 days ago

It's illuminating to ask an LLM a question, and then say it was wrong. In my experience they will do a 180 every time.

[–] SplashJackson@lemmy.ca 1 points 4 days ago

Yeah but that's the plan

[–] Zaleramancer 1 points 3 days ago

Intellectual labor is hard and humans don't like doing difficult things, paired with a culture that's increasingly hostile to education and a government that wants you ignorant- it's easy to see how this happens in the US.