this post was submitted on 30 Nov 2024
211 points (88.6% liked)
Technology
It’s not so much that they don’t give a damn; it’s that they can’t tell. I taught some basic English courses with a research component (most students in their first college semester), and each semester I’d drag them to the library for a boring day on how to generate topics, how to discern scholarly sources, and how to use databases like EBSCO or JSTOR to find articles supporting the arguments in the essays they’d be writing for the next couple of years. Inevitably, I’d get back papers citing so-and-so’s blog, PragerU, Wikipedia, or Google’s own search results.

Here’s where a lot of the problem lies: discerning sources, and knowing how to use syntax in searches, which is itself becoming irrelevant on Google etc. but NOT in academic databases. So why take the time to type the “AND” and “OR” and “after: 1980” and “type: peer-reviewed” when you can just write a natural-language question into a search engine and instantly get an answer that looks legit in the snippet?

I’d argue the tech is the problem, because it encourages a certain type of inquiry: quick answers that are plausible but, more often than not, lacking any credibility.
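To make the syntax gap concrete: a natural-language engine takes a loose question as-is, while an academic database expects a structured boolean query with limiters. A minimal sketch of the difference, assuming illustrative field names (the `PY >` year limiter and `peer-reviewed` flag are hypothetical; EBSCO and JSTOR each have their own real syntax):

```python
# Hypothetical helper contrasting structured database queries with
# natural-language search. Field names are illustrative only, not the
# actual syntax of any specific database.

def build_boolean_query(terms, operator="AND", published_after=None,
                        peer_reviewed_only=False):
    """Join search terms with a boolean operator and append limiters."""
    query = f" {operator} ".join(f'"{t}"' for t in terms)
    if published_after is not None:
        query += f" AND PY > {published_after}"   # publication-year limiter
    if peer_reviewed_only:
        query += " AND (peer-reviewed)"           # source-type limiter
    return query

# What the search engine gets vs. what the database needs:
natural = "did rent control help tenants after 1980?"
structured = build_boolean_query(["rent control", "tenants"],
                                 published_after=1980,
                                 peer_reviewed_only=True)
print(structured)
# '"rent control" AND "tenants" AND PY > 1980 AND (peer-reviewed)'
```

The point of the sketch is just that the second form requires the searcher to already know operators, limiters, and field codes, which is exactly the skill the library day tried to teach.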
Is it the tech? Or is it media literacy?
I've messed around with AI on a lark, but would never dream of using it on anything important. I feel like it's pretty common knowledge that AI will just make shit up if it wants to, so even when I'm just playing around with it I take everything it says with a heavy grain of salt.
I think ease of use is definitely a component of it, but in reading your message I can't help but wonder if the problem instead lies in critical engagement. Can they read something and actively discern whether the source is to be trusted? Or are they simply reading what is put in front of them and then turning around to you and saying, "Well, this is what the magic box says. I don't know what to tell you."
No, they also don't give a shit.
I think my kid is gonna be just fine. He rarely believes anything I tell him without follow-up evidence... He's 5.
But I've always focused on critical thinking skills from as early as possible.
You must understand that you are part of an infinitesimal minority.
It makes me sad as I observe the wider population 😭
You're gonna have to write fully cited and sourced academic papers to convince him to clean his room.
Heh, it's been a struggle getting him to comply with authority on simple/mundane tasks. But slowly we're getting past it.