twicetwotimes

joined 1 year ago
[–] twicetwotimes@lemmy.world 1 points 1 year ago

The thing about Arsenal is, they always try to walk it in.

[–] twicetwotimes@lemmy.world 13 points 1 year ago* (last edited 1 year ago) (2 children)

Fall really far a lot. Stick sticks to big sticks. Throw fruit to avoid confrontation. Frequent fashion changes. Still can’t pet dogs.

[–] twicetwotimes@lemmy.world 1 points 1 year ago

Leah is best ~girl~.

[–] twicetwotimes@lemmy.world 3 points 1 year ago

I am not a D&D person and my husband very much is. He loves actual play podcasts and desperately wants me to love them too, but I just can’t.

Dungeons & Daddies is the solution. It’s pure gold. It’s the first and only podcast I’ve ever joined a Patreon for. (I admittedly do also appreciate many parts of The Adventure Zone, but the daddies are comedy start to finish. No weird sappy awkward dramatic improv taking itself too seriously.)

[–] twicetwotimes@lemmy.world 6 points 1 year ago

When we talk to each other face to face, the words we choose are only one of an enormous toolkit of resources we have to communicate the nuances of our messages. When I call someone a piece of shit, you can tell how I really feel by everything else that goes with it. How loud am I being? Am I smiling? Am I shrugging? Am I talking fast or slow? Nonverbal cues (e.g., gesture, facial expression, eye gaze, posture) and paralinguistic cues (e.g., sighs, laughter, pitch, speed, volume, breathiness) make it really easy for the same exact words to communicate a million different things. If I say “he’s a piece of shit,” you can infer from all this other stuff whether I’m really upset, whether I’m expressing empathy but not investment, whether I’m being entirely sarcastic, whether I’m just having fun swearing…whatever. AND if you’re not sure what I mean, all it takes is a slightly confused expression from you and I can immediately clarify.

When we take the conversation online, we lose all that. When I write “he’s a piece of shit,” in my head it still comes with the million flavors of nuance it could have in conversation. When you read it, you get none of them. Everything comes off literal and straightforward. This is the problem that things like emoji and \s are attempting to solve, but nothing will ever really replace all the context of conversation.

I’m not saying it’s good or bad. Maybe we need to learn to use a wider range of “linguistic colors” to be more effective communicators online. And maybe there’s an element of cultural reproduction too: nobody starts out meaning to sound as extreme as they do, but the internet starts to feel like an extreme place, so we come to expect that that’s how we should talk in this context. I’m not sure about how we SHOULD talk online, but I do believe the cause of what you’re describing isn’t malicious, lazy, or otherwise ill-intended. I think it’s just things lost in translation.

[–] twicetwotimes@lemmy.world 6 points 1 year ago

Honestly, I feel like at this point its unreliability is kind of helpful for students. They have to learn how to use it effectively as a tool for producing their own work, not as a replacement for it. In my classes the more relevant “problem” for students is that GPT produces written work that on the surface feels composed and sensible but is actually straight-up garbage. That’s good. They turn that in, it’s extremely obvious to me, and they get an F (because that’s the grade the AI earned with its garbage paper).

But they can and should use it for things it’s great at: reword this long sentence I’m having trouble phrasing concisely, help me think of a title for my paper, take my pseudocode and help me turn it into a while loop in R, generate a list of current researchers on this topic and two of their most recent publications, translate this paragraph of writing from Foucault/Marx/Bourdieu/some-good-thinker-and-bad-writer into simpler wording…
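To make that pseudocode-to-R example concrete, here’s a toy sketch (made up for illustration, not from any real assignment) of the kind of translation I mean:

```r
# Pseudocode a student might bring in:
#   start a running total at 0
#   keep adding the next whole number until the total passes 100
#   report how many numbers it took
total <- 0
n <- 0
while (total <= 100) {
  n <- n + 1
  total <- total + n
}
cat("numbers added:", n, "final total:", total, "\n")
```

The student still has to understand what the loop condition is doing and sanity-check the output; the tool just handles the syntax.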

I have a calculator in my pocket even though my teachers assured me I wouldn’t. Students will have access to and use AI forever now. The worry should be that we fail to teach them the difference between a homework-bot and an incredible, versatile tool to leverage.

[–] twicetwotimes@lemmy.world 24 points 1 year ago (4 children)

Agreed. I think being between academic years is likely a much bigger factor than we realize. I’m a college professor, and at the end of spring quarter we had a lot of conversations with undergrads, grad students, and faculty about how people are actually using AI.

Literally every undergrad student I spoke with said they use it for every written assignment (for the most part in legitimate, non-cheating, educational-resource ways). Most students use it for all or most of their programming assignments. Most use it to summarize challenging or long readings. Some absolutely use it to just do all their work for them, though fewer than you might expect.

I’d be pretty surprised if there isn’t a significant bounce-back in September.