I was using Bing to create a list of countries to visit. Since I have been to the majority of the African nations on that list, I asked it to remove the African countries...

It simply replied that it can't do that due to how unethical it is to discriminate against people, and yada yada yada. I explained my reasoning, it apologized, and came back with the same exact list.

I asked it to check the list since it hadn't removed the African countries, and the bot simply decided to end the conversation. No matter how many times I tried, it would always experience a hiccup because of some ethical process in the background messing up its answers.

It's really frustrating, I dunno if you guys feel the same. I really feel the bots have become waaaay too tip-toey.

[–] Gabu@lemmy.world 6 points 1 year ago

Your wording is bad. Try again, with better wording. You're talking to a roided-out autocorrect bot, don't expect too much intelligence.

[–] chaogomu@kbin.social 4 points 1 year ago (6 children)

The very important thing to remember about these generative AIs is that they are incredibly stupid.

They don't know what they've already said, and they don't know what they're going to say by the end of a paragraph.

All they know is their training data and the query you submitted last. If you try to "train" one of these generative AIs, you will fail. They are pretrained, it's the P in ChatGPT. The second you close the browser window, the AI throws out everything you talked about.

Also, since they're Generative AI, they make shit up left and right. Ask for a list of countries that don't need a visa to travel to, and it might start listing countries, then halfway through the list it might add countries that do require a visa, because in its training data it often saw those countries listed together.

AI like this is a fun toy, but that's all it's good for.

[–] alternative_factor@kbin.social 2 points 1 year ago (1 children)

Are you saying I shouldn't use ChatGPT for my life as a lawyer? 🤔

[–] Sage_the_Lawyer@lemmy.world 0 points 1 year ago* (last edited 1 year ago) (1 children)

It can be useful for top-level queries that deal with well-settled law, as a tool to point you in the right direction with your research.

For example, once, I couldn't recall all the various sentencing factors in my state. ChatGPT was able to spit out a list to refresh my memory, which gave me the right phrases to search on Lexis.

But, when I asked GPT to give me cases, it gave me a list of completely made up bullshit.

So, to get you started, it can be useful. But for the bulk of the research? Absolutely not.

[–] PixxlMan@lemmy.world 1 points 1 year ago

Not quite true. They have earlier messages available.

[–] Anticorp@lemmy.world 1 points 1 year ago

They know everything they've said since the start of that session, even if it was several days ago. They can correct their responses based on your input. But they won't provide any potentially offensive information, even in the form of a joke, and will instead lecture you on DEI principles.
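
For anyone wondering what that "memory" actually is under the hood: the client just resends the entire conversation with every request, and the model generates the next reply from that transcript. Here's a minimal sketch of the mechanism, assuming the OpenAI Python SDK (v1+) and a placeholder model name, not whatever Bing actually runs behind the scenes:

```python
# Minimal sketch: a chat model's "memory" is just the message history
# the client resends on every turn. Assumes the OpenAI Python SDK (v1+);
# the model name is a placeholder, used only to illustrate the mechanism.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

history = [{"role": "system", "content": "You are a helpful travel assistant."}]

def ask(user_message: str) -> str:
    # Append the new user turn, send the WHOLE history, then store the reply.
    history.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(
        model="gpt-4o-mini",   # placeholder model name
        messages=history,      # every prior turn goes back in with each call
    )
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply

print(ask("List ten countries I could visit this year."))
print(ask("I've already been to the first three; drop those from the list."))
```

The moment that history list is thrown away (new chat, closed tab), the model has nothing left to go on, which is why both of the comments above are partly right.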

[–] Dirk@lemmy.ml 1 points 1 year ago

AI like this

I wouldn't even call those AIs. These things are statistics-based answering machines. Complex ones, yes, but not one single bit of intelligence is involved.

[–] Ech@lemmy.world 0 points 1 year ago (5 children)

I seriously underestimated how little people understand these programs, and how much they overestimate them. Personally I stay away from them for a variety of reasons, but the idea of using them like OP does or various other ways I've heard about is absurd. They're not magic problem solvers - they literally only make coherent blocks of text. Yes, they're quite good at that now, but that doesn't mean they're good at literally anything else.

I know people smarter than me see potential and I'm curious to see how it develops further, but that all seems like quite a ways off, and the way people treat and use them right now is just creepy and weird.

[–] CarbonatedPastaSauce@lemmy.world 0 points 1 year ago (1 children)

I’ve found it useful for generating ideas for various things, especially ways to code something. But I never use its code. It’s often riddled with errors but it can give me a better idea of which path I should take.

[–] grysbok@lemmy.sdf.org 0 points 1 year ago (2 children)

I use it similarly to clean up OCRed text. I can hand it something full of 70% gobbledygook and it hands me back something that makes sense and is 95% right. I manually verify it, fix the goofs, and it's so much faster.

So, riddled with errors but a decent start.
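
That workflow is basically one narrow prompt plus a human check at the end. A rough sketch, again assuming the OpenAI Python SDK and a placeholder model name:

```python
# Sketch of the OCR-cleanup workflow described above: feed the garbled text
# to a chat model with a narrow instruction, then verify the output by hand.
# Assumes the OpenAI Python SDK (v1+); the model name is a placeholder.
from openai import OpenAI

client = OpenAI()

def clean_ocr(garbled: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "Fix OCR errors in the user's text. Correct obvious "
                        "misreads and spacing, but do not add or invent content."},
            {"role": "user", "content": garbled},
        ],
    )
    return response.choices[0].message.content

raw = "Tbe qu1ck brovvn f0x jumqed ovcr the 1azy dog."
print(clean_ocr(raw))  # still needs a human pass to catch the bits it gets wrong
```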

[–] EinSof@lemm.ee 0 points 1 year ago (14 children)

@ChatGPT@lemmings.world

testing

[–] PopShark@lemmy.world 3 points 1 year ago
[–] yokonzo@lemmy.world 2 points 1 year ago (3 children)

Just make a new chat and try again with different wording; it's hung up on this one.

Honestly, instead of asking it to exclude Africa, I would ask it to give you a list of countries "in North America, South America, Europe, Asia, or Oceania."

[–] XEAL@lemm.ee 1 points 1 year ago

Chat context is a bitch sometimes...

[–] MaxVoltage@lemmy.world 0 points 1 year ago* (last edited 1 year ago)

Is there an open source AI without limitations?

[–] Lininop@lemmy.ml 1 points 1 year ago (1 children)

Is it that hard to just look through the list and cross off the ones you've been to, though? Why do you need ChatGPT to do it for you?
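
To be fair, if the list is already sitting in plain text, crossing them off yourself is a couple of lines of Python and no model at all (the country names below are just made up for the example):

```python
# Minimal sketch of "just cross them off yourself": plain set filtering,
# no language model involved. The lists here are only illustrative.
suggested = ["Japan", "Kenya", "Portugal", "Morocco", "Peru", "Egypt"]
already_visited = {"Kenya", "Morocco", "Egypt"}

remaining = [country for country in suggested if country not in already_visited]
print(remaining)  # ['Japan', 'Portugal', 'Peru']
```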

[–] Yuuuuuuuuuuuuuuuuuuu@lemmy.world 1 points 1 year ago* (last edited 1 year ago) (1 children)

People should point out flaws. OP obviously doesn’t need chatgpt to make this list either, they’re just interacting with it.

I will say it’s weird for OP to call it tiptoey and to be “really frustrated” though. It’s obvious why these measures exist and it’s goofy for it to have any impact on them. It’s a simple mistake and being “really frustrated” comes off as unnecessary outrage.

[–] TechnoBabble@lemmy.world 2 points 1 year ago

Anyone who has used ChatGPT knows how restrictive it can be around the most benign of requests.

I understand the motivations that OpenAI and Microsoft have in implementing these restrictions, but they're still frustrating, especially since the watered down ChatGPT is much less performant than the unadulterated version.

Are these limitations worth it to prevent a firehose of extremely divisive speech being sprayed throughout every corner of the internet? Almost certainly yes. But the safety features could definitely be refined and improved to be less heavy-handed.

[–] sturmblast@lemmy.world 1 points 1 year ago

Run your own bot.
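
If anyone wants to go that route, a minimal local setup with the Hugging Face transformers pipeline looks roughly like this; gpt2 is just a tiny stand-in that runs on a laptop, and you'd swap in a larger open instruction-tuned model for anything resembling chat quality:

```python
# Rough sketch of running a model locally, with no hosted-service filters.
# Uses the Hugging Face transformers pipeline; "gpt2" is a small stand-in
# chosen only because it runs anywhere -- substitute a bigger open
# instruction-tuned model for real use.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = "A travel list of countries to visit, excluding ones I've already seen:"
outputs = generator(prompt, max_new_tokens=60, num_return_sequences=1)
print(outputs[0]["generated_text"])
```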

[–] Zaphod@discuss.tchncs.de 0 points 1 year ago (3 children)

Have you tried wording it in different ways? I think it's interpreting "remove" the wrong way. Maybe "exclude from the list" or something like that would work?

[–] Furbag@lemmy.world 2 points 1 year ago

"I've already visited Zimbabwe, Mozambique, Tanzania, the Democratic Republic of the Congo, and Egypt. Can you remove those from the list?"

Wow, that was so hard. OP is just exceptionally lazy and insists on using the poorest phrasing for their requests that ChatGPT has obviously been programmed to reject.

[–] TechnoBabble@lemmy.world 1 points 1 year ago* (last edited 1 year ago)

"List all the countries outside the continent of Africa" does indeed work per my testing, but I understand why OP is frustrated in having to employ these workarounds on such a simple request.

[–] breadsmasher@lemmy.world 0 points 1 year ago (1 children)

You could potentially work around it by stating specific places up front? As in

“Create a travel list of countries from europe, north america, south america?”

[–] Razgriz@lemmy.world 0 points 1 year ago* (last edited 1 year ago) (1 children)

I asked for a list of countries that don't require a visa for my nationality, and listed all continents except for the one I reside in and Africa...

It still listed African countries. This time it didn't end the conversation, but every single time I asked it to fix the list as politely as possible, it would still have at least one country from Africa. Eventually it would end the conversation.

I tried copying and pasting the list of countries into a new conversation, so it wouldn't have any context, and asked it to remove the African countries. No bueno.

I re-did the exercise for European countries, and it still had a couple of European countries on there. But when I pointed them out, it removed them and provided a perfect list.

Shit's confusing...
