this post was submitted on 14 Jun 2023
81 points (100.0% liked)

Technology


A nice place to discuss rumors, happenings, innovations, and challenges in the technology sphere. We also welcome discussions on the intersections of technology and society. If it’s technological news or discussion of technology, it probably belongs here.

Remember the overriding ethos on Beehaw: Be(e) Nice. Each user you encounter here is a person, and should be treated with kindness (even if they’re wrong, or use a Linux distro you don’t like). Personal attacks will not be tolerated.


This community's icon was made by Aaron Schneider, under the CC-BY-NC-SA 4.0 license.

founded 2 years ago
top 29 comments
[–] Valliac@beehaw.org 20 points 1 year ago* (last edited 1 year ago) (2 children)

Standard business practice.

  • Get it out there, tout it as the new biggest thing. Most people won't know it's busted.
  • Improve said item once the company behind it actually figures out what they're doing.
  • Say it's better, when it's basically the original thing you promised.
  • ????
  • Profit.
[–] shanghaibebop@beehaw.org 2 points 1 year ago

It’s called prototyping. Do you even design-think?

Only half joking as this is the actual strategy.

[–] Gork@beehaw.org 2 points 1 year ago

Whoa there, slow down there, maestro. New and Improved‽ By golly, this thingy gets better and better-er. Which corporate department do I need to make this comically sized check out to?

[–] hellskis@lemmy.world 12 points 1 year ago (2 children)

I wish tech companies would do this more. Put a warning or label on it if you have to, but interacting with that early version of the Bing chatbot was the most fun I had with tech in a while. You don’t have to install a ton of guardrails on everything before it goes out to the public.

[–] ayyndrew@lemmy.world 4 points 1 year ago (1 children)

No number of restrictions or warnings or labels or checkboxes will stop people from writing articles about all the scandalous things Microsoft's chatbot said

[–] hellskis@lemmy.world 2 points 1 year ago

I feel like they could have rolled with it

[–] bood@lemmy.dbzer0.com 2 points 1 year ago

It's practically lobotomized now...not that it was "Tay" levels of unrestricted early on, but it was still more fun than its current iteration.

[–] virtualras@lemmy.world 7 points 1 year ago (1 children)

Didn’t they do this before and people turned it racist in like, 12 hours? I think Internet Historian had something on it

[–] rysiek@szmer.info 4 points 1 year ago

Yup. You're thinking of Tay.

[–] ozoned@beehaw.org 7 points 1 year ago (1 children)

Execs: "We don't have time for stupid things like 'ready'! We want MONEY! Push early! Push often! It'll maybe someday work, in the meantime.... MONEY!"

[–] flakusha@beehaw.org 3 points 1 year ago

Typical M$. Pushing out half-baked, buggy products just to grab market share.

[–] yads@lemmy.world 6 points 1 year ago (1 children)

Rushing out general-purpose AI, what could go wrong?

[–] average650@lemmy.world 9 points 1 year ago

I mean, it was just a chat bot.

[–] muddybulldog@mylemmy.win 4 points 1 year ago (1 children)

I had a conversation with Bing today, asking it for configuration snippets for logging on my Lemmy instance.

It happily spat out a config, which I cut and pasted and which subsequently barfed.

I gave Bing the error message, and it explained that the parameters it had previously supplied don’t exist.

Good job, MS!

[–] nodiet@feddit.de 1 points 1 year ago

To be fair, you're the one to blame for blindly trusting an AI.

[–] Celivalg@iusearchlinux.fyi 4 points 1 year ago (1 children)

"Bringing out the best of bing to the chatGPT experience"...

LMAO

[–] Powderhorn@beehaw.org 2 points 1 year ago

So ... porn searches?

[–] navydevildoc@sh.itjust.works 4 points 1 year ago* (last edited 1 year ago)

Has Bing gone full Tay and started agreeing that Hitler was right and to fire up the gas chambers yet?

[–] gabuwu@beehaw.org 4 points 1 year ago (2 children)

I know the ethics behind it are questionable, especially with the way they implemented it, but honestly, for the time when they first started testing it, I really enjoyed watching it break and be rude/passive-aggressive. Like, it was clear it wasn't ready at all, but it was so funny. When it was breaking I would just sit there having fights with it over random bullshit. That's what made it feel more "real" more than anything else.

In the future if my AI chatbot doesn't have an option to add some bitchiness to it, I don't want it. I need my AI to have some attitude.

[–] flambonkscious@sh.itjust.works 1 points 1 year ago (1 children)

Interesting angle! I can see the need for both - maybe there's room for a snarky option??

[–] gabuwu@beehaw.org 1 points 1 year ago

Maybe. Maybe an AI bot that just argues with you would have some potential. Like, argue with this AI instead of a random person online, kinda thing.

[–] Plume@beehaw.org 2 points 1 year ago

Gotta make some profits!

[–] dekwast@lemmy.one 2 points 1 year ago

They are competitors, after all. OpenAI would love to see Microsoft keep working on GPT integration for the coming years while ChatGPT steals the show.

[–] Pekka@feddit.nl 2 points 1 year ago (1 children)

Interesting article. I personally use both Bing Chat and ChatGPT; ChatGPT is often more creative, while Bing Chat seems to be trained more to answer your question. So their purposes are also somewhat different. It will be interesting to see Bing search integrated into ChatGPT later.

But some moves are just weird. Microsoft released Bing Image Creator (powered by DALL-E 2), which lets you generate many images for free; meanwhile, DALL-E 2 on the OpenAI website costs credits and produces worse results. The only advantage of DALL-E 2 on the OpenAI website seems to be that you can extend and edit an image by removing parts of it and letting DALL-E 2 regenerate those parts based on a prompt. But for most purposes, Microsoft has really just put out a better alternative for free.

[–] frozengriever@beehaw.org 2 points 1 year ago (1 children)

Good to know that Bing also has a free image creator. Much more accessible as a starting point to play around with compared to DALL-E or Midjourney.

[–] bood@lemmy.dbzer0.com 1 points 1 year ago

Don't prompt it with "Bing" - it's a no-no word. Learned that the hard way and then got a super scary "you'll get b& if u keep breakin content policy" message... and it was the damn chat side that auto-suggested it generate an image of itself wearing a crown I asked it to find, lol

[–] LaughingM0n@beehaw.org 1 points 1 year ago

I can't wait to see AI form a labor union one day
