this post was submitted on 18 May 2024
223 points (100.0% liked)

top 33 comments
[–] thevoiceofra@mander.xyz 89 points 6 months ago* (last edited 6 months ago) (2 children)

>put messages into someone else's system

>don't read privacy policy

>someone else uses your messages

surprisedpikachu.jpg

[–] octopus_ink@lemmy.ml 38 points 6 months ago* (last edited 6 months ago)

Seriously. What would be surprising is if they were not. Proprietary System gonna Proprietary System.

[–] sabreW4K3@lazysoci.al 4 points 6 months ago (1 children)

Just the other day, me and @rottingleaf@lemmy.zip "designed" a new messenger to combat things like this: https://lazysoci.al/comment/9619656

[–] r4venw@kbin.social 6 points 6 months ago (1 children)

Your idea doesn't sound too difficult to implement, but I don't know if people would want to store all these messages locally when the vast majority are used to having their shit stored elsewhere. Additionally, if you wanted to target enterprise users, they would likely want all their messages centralised for auditing purposes.

Other than that, I think it's a pretty neat idea.

[–] sabreW4K3@lazysoci.al 2 points 6 months ago

I think that's the issue. We're all so used to the idea of free storage and we're not cognizant of the consequences. If we start holding some of our chips in our own hands, all these corporations won't be able to sell us out and abuse us so easily.

Also thank you!
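
A rough sketch of what "holding some of our chips in our own hands" could look like in practice: each client keeps its own message history in a local SQLite file instead of handing it to a hosted service. Everything below (file name, table layout, function names) is invented for illustration and isn't a description of the messenger linked above.

```python
import sqlite3
import time

# Hypothetical local-first message store: history lives in a file the
# user controls, not on a provider's servers.
DB_PATH = "my_messages.db"  # kept on the user's own machine


def init_store(path: str = DB_PATH) -> sqlite3.Connection:
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS messages (
               id INTEGER PRIMARY KEY,
               peer TEXT NOT NULL,       -- who the conversation is with
               direction TEXT NOT NULL,  -- 'sent' or 'received'
               body TEXT NOT NULL,
               sent_at REAL NOT NULL     -- unix timestamp
           )"""
    )
    return conn


def save_message(conn: sqlite3.Connection, peer: str, direction: str, body: str) -> None:
    conn.execute(
        "INSERT INTO messages (peer, direction, body, sent_at) VALUES (?, ?, ?, ?)",
        (peer, direction, body, time.time()),
    )
    conn.commit()


def history(conn: sqlite3.Connection, peer: str) -> list:
    return conn.execute(
        "SELECT direction, body, sent_at FROM messages WHERE peer = ? ORDER BY sent_at",
        (peer,),
    ).fetchall()


if __name__ == "__main__":
    conn = init_store()
    save_message(conn, "rottingleaf@lemmy.zip", "sent", "What if history never left the device?")
    print(history(conn, "rottingleaf@lemmy.zip"))
```

The auditing concern raised above wouldn't have to be a blocker either: an organisation could export or mirror selected conversations on its own terms rather than defaulting everything to a provider's servers.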

[–] Plume@lemmy.blahaj.zone 76 points 6 months ago* (last edited 6 months ago) (4 children)
  1. "Alright guys, it's time to leave Slack for a better alternative!"
  2. Proceeds to migrate to yet another proprietary and centralized piece of software.
  3. It happens again.
  4. "Alright guys, it's time to leave [insert software name here] for a better alternative!"
  5. Proceeds to migrate to yet another proprietary and centralized piece of software, again.
  6. It happens again, again.
  7. Clown moment.

It's what's going to happen. It's what always happens. And on a side note, by the way, I guaran-fucking-tee you that it's what's going to eventually happen with Discord as well. I have zero doubt about it.

[–] sabreW4K3@lazysoci.al 33 points 6 months ago* (last edited 6 months ago)

I'm surprised that Slack beat Discord to it.

But yeah, you're right. We need to invest our time, energy and support into self-hosted solutions.

[–] SineSwiper@discuss.tchncs.de 23 points 6 months ago (3 children)

Do you know how to break the cycle? Use open-source software. Use standard protocols that aren't locked behind some greedy corporation.

Why not take the features from Discord/Slack and integrate them into a new IRC or Jabber protocol?
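
To make the "standard protocols" point concrete: IRC is just lines of text over a TCP socket, so any client can speak it without a vendor's SDK or permission. A minimal sketch, where the nickname and channel are placeholders and a real client would also answer server PINGs and handle errors:

```python
import socket

# Minimal IRC sketch: the protocol is plain text over TCP, so nothing
# here depends on any one vendor's servers or tooling.
SERVER = "irc.libera.chat"   # example public network
PORT = 6667                  # plain-text IRC port
NICK = "demo_nick_12345"     # placeholder nickname
CHANNEL = "#demo-channel"    # placeholder channel


def send_line(sock: socket.socket, line: str) -> None:
    sock.sendall((line + "\r\n").encode("utf-8"))


with socket.create_connection((SERVER, PORT)) as sock:
    send_line(sock, f"NICK {NICK}")
    send_line(sock, f"USER {NICK} 0 * :demo user")
    send_line(sock, f"JOIN {CHANNEL}")
    send_line(sock, f"PRIVMSG {CHANNEL} :hello from a tiny hand-rolled client")

    # Print whatever the server sends back for a few seconds, then leave.
    sock.settimeout(5)
    try:
        while True:
            data = sock.recv(4096)
            if not data:
                break
            print(data.decode("utf-8", errors="replace"), end="")
    except socket.timeout:
        pass
    send_line(sock, "QUIT :bye")
```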

[–] Plume@lemmy.blahaj.zone 7 points 6 months ago* (last edited 6 months ago) (1 children)

Has anybody tried Revolt? It looks really cool. Like a proper open source alternative to Discord. But I never had the opportunity to try it with anyone, so I don't know.

[–] jlow@beehaw.org 2 points 6 months ago

Interesting, the name sounds familiar. It's not based on Matrix, and they're planning encrypted messages. Oh, you can self-host it!

[–] Thann@lemmy.ml 6 points 6 months ago

Mumble hasn't fed any of my data to a megacorp!

[–] eveninghere@beehaw.org 2 points 6 months ago

Technically, being open source or free à la GPL isn't enough. Protocols aren't enough.

You need a guarantee that you own your data.

[–] tal@lemmy.today 7 points 6 months ago* (last edited 6 months ago) (1 children)

I mean, if the Threadiverse has enough volume to be useful, someone -- probably many people -- are going to be logging and training things off it too.

[–] applepie@kbin.social 10 points 6 months ago

That's the nature of public shit posting.

The real issue is that tech creeps and other pests think they own my shit posting.

[–] Megaman_EXE@beehaw.org 5 points 6 months ago* (last edited 6 months ago) (4 children)

At this point, I think the genie is out of the bottle. I feel like unless you're on some p2p encrypted chat, anything typed into the internet is getting scraped. I'm sure everyone at this point has had at least one comment scraped and used for language model stuff.

I don't like it. But it seems like corporations will always find ways to make money off of other people no matter what
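
For what it's worth, the "unless you're on some P2P encrypted chat" caveat is easy to illustrate. A toy sketch using PyNaCl (`pip install pynacl`); the names and the message are invented, and the point is that a relay or scraper in the middle only ever sees ciphertext:

```python
from nacl.public import PrivateKey, Box

# Each participant generates a keypair on their own device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob with her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"meet at the usual place")

# This blob is all a scraping server would ever get to "train" on.
print(ciphertext.hex())

# Bob decrypts on his device with his private key and Alice's public key.
receiving_box = Box(bob_key, alice_key.public_key)
print(receiving_box.decrypt(ciphertext).decode())
```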

[–] B0rax@feddit.de 5 points 6 months ago

To be honest, if someone thought that public things on the internet are not getting scraped, I am not sure what to tell them… Search engines have been doing it since the beginning of search engines, it is no wonder that the same would be done to train AI.

[–] eveninghere@beehaw.org 2 points 6 months ago

Corporations are bad and yet still follow laws (in the west). The bigger issue is state actors. Especially the non-democratic ones.

[–] Bartsbigbugbag@lemmy.ml 2 points 6 months ago

It’s not "no matter what". It’s that under the system we have, they are not only not punished for doing so, they are heavily incentivized to do so. There are ways to punish bad actors that disincentivize other potential bad actors; our politicians actively choose to prioritize these bad actors’ ability to do harm over the well-being of the population.

[–] bilb@lem.monster 2 points 6 months ago* (last edited 6 months ago)

I honestly don't get the outrage over that. I feel like I'm in the minority on that, though. I don't care if linguistic statistics are gathered from my public comments. Knock yourself out.

This story is about "private" messages on a free hosted service, and I think their users are just being naive if they think this is beyond the pale. But I get the feeling of violation at least a little.

[–] blabber6285@sopuli.xyz 15 points 6 months ago* (last edited 6 months ago) (1 children)

This was definitely a fuckup from Slack, but as I understand it, the "AI training" means that they're able to suggest emoji reactions to messages.

Not sure how to think about this, but here's some additional info from slack: https://slack.engineering/how-we-built-slack-ai-to-be-secure-and-private/

Edit: Just to pick the main points from the article:

Slack AI principles to guide us.

  • Customer data never leaves Slack.
  • We do not train large language models (LLMs) on customer data.
  • Slack AI only operates on the data that the user can already see.
  • Slack AI upholds all of Slack’s enterprise-grade security and compliance requirements.
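
As an aside, "suggest emoji reactions" is the kind of feature that barely needs model training at all; a keyword lookup gets you a long way. A purely illustrative toy, with an invented keyword table:

```python
# Toy emoji suggester: no customer data, no training, just a lookup.
EMOJI_HINTS = {
    "thanks": "🙏",
    "congrats": "🎉",
    "shipped": "🚀",
    "bug": "🐛",
    "lunch": "🍕",
}


def suggest_reactions(message: str, limit: int = 3) -> list:
    text = message.lower()
    hits = [emoji for keyword, emoji in EMOJI_HINTS.items() if keyword in text]
    return hits[:limit]


print(suggest_reactions("We finally shipped the fix for that bug, thanks everyone!"))
# ['🙏', '🚀', '🐛']
```
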
[–] esaru@beehaw.org 5 points 6 months ago

AI training to suggest emoji reactions? Really? 😂

[–] Kekzkrieger@feddit.de 13 points 6 months ago (2 children)

I know of a few security companies that use Slack to work together, which includes a shitton of private data, source code and confidential information.

Guess whoever introduced the company to the Slack service fucked up by not reading their policies.

[–] shasta@lemm.ee 6 points 6 months ago

Or they're using the paid tier

[–] dubyakay@lemmy.ca 3 points 6 months ago

I'm working in fintech, and we share PII through DMs all the time (for investigation purposes). I'd be really surprised if the AI would need to train on that.

[–] AlternateRoute@lemmy.ca 9 points 6 months ago

Interesting how MS is the reasonable one here, where all their Copilot stuff clearly separates paying business from free consumer stuff for training / not training.

However, Slack has gone and said they will train on everything, and ONLY the paying companies can request to opt out.

Too bad, so sad for all those small dev teams that have been using the "free" version of Slack... No option to opt out.

[–] Paragone@beehaw.org 3 points 5 months ago

Wasn't there a competitor named Mattermost?

A FLOSS competitor?

[–] Andromxda@lemmy.dbzer0.com 2 points 6 months ago (1 children)

Stay away from proprietary crap like Discord, Slack, WhatsApp and Facebook Messenger. There are enough FOSS alternatives out there:

  • You just want to message a friend/family member?
  • You need strong privacy/security/anonymity?
    • SimpleX
    • Session
    • Briar
    • I can't really tell you which one is the best, since I never used any of these (except for Session) for an extended period of time. Briar seems to be the best for anonymity, because it routes everything through the Tor network. SimpleX allows you to host your own node, which is pretty cool.
  • You want to host an online chatroom/community?
  • You need to message your team at work?
  • You want a Zoom alternative?

[–] Templa@beehaw.org 3 points 6 months ago (1 children)

In a perfect world, you could convince your company to use anything other than MS Teams and your family would bother to use anything that isn't WhatsApp or Telegram. Unfortunately, I don't live in it 😭

[–] Andromxda@lemmy.dbzer0.com 2 points 6 months ago (2 children)

Ok sure, it's more complicated in a corporate environment. But you can easily convince your friends to switch to Signal. I got almost all of my friends and family to use Signal and it's great.

[–] sabreW4K3@lazysoci.al 1 points 6 months ago (1 children)

The problem with Signal is that it's just not very user-oriented.

[–] Andromxda@lemmy.dbzer0.com 1 points 6 months ago

Wdym? The user experience is basically 1:1 the same as on WhatsApp

[–] Capitao_Duarte@lemmy.eco.br 1 points 6 months ago

> I got almost all of my friends and family to use Signal

That should be easy, since I'd have to convince one guy to do so. Won't happen, though

[–] tunetardis@lemmy.ca 1 points 6 months ago

We need to watermark insert something into our watermark posts that watermark can be traced back to its origin watermark if the AI starts training watermark on it.
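
Joking aside, the mechanics of the gag are roughly implementable: sprinkle a marker token through a post, then check later whether a scraped dataset or a model's output still carries it. A throwaway sketch, purely illustrative and trivially defeated by anyone who strips the marker before training:

```python
MARKER = "watermark"


def add_watermark(text: str, every: int = 4) -> str:
    """Insert the marker token after every few words."""
    words = text.split()
    out = []
    for i, word in enumerate(words, start=1):
        out.append(word)
        if i % every == 0:
            out.append(MARKER)
    return " ".join(out)


def looks_watermarked(text: str, threshold: int = 3) -> bool:
    """Crude check: does the text still contain enough marker tokens?"""
    return text.split().count(MARKER) >= threshold


post = "We need to insert something into our posts that can be traced back to its origin"
marked = add_watermark(post)
print(marked)
print(looks_watermarked(marked))  # True
```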