this post was submitted on 26 Sep 2023
8 points (83.3% liked)

Technology

I'm never putting one of these in my home.

top 6 comments
[–] gamer@lemm.ee 1 points 11 months ago

These types of projects are driven by metrics, and teams have some kind of quota/goal that they need to reach by a certain date to keep the project on schedule. Bonuses or job security may be on the line here, and so you may see some desperate employees "going the extra mile" to reach their goals.

Relatedly, Alexa's voice activation sensitivity is essentially a tunable number. It can be changed to be more sensitive, so that it will activate more easily (e.g. maybe you say "Alex" instead of "Alexa"). The people who control this are likely on the team with that deadline, so the incentives are there to lower this value in order to collect more data by recording personal conversations "accidentally". Maybe a bad update goes out that causes Alexa to activate randomly, and they quickly fix it after a few days when they collected all the non-Alexa personal conversations they need for their AI.
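To make the "tunable number" point concrete, here is a minimal, purely hypothetical sketch (not Amazon's actual code; the function and parameter names are invented): wake-word detection generally boils down to an on-device model producing a confidence score that gets compared against a sensitivity threshold, and whoever controls that threshold controls how often the device "accidentally" wakes up.

```python
# Hypothetical illustration only: wake-word activation as a simple
# comparison between a model's confidence score and a tunable threshold.

def should_wake(confidence: float, sensitivity_threshold: float = 0.85) -> bool:
    """Return True if the keyword-spotting confidence clears the threshold."""
    return confidence >= sensitivity_threshold

# Lowering the threshold makes near-misses like "Alex" trigger activation:
print(should_wake(0.70))                              # False at the default
print(should_wake(0.70, sensitivity_threshold=0.65))  # True once "tuned down"
```

Under that (assumed) model, a quiet server-side or firmware change to one number is all it takes to start capturing more non-wake-word audio.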

That's maybe a bit too deep into the paranoia/tinfoil hat spectrum for some, but history has shown that you can't give big tech the benefit of the doubt. Especially when you see some of the documents from the Google trial, where executives discuss rolling back new features to improve arbitrary metrics in the short term so that they can get their bonuses for the quarter, even if it hurts consumers.

[–] MusketeerX@lemm.ee 1 points 11 months ago

Is this a surprise to anyone?

This was already my understanding when I got the first pre-release one in 2014.

Since then, it has mainly learned how to "dim the living area lights to 50%" and "set the AC to 22 degrees". That is about 99% of its use.

Wonder if that's helped its AI much...

[–] lntl@lemmy.ml -1 points 11 months ago (1 children)

haven't we all known this since product launch?

[–] lloram239@feddit.de 0 points 11 months ago* (last edited 11 months ago) (1 children)

I think most people, me included, underestimate the scale of the operation. When you hear "company will use private data to do X", you imagine what a reasonable person would do, like random sample a few conversations here and there. In reality they record everything permanently over months and years, far beyond what would be necessary to run the service.

It's kind of crazy how we get this level of surveillance while still having software that will lose your data if you don't hit Save often enough.

[–] lntl@lemmy.ml -2 points 11 months ago

That's fair. I work with data for a living, so that probably biases my perspective.

[–] Scrof@sopuli.xyz -1 points 11 months ago

Would've been newsworthy if it wasn't the case