this post was submitted on 30 May 2024
210 points (94.1% liked)

Asklemmy


So my company decided to migrate office suite and email etc to Microsoft365. Whatever. But for 2FA login they decided to disable the option to choose "any authenticator" and force Microsoft Authenticator on the (private) phones of both employees and volunteers. Is there any valid reason why they would do this, like it's demonstrably safer? Or is this a battle I can pick to shield myself a little from MS?

[–] Nighed@sffa.community 29 points 6 months ago (4 children)

The MS authenticator works in 'reverse', in that you type the code shown on the screen into the phone. I assume this is preferable for corporate because you can't be socially engineered into giving out a 2FA token. It also has a "no, this wasn't me" button to allow you to (I assume) notify IT if you are getting sign-in requests that are not from you.

I don't believe that the authenticator app gives them access to anything on your phone? (Happy to learn here.) And I think Android lets you make some kind of business partition (a work profile) if you feel the need to?
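The 'reverse' flow described above (Microsoft calls it number matching) can be sketched roughly like this. This is a toy illustration of the idea, not Microsoft's actual implementation:

```python
import secrets

def start_push_challenge() -> str:
    """Server side: generate the two-digit number the login screen displays.
    The user must type this exact number into the phone app."""
    return f"{secrets.randbelow(100):02d}"

def verify_push_response(displayed: str, entered: str) -> bool:
    """The sign-in only proceeds if the number entered on the phone matches
    the one shown on the login screen. An attacker who merely triggers a
    push notification has nothing useful to phish out of the user, because
    the secret flows from the screen to the phone, not the other way."""
    return secrets.compare_digest(displayed, entered)
```

The key property: unlike a classic TOTP code, there is no user-side secret to read out over the phone to a scammer.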

[–] Max_P@lemmy.max-p.me 15 points 6 months ago (1 children)

And the authenticator is configurable: through the Intune work profile they can enforce some device security, like requiring the device to be unrooted, the bootloader locked, and storage encryption turned on. If you work at a bank, you don't want the 2FA secret to even live on a device where the user gives root access to random apps that could extract the keys (although at that point, come on, you can probably afford YubiKeys).

As a user, not a fan, but as an IT department it makes complete sense.
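The kind of compliance gate described above can be sketched as a small policy check. This is a hypothetical illustration; the posture field names are invented, not Intune's real schema:

```python
# Hypothetical device-posture policy in the spirit of Intune compliance
# rules; the attribute names below are made up for illustration.
REQUIRED_POSTURE = {
    "rooted": False,            # no root/jailbreak
    "bootloader_locked": True,  # bootloader must be locked
    "storage_encrypted": True,  # device encryption must be enabled
}

def is_compliant(posture: dict) -> bool:
    """Refuse to issue auth tokens unless every required check passes."""
    return all(posture.get(key) == want
               for key, want in REQUIRED_POSTURE.items())
```

A rooted device fails the check even if everything else is in order, which is exactly the point: the secret never lands on hardware the attacker could already own.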

[–] ramble81@lemm.ee 3 points 6 months ago (1 children)

You’re thinking of Intune and the Company Portal app. That’s where the device enforcement comes into play. Authenticator can be installed on any system regardless of its state and their enforcement policies.

[–] deweydecibel@lemmy.world 0 points 6 months ago (1 children)

For now.

The point is, the patterns in software security are pretty clear. People will keep finding ways around the authenticator, eventually someone will get their account compromised, and at some point it will get more restrictive.

It doesn't matter how it works now, because once it's normalized that this Microsoft app must be on your phone so you can work, and it must operate exactly as it wishes to, Microsoft will be able to start pushing more restrictions.

At a certain point, the device simply has to be verified as secure in and of itself before it can keep another device secure. Meaning your phone will be brought under your workplace's security policies.

[–] ramble81@lemm.ee 4 points 6 months ago

What? No. This is complete hyperbole and speculation, and wrong at that. Their Authenticator is used for personal accounts as well as for managing third-party TOTP tokens. It's no different from Google Authenticator, Duo Mobile or Okta Verify. I could see that happening down the road if they came out with a business-only version, but given that everything is backed by the same platform, it wouldn't benefit them to do that.

[–] lemmylommy@lemmy.world 4 points 6 months ago (1 children)

Hello, this is your IT department/Microsoft/the pope's second mistress. We need you to test/revalidate/unfuckulate your Microsoft Authenticator by entering this code….

[–] Carighan@lemmy.world 3 points 6 months ago (1 children)

Yeah and that wouldn't work, as they would not be able to generate a valid 2FA code.

[–] Nighed@sffa.community 0 points 6 months ago (1 children)

Bad actor goes to the super secret page while 'fixing' an issue for the user. They then get the two-digit request code and ask the user to input it to 'resolve' the issue.

Mostly the same as any other 2FA social engineering attack, I guess, but the user's phone does display what the code is for on the screen, which could help... though if you're falling for it, probably not.

[–] Carighan@lemmy.world 2 points 6 months ago (1 children)

Yeah but that's a wholly different attack, and oodles more complex to pull off. Doable, sure. But it's absolutely not the same thing as phishing for a valid 2FA code that is generated user-side.

And don't get me wrong, both are overall very secure. But there is a case to be made for push auth.

[–] Nighed@sffa.community 0 points 6 months ago (1 children)

It's not that different is it? You still need to get a user to share/enter a live code?

[–] AtariDump@lemmy.world 1 points 6 months ago (1 children)

One requires the user to go to a bad page and get a spoofed 2FA code so the bad guy can log in.

Do you know how hard that is? Not worth it for 99% of hacks.

The other requires that the user read off the six-digit code shown on their device.

Trivially easy, since they already have the user's password.

[–] Nighed@sffa.community 0 points 6 months ago (1 children)

It requires the bad guy to go to the page themselves and then ask the user to enter the code that the bad guy's login attempt generates.

[–] AtariDump@lemmy.world 2 points 5 months ago (1 children)

How does the bad guy get to the page?

Then how does he get the user to enter in that code into their mobile device?

[–] Nighed@sffa.community 1 points 5 months ago

You can probably get the URL for a company's SharePoint pretty easily, but you need a login. You might be able to get a PA's credentials through a phishing link etc., but you'd still need the 2FA code.

You do the IT phishing attack (enter this code for me to fix your laptop being slow...), get them to enter the code and now you have access to a SharePoint instance full of confidential docs etc.

I'm not saying it's a great attack vector, but it's not that different to a standard phishing attack.

You could attack anything that's using the single sign on. Attack their build infrastructure and you now have a supply chain attack against all of their customers etc.

It helps, but it's not enough to counter the limits of human gullibility.

[–] Carighan@lemmy.world 2 points 6 months ago (1 children)

I mean, the only real issue I see with this is that they require people to use their personal phones for it. You shouldn't mix work and private data, and this should be in the interest of the corp, too. As in: issue work phones!

[–] Nighed@sffa.community 2 points 6 months ago (1 children)

From a practical PoV - most people have their phone on them all the time. A work phone or a physical token can (and will) get forgotten, a personal phone much less.

[–] Carighan@lemmy.world 2 points 6 months ago

Yeah, but legally it's a bit more iffy once something gets breached and it turns out that, no, private phones are not covered by the security policies you signed for work. They usually cannot be; in fact, most written policy explicitly forbids people from using their private phones for things like this, even at companies that expect workers to do it.

[–] englislanguage@lemmy.sdf.org -1 points 6 months ago

If it is just TOTP, you can use any other TOTP app, such as Aegis or FreeOTP+.

And no, Microsoft cannot be trusted not to do anything bad. The app is full of trackers and has an excessive list of permissions it "requires".

For comparison, Aegis and FreeOTP+ work without trackers and with far fewer permissions.

Microsoft has a long track record of leaks. Just naming the two most prominent:

  1. Microsoft Edge leaks every single URL you visit to Microsoft servers (source)
  2. Microsoft had their general signing key stolen and did not even disclose it for months. It is unclear who had access to that key. This put anyone who uses any Microsoft product at risk. (See for example here)