Encryption Security

Worries Arise About Security of New WebAuthn Protocol (zdnet.com) 57

An anonymous reader writes: "A team of security researchers has raised the alarm about some cryptography-related issues with the newly released WebAuthn passwordless authentication protocol," reports ZDNet. "The new WebAuthn protocol will allow users of a device -- such as a computer or a smartphone -- to authenticate on a website using a USB security key, a biometric solution, or their computer's or smartphone's password." But the researchers say that because WebAuthn supports weak algorithms for the operation of registering a new device, they can pull off some attacks against it.

"If converted into a practical exploit, the ECDAA attacks discussed in the article would allow attackers to steal the key from a [server's] TPM, which would allow attackers to effectively clone the user's hardware security token remotely," Arciszewski, one of the researchers, told ZDNet. "The scenarios that follow depend on how much trust was placed into the hardware security token," he added. "At minimum, I imagine it would enable 2FA bypasses and re-enable phishing attacks. However, if companies elected to use hardware security tokens to obviate passwords, it would allow direct user impersonation by attackers." Attacks aren't practical, and experts say the root cause relies in badly written documentation that may fool some implementers into supporting the old algorithms instead of newer and more solid ones. The FIDO Alliance was notified and has started work on updating its docs so it won't look like it's recommending ECDAA or RSASSA-PKCS1-v1_5. "PKCS1v1.5 is bad. The exploits are almost old enough to legally drink alcohol in the United States," Arciszewski said.

  • by thegarbz ( 1787294 ) on Monday September 10, 2018 @03:21AM (#57282946)

    If algorithms are known to be weak, why are they included in new standards? Are they expensive or are there compatibility reasons why we don't implement the "best" in the newest standards?

    I know nothing about this, but the way the summary was written would imply that only the registration of devices is weak; does that mean the actual authentication uses a strong algorithm?

    • by Anonymous Coward

      Because the standard is effectively written by Google

      • by Zocalo ( 252965 ) on Monday September 10, 2018 @03:43AM (#57282996) Homepage
        In part, but as with most security SNAFUs where people really should have known better, I'm also wondering how much involvement the intelligence services like the NSA, GCHQ, etc. may have had behind the scenes. It's well documented that governments have been looking to get backdoors into secure web protocols one way or another (legislation being the means du jour), and what better way to do that than with an end-run around the whole problem by compromising users' accounts and simply acquiring their login details? Sure, the researchers might be claiming that some of the attacks are not really practical for typical attackers, but the NSA etc. are not really typical attackers, especially since they have things like NSLs in their toolbox.

        If so, it's good to see that they are *still* only paying lip service to the notion that if only $friendly_governments have knowledge of the backdoor and the necessary computational resources, then it's just a matter of time before $not_so_friendly_governments, $very_unfriendly_governments, $cyber_criminals, and (eventually) $every_script_kiddie_and_their_dog have the necessary knowledge and resources too. Perhaps they think Snowden was a one-off or something?
        • NSA etc. are not really typical attackers

          I think they are very typical attackers. Kids fooling around won't operate on a large scale. Anyone else does. It does not matter if these actors are intelligence agencies vacuuming up all internet traffic, half-legal copyright "cops", or downright criminal organizations that are only after your *coins and banking details. The main difference is the amount of time it takes for these data to fall into the wrong hands.

          That said, it might be perfectly possible that criminal organizations are trying to promote

    • by Anonymous Coward on Monday September 10, 2018 @03:25AM (#57282964)

      Because no actual experts were involved in making the standards.

      This is the usual case when "web"-anything is involved, as evidenced by, for example, the entire works of the W3C.

    • the algorithm is fine, it's the encoding... i.e. this RFC, which, to give the authors credit, they updated...

      To be honest, I suspect this was driven by hardware, and the "standard" defines other algorithms, so it's simply a case of getting rid of that combination.
      It's a good thing that people are auditing these before they become enshrined, and I hope WebAuthn becomes stronger because of it.

      WebAuthn is so, so much better than the alternatives currently in place... we really, really need to give people an alternative

      • by Anonymous Coward on Monday September 10, 2018 @05:34AM (#57283116)

        They could link WebAuthn to classic certificates, because those are directly available and could even be used today as an SSL option.
        But, for some reason, browser developers (MS, Google, the Mozilla Foundation...) removed the signature functionality without developing an alternative, and WebAuthn seems to follow the same logic: pushing new hardware without offering an option to use classical cryptographic standards (RSA tokens, PKCS11, CryptoAPI on Windows...) from within WebAuthn.
        It seems they are trying to destroy PKI instead of creating a true replacement for passwords. PKI is available now, and WebAuthn could add PKI as an option. But it seems they don't want to, and it even seems probable to me that if they succeed, they will later try to kill PKI by suppressing SSL client certificates as an authentication system once WebAuthn is well adopted.

        PKI authentication and signatures directly available in the browser should be a high-priority feature, but they refuse to do it. Why?

      • by nasch ( 598556 )

        How is the encoding the problem? Generally encoding is not a security measure, but an interoperability measure. If you're relying on encoding rather than encryption, that doesn't sound like security at all. Maybe I'm misunderstanding something.

    • by AmiMoJo ( 196126 )

      From TFA:

      "In subsequent email exchanges with the Paragon team, ZDNet understands that at the heart of the issue may be the confusing WebAuthn documentation released by the FIDO Alliance team, which, for legacy purposes, categorizes both algorithms as "required" (for RSASSA-PKCS1-v1_5) and "recommended" (two ECDAA-based algorithms)."

      TL;DR they adopted an older standard to avoid building something from scratch, and there is some old and poorly worded advice in the documentation.

  • by Anonymous Coward on Monday September 10, 2018 @04:16AM (#57283020)

    Yes, I know. It's easy to choose a weak password. And then you write it with a Sharpie on the back of your smartphone, and things like that.

    But there is one enormous advantage to a password: it's in *my* head. When I pass away, then it is gone too -- unless I've left a copy to someone I trust. This is a feature I won't give up on.

    So: use a password generator (that's the only way to really put a controlled amount of entropy in there; a sketch follows below). Fucking memorize it (the first few times it seems impossible, but my three or four most important passwords, like my HD LUKS password and backup encryption passphrase, are pwgen -n 16 output -- no problem memorizing them after a modicum of training).

    Don't trust schemes like this that *make people dumber*. Rather make people smarter.
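    A minimal sketch of the kind of generator described above, using Python's standard secrets module. This approximates, but is not, pwgen -n 16; the alphabet and length here are illustrative assumptions:

    ```python
    import math
    import secrets
    import string

    # Rough stand-in for `pwgen -n 16`: 16 characters drawn uniformly from
    # letters and digits using the OS CSPRNG. (pwgen's default output is more
    # pronounceable, which costs some entropy per character.)
    ALPHABET = string.ascii_letters + string.digits  # 62 symbols

    def generate(length: int = 16) -> str:
        return "".join(secrets.choice(ALPHABET) for _ in range(length))

    # The "controlled amount of entropy": uniform choice gives log2(62), about
    # 5.95 bits per character, so 16 characters is roughly 95 bits.
    bits = 16 * math.log2(len(ALPHABET))
    print(generate(), f"~{bits:.0f} bits")
    ```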

    • by ledow ( 319597 )

      The memorisation thing is always used as an excuse.

      You want to memorise it? Set your important passwords to it. Make yourself type it in a thousand times a day. Guess what, you'll be able to screw up the piece of paper with it on by next week because you'll be so frustrated at having to refer to it and you'll have typed it so often that you'll get it stuck in your head.

      The problem with it being just "in your head" is that if you're hit by a bus and forget it - all your stuff is gone forever. Even though

      • by Megol ( 3135005 )

        So why exactly didn't the hypothetical person store the paper (or a copy of it) in a secure place instead? In a bank, hidden at home, or in a safe?

      • by Anonymous Coward

        > The problem with it being just "in your head" is that if you're hit by a bus and forget it [...]

        Yes, but exactly this is *the* feature. My head is gone, the credentials are gone. Unless I make provisions to the contrary. That's exactly how I want it to work.

      • Set your important passwords to it.

        Passwords... plural. You just failed security 101.

        • by ledow ( 319597 )

          Depends.

          Are you really stupid enough to think that having unique passwords for absolutely everything gives you security?

          Or do you believe that two services which access basically identical amounts of information about you can happily share a password, because any breach of either only means compromise of the exact same information / access?

          For example, I really don't care that my Slashdot password is the same as other web forums where the worst that can happen is someone posts something under my name. I

          • Are you really stupid enough to think that having unique passwords for absolutely everything gives you security?

            Nothing simply "gives" you security. You just failed security 101 again, and looked quite the fool calling someone stupid in the process.

            For example, I really don't care that my Slashdot password is

            So maybe you have a different definition of "important" (your words, not mine) than I do.

            At the highest tier, yes, unique passwords per service

            Cool, so we are in agreement. Next time be more verbose and just paste this in the reply box, rather than giving poor security advice publicly and hoping others will read your mind to get the complete picture.

      • Then: someone decides your password must have 1 uppercase and 1 lowercase letter, 1 number, and 1 odd-looking thing, be 8 characters long, and change every 30 days.

        • Rules that blindly enforce arbitrary combinations of specific characters piss me off to no end.

          A rule that passwords must have an uppercase letter adds, at best, 1 to {len-1} bits of entropy. Rules that mandate 1+ digits add (at best) another bit of entropy (1/4-1/2 bit x {len-2}). Ditto for "special characters" if the set of acceptable characters is limited to a subset of those in ASCII that can be directly typed, as opposed to "any unicode value". (A back-of-envelope version follows below.)

          If any rule has to be enforced, it should be based on some
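          A back-of-envelope Python version of the arithmetic above (the length and alphabets are assumptions for illustration, not from the comment):

          ```python
          import math

          # Compare an 8-character password drawn uniformly from lowercase only
          # against one drawn from upper+lower+digits (the best case any
          # composition rule could reach), then against simply adding length.
          length = 8
          lower_only = length * math.log2(26)        # ~37.6 bits
          best_case  = length * math.log2(62)        # ~47.6 bits, upper bound
          two_longer = (length + 2) * math.log2(26)  # ~47.0 bits from length alone

          # A "must contain an uppercase/digit" rule only forces one character
          # class, so the real gain sits far below the upper bound; two extra
          # lowercase characters match it without any rule at all.
          print(lower_only, best_case, two_longer)
          ```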

    • by grumbel ( 592662 ) <grumbel+slashdot@gmail.com> on Monday September 10, 2018 @05:35AM (#57283118) Homepage

      The problem with a password isn't just that you have to remember it, but that you give your secret away every time you try to log in to a service. So if there is a man-in-the-middle or you just accidentally entered the password into the wrong server, your password is now compromised. There are plenty of better ways to do authentication that don't require exposing your secret.
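      A minimal sketch of the alternative alluded to here: signature-based challenge-response, which is also the idea underneath WebAuthn. This assumes the Python cryptography package and collapses client and server into one process for illustration:

      ```python
      import os
      from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

      # The secret (private key) never leaves the client; an eavesdropper or a
      # phishing server only ever sees a single-use signature.

      # Client, once: generate a keypair and register the public key.
      client_key = Ed25519PrivateKey.generate()
      registered_public_key = client_key.public_key()

      # Server, per login: send a fresh random challenge.
      challenge = os.urandom(32)

      # Client: sign the challenge; only the signature crosses the wire.
      signature = client_key.sign(challenge)

      # Server: verify against the stored public key (raises if invalid).
      registered_public_key.verify(signature, challenge)
      print("authenticated")
      ```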

    • by nasch ( 598556 )

      Rather make people smarter.

      How do you propose to do that?

  • The exploits are almost old enough to legally drink alcohol in the United States,

    Exploitability is the #1 thing I look for in drinking buddies.

  • by account_deleted ( 4530225 ) on Monday September 10, 2018 @05:58AM (#57283152)
    Comment removed based on user account deletion
    • by ljw1004 ( 764174 )

      It's like all of the healthcare and insurance websites that say you can only use 8-16 characters and provide a tiny whitelist of special characters. Why are those even allowed to be online if that's all they can handle? That's all but an admission you're too cheap to update anything to make it really secure.

      I think "restrict to tiny whitelist of special characters" sounds like a really good idea. As soon as we delve into variable-length characters, I worry that some (any) part of the stack might not handle them right. Maybe there's code which has the wrong computation about how many bytes a given character takes. Maybe some code is vulnerable to a string which terminates half way through a multibyte sequence. Maybe some code indexes into a string, which you can't do with variable-length.

      Yeah it is a matter of
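      A tiny Python illustration of the mismatch described above (the string is an arbitrary example):

      ```python
      s = "pässwörd🔑"

      print(len(s))                  # 9 code points
      print(len(s.encode("utf-8")))  # 14 bytes -- a column sized in bytes disagrees

      # Truncating at a fixed byte budget can split a multibyte sequence:
      truncated = s.encode("utf-8")[:12]  # cuts the 4-byte emoji in half
      try:
          truncated.decode("utf-8")
      except UnicodeDecodeError as err:
          print("broken mid-character:", err)
      ```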

  • I was wondering if this kind of system could be used to gate admin access. Admin access would be enabled remotely from a Microsoft server using two-factor authorization, say, and no malware would be able to gain admin privileges, as this would be the only way to escalate. Made robust, it would destroy malware's ability to take over machines by escalation of privileges. It could be a feature for all users, or an enterprise add-on for a fee, with the purpose of increasing security for critical systems.
  • Do you remember? That was the standard that split an 8-digit key (actually 7 digits and a checksum digit) into two parts in the protocol: one with 4 digits, and one with 3 digits plus the checksum digit. The documentation was suggestively written in a way that made implementers check the parts separately and return right-or-wrong information for the first part separately from the second, not just for both parts together (the arithmetic is sketched below). This reduced the necessary attempts from 10,000,000 possible keys down t
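    Assuming this refers to the well-known WPS PIN weakness, the arithmetic works out as follows (a sketch; the split is as described above, the totals are mine):

    ```python
    # 8 digits, but the last is a checksum, so 7 free digits in total.
    whole_keyspace = 10**7      # 10,000,000 attempts if checked as one unit

    # With the halves confirmed independently:
    first_half  = 10**4         # 4 free digits, brute-forced on their own
    second_half = 10**3         # 3 free digits (+ checksum), brute-forced next
    split_keyspace = first_half + second_half

    print(split_keyspace)       # 11,000 -- roughly a 900x reduction
    ```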

  • by najajomo ( 4890785 ) on Monday September 10, 2018 @07:09AM (#57283304)
    By any chance, did the NSA help them with the algorithms?
    • When PKCS#1 v1 and 1.5 were written, RSA Laboratories was working closely with the NSA. At the RSA Data Security Conference, it was made very clear to those attending that they [the NSA] were there.

      That being said, PKCS#1v1.5 has been deprecated. PKCS#1v2.1 and 2.2 are out there. PKCS#1v2.0 corrected the major issues with 1.5.

      Why TLS still uses the old standard is beyond me, other than the effort required to safely transition to the newer standards. Allowing for a transition to the newer standard while still supporting the
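      For the curious, here is what the v1.5-to-PSS difference looks like in practice, sketched with the Python cryptography package (key size and message are arbitrary):

      ```python
      from cryptography.hazmat.primitives import hashes
      from cryptography.hazmat.primitives.asymmetric import padding, rsa

      # Same RSA key, two signature paddings: PKCS#1 v1.5 is the legacy
      # deterministic scheme; PSS (PKCS#1 v2.x) is the randomized replacement.
      key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
      message = b"challenge"

      legacy_sig = key.sign(message, padding.PKCS1v15(), hashes.SHA256())

      modern_sig = key.sign(
          message,
          padding.PSS(
              mgf=padding.MGF1(hashes.SHA256()),
              salt_length=padding.PSS.MAX_LENGTH,
          ),
          hashes.SHA256(),
      )
      ```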
