Topics: Encryption, Math, Linux

Linux RNG May Be Insecure After All

Okian Warrior writes "As a follow-up to Linus's opinion about people skeptical of the Linux random number generator, a new paper analyzes the robustness of /dev/urandom and /dev/random. From the paper: 'From a practical side, we also give a precise assessment of the security of the two Linux PRNGs, /dev/random and /dev/urandom. In particular, we show several attacks proving that these PRNGs are not robust according to our definition, and do not accumulate entropy properly. These attacks are due to the vulnerabilities of the entropy estimator and the internal mixing function of the Linux PRNGs. These attacks against the Linux PRNG show that it does not satisfy the "robustness" notion of security, but it remains unclear if these attacks lead to actual exploitable vulnerabilities in practice.'" Of course, you might not even be able to trust hardware RNGs. Rather than simply proving that the Linux PRNGs are not robust due to their run-time entropy estimator, the authors provide a new property for proving the robustness of the entropy accumulation stage of a PRNG, and offer an alternative PRNG model, with proof, that is both robust and more efficient than the current Linux PRNGs.
This discussion has been archived. No new comments can be posted.

  • Yawn (Score:5, Interesting)

    by Anonymous Coward on Monday October 14, 2013 @10:16PM (#45128453)

    The output of a software RNG, aka PRNG (pseudo random number generator), is completely determined by a seed. In other words, to a computer (or an attacker), what looks like a random sequence of numbers is no more random than, let's say,

    (2002, 2004, 2006, 2008, 2010, 2012...)

    However, the PRNG sequence is often sufficiently hashed up for many applications such as Monte Carlo simulations.
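The determinism the parent describes is easy to demonstrate; here is a minimal sketch using Python's standard-library generator (the seed value is arbitrary):

```python
import random

# Two generators seeded identically produce the exact same "random" stream.
a = random.Random(1234)
b = random.Random(1234)

stream_a = [a.randrange(1_000_000) for _ in range(5)]
stream_b = [b.randrange(1_000_000) for _ in range(5)]

print(stream_a == stream_b)  # prints True: the sequences are identical
```

To an observer who knows the seed, the "random" output is exactly as predictable as the (2002, 2004, 2006...) sequence above.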

    When it comes to secure applications such as cryptography and Internet gambling, things are different. Now a single SRNG sequence is pathetically vulnerable, and one needs to combine multiple SRNG sequences, using seeds that are somehow independently produced, to provide a combined stream that hopefully has acceptable security. But using a COTS PC or phone doesn't allow developers to create an arbitrary stream of independent RNG seeds, so various latency tricks are used. In general, these tricks can be defeated by sufficient research, so often a secure service relies partly on "security through obscurity", i.e. not revealing the precise techniques for generating the seeds.

    This is hardly news. For real security you need specialized hardware devices.

  • by Anonymous Coward on Monday October 14, 2013 @10:24PM (#45128509)

    So, with all the 'revelations' and discussion surrounding this and encryption over the past several weeks, I've been wondering if a local hardware-based entropy solution could be developed. By 'solution', I mean an actual piece of hardware that takes various static noise from your immediate area, ranging from 0-40+ kHz (or into the MHz range or greater?), both internal and external to the case, and with that noise builds a pool for what we use as /dev/random and /dev/urandom. Perhaps each user would decide what frequencies to use, with varying degrees of contribution to the pool, and so on.

    It just seems that with so much 'noise' going on around us in our everyday environments, we have an opportunity to use some of that as an entropy source. Is anyone doing this? Because it seems like a fairly obvious implementation.
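The usual way to fold such samples into a pool is to hash new samples together with the old pool state, so even heavily biased samples can only add unpredictability, never remove it. A hypothetical sketch (os.urandom stands in for an ADC sampling ambient noise; the mix function is illustrative, not the kernel's actual mixing code):

```python
import hashlib
import os

# Hypothetical sketch: fold "noise" samples into a 32-byte pool by hashing.
# os.urandom() stands in for an ADC sampling ambient audio/RF noise.
pool = bytes(32)

def mix(pool: bytes, sample: bytes) -> bytes:
    # Hash the old pool state together with the new sample. Under this
    # construction, a low-entropy (or even attacker-known) sample cannot
    # reduce whatever entropy the pool already holds.
    return hashlib.sha256(pool + sample).digest()

for _ in range(100):
    sample = os.urandom(64)  # stand-in for one buffer of sampled noise
    pool = mix(pool, sample)

print(pool.hex())  # 32-byte pool state to draw output from
```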

  • by Anonymous Coward on Monday October 14, 2013 @10:38PM (#45128605)

    I swear, if I worked for the NSA I'd be pushing out headlines like this to make people ignore real security issues...

    The article is a highly academic piece that analyzes the security of the Linux RNG against a bizarre and probably pointless criterion: what is an attacker's ability to predict the future output of the RNG, assuming he knows the entire state of your memory at arbitrary attacker-selected points in time and can add inputs to the RNG? Their analysis that the Linux RNG is insecure under this (rather contrived) model rests on an _incorrect_ assumption that Linux stops adding to the entropy pool when the estimator concludes that the entropy pool is full. Instead they offer the laughable suggestion of using AES in counter mode as a "provably secure" alternative.

    (presumably they couldn't get a paper published that said "don't stop adding entropy just because you think the pool is at maximum entropy", either because it was too obviously good a solution or because their reviewers might have noticed that Linux already did that)

  • by mark_reh ( 2015546 ) on Monday October 14, 2013 @10:41PM (#45128629) Journal

    Zener diodes biased into avalanche mode to generate random noise? I don't think even the NSA has figured out how to hack the laws of thermodynamics.

  • by Anonymous Coward on Monday October 14, 2013 @11:13PM (#45128799)

    He "confirmed" he'd been asked to backdoor linux, he never confirmed whether or not he agreed... :)

  • by Anonymous Coward on Monday October 14, 2013 @11:33PM (#45128911)

    Right up until someone figures out how to put those "ultimately not all that important" technical weaknesses to use. And then suddenly every single last deployed system might turn out to be vulnerable. Which might already be the case, but has been kept carefully secret for a rainy day.

    Of course, usually it's a lot of "theoretical" shadow dancing, and given the nature of the field some things will indubitably remain unclear forever (unless another Snowden stands up, who just happens to give us a clear-enough answer), but there are an awful lot of people working exactly on finding this sort of entirely theoretical stuff and turning it into practical usability, while keeping their findings secret.

    So, unfortunately, this sort of mathematical tin foil hattery is necessary, because all we know is that we're behind the curve (we only know what's made public), not how much behind the curve, and we have the disadvantage of the cost of fixing a large installed base.

    We more or less can't afford not to peep pixels, because we simply don't know how much is too much, and how much is too little. This despite a foul-mouthed "looking ahead is hard!" populist figure with a noted lack of technical innovation skill.

  • by Anonymous Coward on Monday October 14, 2013 @11:41PM (#45128955)

    First of all, not all computers are PCs. A server running in a VM has no audio input, fan speed, keyboard, mouse, or other similar devices that are good sources of entropy. A household router appliance running Linux not only has no audio input, fan, keyboard, or mouse -- it doesn't even have a clock it can use as a last resort source of entropy.

    Second, there are many services that require entropy during system startup. At that point, there are very few interrupts, no mouse or keyboard input yet, and some of the sources of entropy may not even be initialized yet.

    One problematic situation is initializing a household router. On startup it needs to generate random keys for its encryption, TCP sequence numbers, and so on. Without a clock, a disk, a fan, or any peripherals, the only good source of entropy it has is network traffic, and there hasn't been any yet. A router with very little traffic on its network may take ages to get enough packets to make a decent amount of entropy.


  • by WaywardGeek ( 1480513 ) on Tuesday October 15, 2013 @12:17AM (#45129061) Journal

    No, RNGs are easy. Super easy. Just take a trustworthy source of noise, such as Zener diode noise, and accumulate it with XOR operations. I built a 1/2 megabyte/second RNG that exposed a flaw in the Diehard RNG test suite. All it took was a 40 MHz 8-bit A/D conversion of amplified Zener noise XORed into an 8-bit circular shift register. The Diehard tests took 10 megabytes of data and reported whether they found a problem. My data passed several times, so I ran it thousands of times, and found one test sometimes failed on my RNG data. Turns out Diehard had a bug in that test. Sometimes the problem turns out to be in the test, not the hardware.
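The accumulation scheme the parent describes can be sketched in a few lines: rotate an 8-bit register between samples, then XOR in the next raw ADC byte, so bias in any one bit position gets smeared across the whole byte. (os.urandom stands in for the amplified Zener-noise ADC; the rotation step is one plausible reading of "circular shift register".)

```python
import os

def rotl8(x: int, n: int = 1) -> int:
    # Rotate an 8-bit value left by n bits.
    return ((x << n) | (x >> (8 - n))) & 0xFF

def accumulate(samples) -> int:
    # XOR each raw ADC sample into an 8-bit register, rotating between
    # samples so bias in any one bit position spreads across the byte.
    reg = 0
    for s in samples:
        reg = rotl8(reg) ^ (s & 0xFF)
    return reg

# os.urandom stands in for a buffer of amplified Zener-noise ADC samples.
output_byte = accumulate(os.urandom(64))
print(output_byte)
```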

  • by steelfood ( 895457 ) on Tuesday October 15, 2013 @12:45AM (#45129193)

    Useless for you. But the NSA might disagree. The math is what keeps them at bay. If the math shows cracks, it's a safe bet the NSA has figured out some kind of exploit. Keep in mind that the NSA doesn't rely on just one technique, but can aggregate multiple data sources. So those interrupts that the RNG relies on can be tracked, and the resulting numbers can be narrowed to a searchable space. Keep in mind that 2^32, which is big by any human standard, is minuscule for a GPU.
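The back-of-envelope arithmetic bears that out. Assuming a GPU that can test on the order of a billion candidates per second (an illustrative figure, not a benchmark), a 2^32 space falls in seconds:

```python
# Back-of-envelope: time to exhaust a 2^32 search space.
space = 2 ** 32           # ~4.3 billion candidates
rate = 1_000_000_000      # candidates per second (assumed GPU rate)
print(space / rate)       # prints ~4.29: whole space searched in seconds
```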

  • by gman003 ( 1693318 ) on Tuesday October 15, 2013 @12:52AM (#45129237)

    Well, Intel and VIA have such things integrated into their processors now. Unfortunately, they (at least Intel - not sure how VIA's implementation worked) decided to whiten the data in firmware - you run a certain instruction that gives you a "random" number, instead of just polling the diode. With all the current furor over the NSA stuff, many people are claiming that it *is* hacked.

  • by jalopezp ( 2622345 ) on Tuesday October 15, 2013 @06:46AM (#45130473)

    I don't know of any distros that completely eschew GNU. The two are very tightly integrated: originally the kernel was written to run GNU, and even now it cannot be built with anything but gcc. While I believe that they have done much to contribute to the free software we care about, loads of people here on /. have some disdain for GNU, and with that in mind might want to ban it from their systems. You can do this by replacing the individual components. The BSD userland has been successfully ported to Linux, and with Plan 9 from User Space you can take advantage of some great ideas that came with Plan 9 (though if you care enough for the benefits of Plan 9 plumbing, filesystem namespaces, and networking, I might recommend you go with the whole thing - Plan 9 is much, much better designed than Unix, we just need to port our software).

    The main GNU components in all distros are usually glibc, gcc, and coreutils. If you don't compile anything, you don't need gcc, and if you're really into compiling, you probably use LLVM. Soon, it is said, LLVM will be able to build the Linux kernel. As for the others, there are several packages available in most distributions that will replace them. Most likely, you will replace glibc with uClibc and coreutils with BusyBox. You will lose some functionality doing this, but it is definitely possible to run a free system without GNU. I might point out that I don't think this is a great idea, and you should go for it only if, for some inexplicable reason, you dislike GNU.

    Check out this answer [stackexchange.com] for more info.

  • Re:Yawn (Score:2, Interesting)

    by Anonymous Coward on Tuesday October 15, 2013 @06:52AM (#45130483)

    So do Intel CPUs.

    The previous discussion was about whether or not to remove the Intel hardware RNG from the equation, because Intel is an American company, and as such subject to NSA requirements. I.e, nobody knows if the Intel hardware RNG is a true random number generator, or a pseudo random number generator with a secret key that only the NSA knows.

    (Some would claim that such a suggestion is tinfoil hat material, but lately, Edward Snowden has been making the tinfoil hat crowd say "damn, it's worse than I feared" multiple times).
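This is also why the usual defense is to mix the hardware RNG into the pool rather than use it raw: XORing a suspect source with an independent one yields output at least as unpredictable as the stronger of the two, so a backdoored source can't make things worse (provided the sources really are independent). A minimal sketch of the idea:

```python
import os

def xor_mix(a: bytes, b: bytes) -> bytes:
    # XOR two equal-length byte strings. If the inputs are independent,
    # the result is at least as unpredictable as the stronger input,
    # so a compromised source cannot reduce the output's entropy.
    return bytes(x ^ y for x, y in zip(a, b))

suspect = os.urandom(16)      # stand-in for hardware RNG output
independent = os.urandom(16)  # stand-in for the kernel's own entropy pool
print(xor_mix(suspect, independent).hex())
```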

"A car is just a big purse on wheels." -- Johanna Reynolds

Working...