Encryption Security Science

Ultra-low-cost True Randomness

Cryptocrat writes "Today I blogged about a new method for secure random sequence generation that is based on physical properties of hardware, but requires only hardware found on most computer systems: from standard PCs to RFID tags." Basically he's power-cycling memory and looking at the default state of the bits, which surprisingly (to me, anyway) can be used both to fingerprint systems and to generate true random numbers. There is also a PDF paper on the subject if you're interested in the concept.
  • by eldavojohn ( 898314 ) * <eldavojohn@gm a i l . com> on Monday September 10, 2007 @10:07AM (#20539003) Journal
    A slightly more expensive but somehow even more random method is to seed the generator against the words and phrases that come out of the mouth of South Carolina's Miss Teen USA [youtube.com].

    But in all seriousness, I wonder how this compares to the Mersenne Twister [wikipedia.org] (Java implementation [gmu.edu] & PDF [hiroshima-u.ac.jp]) that I use at home? I am almost sure this new proposed method is more efficient and faster; when will there be (I know, I'm lazy) a universal implementation of it? :)

    Also, this may be a stupid question, but I wonder how one measures the 'randomness' of a generator? Is there a unit that represents randomness? I mean, it would be seemingly impossible to do it by observing the output, so I guess all you can do is discuss how dependent it is on particular prior events and what they are, theoretically. Can you really say that this is 'more random' than another one because you have to know so much more beforehand about the particular machine & its fingerprint in order to predict its generated number?
    • by BadAnalogyGuy ( 945258 ) <BadAnalogyGuy@gmail.com> on Monday September 10, 2007 @10:13AM (#20539091)
      Randomness is definable.

      Why, take a look at this Wikipedia link [wikipedia.org]. You can never tell whether it represents the truth or some crackpot's claim to it or just some troll's malicious vandalism.

      Voila! Randomness!
      • Re: (Score:3, Insightful)

        by Mc1brew ( 1135437 )
        That link brought me to the conclusion that randomness doesn't exist as much as I thought. It uses the example of rolling dice. Random, right? Not really... just too many variables to consider over the given amount of time: density of the dice, placement of the dice in the hand, distance of the hand from the table, number of dice, potential values of the dice, density of the table, etc. By the time you write down all the variables, a value has been generated. Just because you didn't have enough time to evaluate the scenario, doe
        • by abes ( 82351 )
          It really depends on whether you are looking from Einstein's world view (God doesn't play dice) or the quantum mechanics world view, in which everything is assigned a probability (and is therefore random). IANQM, but my understanding is that the interesting thing is how to scale from the very small to the very big -- from things which by their nature are ruled almost entirely by probability (particles) to things which have no randomness attached to them (>> particles).

          You could in theory construct a truly random numbe
          • by imgod2u ( 812837 )
            See http://en.wikipedia.org/wiki/Hardware_random_number_generator [wikipedia.org]

            The method of generating random numbers that this article proposes follows the same principle. And the "randomness" that it provides becomes more random as process feature size shrinks. At 45nm, the steady-state bias (if there is one) that a circuit comes to (depending on the circuit design) is truly random in two ways:

            1. If, in the case of a CMOS circuit in which the state of a flip-flop depends on charge accumulated during power-up, small
        • by Surt ( 22457 )
          Indeed, there is very little to possibly zero actual randomness in the universe. Many scientists think there may be no actual randomness (just stuff for which we don't know how to examine the internal state yet).
          • That's basically the deterministic view of the universe, where given all the rules and the initial state of the universe, you'd theoretically be able to run a simulation and predict the entire future of the whole universe (of course, to simulate the entire universe, you'd need a computer the size of the universe).

            Of course, then quantum mechanics and all that jazz has come along, and one of the things it fundamentally says is that some things are entirely based on probability, and as such are not determinis
            • by Surt ( 22457 )
              Indeed, and the open question is whether quantum mechanics is the bottom, and there is no more we can know that would allow us to predict quantum mechanical outcomes, or if at some deeper level, it all becomes fixed again.
    • I think the easiest way to measure "randomness" is to (whilst keeping the environment the same) generate a massive number of "random" numbers, and compare the number of occurrences of each value to its expected number of occurrences. Probability would dictate that a true random number generator would return values to within a tiny margin of what would be expected. The "unit" would probably be "standard deviations" (i.e., a bad random number generator has a bias for $SOME_VALUE of 2 standard deviations).
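      A minimal sketch of that frequency test in Python (the generator under test, the sample size, and the ten-value range are arbitrary choices for illustration):

      import random
      from collections import Counter

      N = 1_000_000                     # number of draws from the generator under test
      p = 1 / 10                        # expected probability of each value
      counts = Counter(random.randrange(10) for _ in range(N))

      sigma = (N * p * (1 - p)) ** 0.5  # standard deviation of a binomial count
      for value in range(10):
          deviation = (counts[value] - N * p) / sigma
          print(f"{value}: {counts[value]:7d} ({deviation:+.2f} sigma)")

      A decent generator should rarely show a bucket many sigma from its expectation; as the replies below point out, though, a flat histogram only tests uniformity, not randomness.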
      • Re: (Score:2, Insightful)

        123456789123456789123456789123456789123456789

        That's how to test uniformity, but not randomness.

      • Re: (Score:3, Insightful)

        by Matje ( 183300 )
        It is a lot trickier than that. Test your method against the following string:

        12345678901234567890

        See? The distribution of digits doesn't tell you a whole lot about the randomness of a stream.

        A nice way to define randomness is using Kolmogorov Complexity. A random number then is a number that cannot be represented by a program (in some code language) that is shorter than the random number itself. In other words: if the smallest program that outputs number X is the program "print X" then X is considered a
      • I think you'd also need some means of defining the randomness of distribution in your sequence. For instance, 01010101010101010101010101010101 doesn't look very random.
        • Compress it and measure the compression ratio (entropy). It can be used to determine a probability distribution with confidence intervals so that there is a 90%, 99% etc. probability that the source is truly random.
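            A rough sketch of that compression idea in Python (zlib is just one convenient compressor; a serious test would use stronger estimators and proper confidence intervals):

            import os
            import zlib

            def compression_ratio(data: bytes) -> float:
                """Compressed size over original size; close to 1.0 suggests high entropy."""
                return len(zlib.compress(data, 9)) / len(data)

            print(compression_ratio(os.urandom(100_000)))  # ~1.0: incompressible
            print(compression_ratio(b"01" * 50_000))       # tiny: highly regular

            As the pi example further down the thread illustrates, incompressibility by a practical compressor is only evidence, not proof, that a source is unpredictable.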
          • Every time some article mentions "Random numbers" the question is raised: What exactly is a random number?

            and every time, everybody disagrees...

            once again... for the record... From My CS classes:

            A number sequence can be defined as "Sufficiently Random" if the simplest algorithm that generates said sequence is no shorter than the sequence itself.

            ergo: 01234567890123456789012345678901234567890123456789 Ain't Random:
            for (i = 0; i < 50; i++) {
                print(i mod 10);
            }
            • by Surt ( 22457 )
              The problem everyone has with that definition is that it's unmeasurable, because we don't know how to generate the shortest program for a given sequence.
              • by MCraigW ( 110179 )
                And it is language dependent.
                • by Surt ( 22457 )
                  An excellent point. On reflecting on this comment, I realized it would be relatively trivial to design a language for which the shortest program expressing any given sequence was a single character.
      • by Intron ( 870560 )
        int random() {
            return i++;
        }

        Every value of i is equally represented. Must be a perfect random number generator.
      • Would you consider the digit sequence of pi to be a random sequence? AFAIK, up to now it has passed every normality test thrown at it. Yet I'm not willing to consider that sequence random. After all, you can calculate any digit you want (it may take some time and computing power, though). So there's absolutely nothing unpredictable about them.
    • Re: (Score:3, Informative)

      by NetCow ( 117556 )
      Mersenne Twister is not a random number generator, it's a pseudo-random number generator.

      Randomness is measured as entropy. See here for details: http://mathworld.wolfram.com/Entropy.html [wolfram.com]
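      In code, the byte-level Shannon entropy (bits per byte, 8.0 being the maximum) looks roughly like this; note it only sees symbol frequencies, not correlations between symbols, so it is a crude measure:

      import math
      import os
      from collections import Counter

      def shannon_entropy(data: bytes) -> float:
          """Shannon entropy of the byte distribution, in bits per byte."""
          counts = Counter(data)
          total = len(data)
          return -sum((c / total) * math.log2(c / total) for c in counts.values())

      print(shannon_entropy(os.urandom(100_000)))  # close to 8.0
      print(shannon_entropy(b"ab" * 50_000))       # exactly 1.0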
    • by benhocking ( 724439 ) <benjaminhocking.yahoo@com> on Monday September 10, 2007 @10:17AM (#20539161) Homepage Journal

      The Mersenne Twister is a pseudo-random number generator. For many uses, this is preferable to a true random number generator as it is easily repeatable. (One can also repeat the results of a true random number generator by storing the output, but depending on how many random numbers you're generating, this might be space intensive.)
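      As a concrete example, Python's own random module is an MT19937 Mersenne Twister, and repeatability is just a matter of re-using the seed:

      import random

      rng1 = random.Random(42)  # same seed...
      rng2 = random.Random(42)

      assert [rng1.random() for _ in range(5)] == [rng2.random() for _ in range(5)]  # ...same sequence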

      That said, although this might be "true" randomness, what kind of randomness is it? Uniform over a range? Gaussian? Weibull? Most likely, none of the above if it can be used for fingerprinting systems. (No, I did not RTFA.)

    • Re: (Score:3, Insightful)

      by solafide ( 845228 )
      Randomness is measured statistically using multiple tests: see Knuth, Art of Computer Programming Volume 2, Chapter 3 for a thorough discussion of common statistical randomness tests, or here [fourmilab.ch] for a practical testing tool.

      I don't expect this to be statistically random: they claim it's based on thermal noise. But the startup temperature of a computer does not have that much entropy, so the thermal noise isn't reliable. Just because something's garbage doesn't mean it's statistically random.

      • by Cyberax ( 705495 )
        Thermal noise random number generators do not depend on temperature (unless cooled to liquid nitrogen temperatures). Normal room temperature provides quite enough random fluctuations for good generators.
      • Randomness is measured statistically using multiple tests: see Knuth, Art of Computer Programming Volume 2, Chapter 3 for a thorough discussion of common statistical randomness tests,

        Keep in mind that there are several 'grades' of randomness. Something that is good enough for your average Monte Carlo analysis is likely sub par for serious cryptographic purposes.
    • by sholden ( 12227 )
      You can do some stats on the output, for example: http://citeseer.ist.psu.edu/maurer92universal.html [psu.edu]

    • Re: (Score:2, Informative)

      Well, the theory goes something like this: the wider and more varied the seeds you feed to a random number generator, the more truly random your results. Many programs use a timestamp from the system clock as a seed, or even a timestamp as a seed to put through the random number generator to get another random number that is used as a seed, etc. ad infinitum. Of course, the system clock has only so much granularity, so based on that granularity there are a finite number of seeds for each 24 hour period. If yo

    • Re: (Score:2, Interesting)

      by kevmatic ( 1133523 )
      Yeah, it can be measured. There is no unit, though, as it's a measure of entropy. So things are more or less random than something else. I imagine randomness-studying programs assign numbers to it. A random number is just a number; '1' might be randomly selected out of 1 through 6, but it's still just 1. But random number sets are considered random if, for every number, the chances of the number after it being, say, 4, are 1 in 10. So if you have a random set and come across a 1, the probability the next nu
      • Re: (Score:3, Interesting)

        by ThosLives ( 686517 )

        There is no unit, though, as it's a measure of entropy.

        Eh, well, the unit of entropy is actually "energy per temperature"*, so there are physical units associated with it. Of course, that's physical entropy, and I don't know that it's the same as "information entropy." If they're different, then I blame the folks that overload technical words.
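        In symbols, the two notions look like this (Boltzmann's constant is what carries the physical units):

        $$S = k_B \ln W \quad [\mathrm{J/K}] \qquad\qquad H = -\sum_i p_i \log_2 p_i \quad [\mathrm{bits}]$$

        For $W$ equally likely microstates $H = \log_2 W$, so the two agree up to the constant factor $k_B \ln 2$; the overloading is at least self-consistent.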

        That said, I always thought "random" simply meant "the next outcome is not predictable based on all previous measurements." Therefore the measure of "random" would b

        • Of course, that's physical entropy, and I don't know that it's the same as "information entropy."

          You are probably referring to thermodynamic entropy, which is based on continuous state distributions. While in most cases continuous and discrete formulations are closely related (allowing summations to be replaced with integrals and vice versa), it has been shown in this case that these notions of entropy are not comparable (the limit of the discrete entropy as the number of divisions goes to infinity also go

        • Re: (Score:3, Interesting)

          by sdedeo ( 683762 )
          Entropy is fascinating. It's proportional to the logarithm of the number of microstates, but until the advent of quantum mechanics, there was no good way to count the microstates of a given physical system. Once you have the uncertainty principle, you can divide the phase space up into little chunks of volume (Planck's constant)^(dimensions) and count it that way.
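          Schematically (and glossing over the indistinguishability correction), that counting is:

          $$S = k_B \ln \Omega, \qquad \Omega \approx \frac{1}{h^{n}} \int d^{n}q \, d^{n}p$$

          where $n$ is the number of position-momentum pairs, i.e. the accessible phase-space volume measured in cells of size $h$ per pair.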

          Another way to put it is that before the advent of quantum mechanics, every measurement of entropy was only meaningful in a relative, differe
    • Dieharder http://www.phy.duke.edu/~rgb/General/dieharder.php [duke.edu] is what I used.

      I learned a heck of a lot working with dieharder especially considering my lack of mathematical acumen. The author and friends were unbelievably patient and helpful. In my book it's the best tool ever.

      Debian package too!
    • Also, this may be a stupid question, but I wonder how one measures the 'randomness' of a generator?

      Read James Gleick's Chaos.

      There is a method in that book that describes how they extracted attractors from a stream of data. Here's how it works.

      A team of researchers had a bathtub with a dripping faucet. They tuned the dripping until the drips fell at random intervals. Nothing rhythmic about it. As the drop broke away from the faucet, it was setting up a vibration in the remaining water that would

    • "Also, this may be a stupid question, but I wonder how one measures the 'randomness' of a generator?" There are lots of places that talk about this. A simplistic explanation of what it means to be a good PRNG is simply to provide a sequence of numbers with no correlations that matches the desired distribution. (http://mathworld.wolfram.com/RandomNumber.html [wolfram.com]). Books on modeling and simulation often have good explanations of this. This page (http://csrc.nist.gov/rng/ [nist.gov]) has a good overview, including simpl
    • by Fweeky ( 41046 )
      From the Wikipedia article:

      Observing a sufficient number of iterates (624 in the case of MT19937) allows one to predict all future iterates
      A proper cryptographically secure PRNG like Yarrow [wikipedia.org] shouldn't be guessable like this, and neither should this RNG, though they may not be "more efficient and faster", just "better".
  • Oblig. XKCD (Score:5, Funny)

    by IcedTeaisgood ( 1148451 ) on Monday September 10, 2007 @10:15AM (#20539133)
  • OK, do we in this world have a problem with insufficient randomness in our keys or something?
  • It's pretty low cost to get some randomness. Some friends like jug wine, though. I'm extremely cheap, so I just go for the 100 proof stuff. $12.00 and you can get a whole bottle of randomness.
  • The contents of a power-cycled DRAM cell are highly correlated to whatever was stored in it before power was lost. Geez, think about how a DRAM works... it's a capacitor (aka an integrator)! That's the last place I'd ever look for randomness.
    • Yeah, but the headline of the PDF says: "Initial SRAM state..."
    • by AmIAnAi ( 975049 ) *
      TFA talks about SRAM, rather than DRAM. So there's no capacitor involved for data storage - each cell is a transistor-based state machine.
    • by imgod2u ( 812837 )
      That actually depends on how long you power it off for....

      Also depends on how the sense/charge refresh circuit is designed.
  • by G4from128k ( 686170 ) on Monday September 10, 2007 @10:25AM (#20539281)
    I wouldn't assume that these fingerprints are as unique or pattern-less as one might hope (a point discussed in the PDF). All of the RAM chips from a given wafer or given mask may share tendencies toward some patterns in the probability of a 0 or 1. These patterns may appear as correlations between rows and columns of a given chip. Location on the wafer (in the context of nonuniformities of exposure to fab chemicals) might also systematically affect the aggregate probabilities of 0 or 1 or the repeatability of the fingerprint. The tendency of these fingerprints to be consistent or random might change from run to run and from manufacturer to manufacturer. Finally, I'd bet that the probabilities vary systematically with temperature -- e.g., the probability of a 1 increases for all bits as the chip's temperature increases.

    This is a very interesting phenomenon, but a lot more data is needed to show that it provides consistent behavior.
  • Will vary with the length of time the computer has been off. There is a surprising amount of non-volatility in DRAM. I liked Alan Turing's suggestion that all computers come equipped with a small radioactive source and detector. The random decay and emission of the source is an almost ideal random number generator. It wouldn't take a source any bigger than the one we now have in a smoke detector.
  • Pick a question. Then keep asking that question to a politician. You should get truly randomized results. If you doubt me, just take a politician with an opposite stance, and repeat the process. The answers will not be polar opposites.
    And consequently no information could be extracted from that scenario -- wow, I think I just proved that you can't transmit information across a quantum entanglement.
  • by nweaver ( 113078 ) on Monday September 10, 2007 @10:29AM (#20539357) Homepage
    the true RNG properties rely on the fact that:

    a: Many of the bits are only sorta random, but they are physically random. So they behave like very biased coins, but with true randomness.

    b: With the right reduction function, you can turn a LOT (e.g., 512 Kb) of cruddy random data into a small amount (128b-512b) of very high-quality, well-distributed randomness.

    And the fingerprinting relies on the fact that:

    a: Many of the other bits are physically random but VERY VERY biased. So map where those are, record them, and it is a very good fingerprint. And since it is all silicon process randomness going into that, it is pretty much a physically unclonable function.

    Kevin Fu has some SMART grad students.
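    A toy sketch of both ideas, assuming you already have several raw power-up dumps of the same SRAM (the helper names, sizes and the 0.9 stability threshold are made up for illustration; the paper's actual extractor and fingerprint metrics are more careful than this):

    import hashlib

    def extract_random(raw_dump: bytes, out_bytes: int = 32) -> bytes:
        """Condense lots of weakly random power-up bits into a few high-quality
        ones by hashing the whole dump (a stand-in for a real extractor)."""
        return hashlib.sha256(raw_dump).digest()[:out_bytes]

    def fingerprint(dumps: list[bytes], threshold: float = 0.9) -> dict[int, int]:
        """Map the heavily biased bit positions: bits that land the same way in
        at least `threshold` of the power-ups identify the particular chip."""
        n_bits = len(dumps[0]) * 8
        ones = [0] * n_bits
        for dump in dumps:
            for i in range(n_bits):
                ones[i] += (dump[i // 8] >> (7 - i % 8)) & 1
        stable = {}
        for i, count in enumerate(ones):
            freq = count / len(dumps)
            if freq >= threshold or freq <= 1 - threshold:
                stable[i] = 1 if freq >= threshold else 0
        return stable

    The hashing step is why heavy per-bit bias is tolerable for the RNG use: even if each bit contributes only a fraction of a bit of entropy, half a megabit of them is far more than the 128-512 bits you actually keep.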
    • Re: (Score:3, Funny)

      by Dirtside ( 91468 )

      Kevin Fu has some SMART grad students.

      I wonder how often they go around saying to people, "Whoa. I know Kevin Fu."
  • by gillbates ( 106458 ) on Monday September 10, 2007 @10:33AM (#20539421) Homepage Journal

    As an embedded engineer, I've encountered numerous cases where power cycling RAM did not alter the contents.

    In fact, I've seen systems boot and run even after the power was cut for several seconds. Some types of SRAM and SDRAM have the ability to retain an (imperfect) memory image even at very low voltage levels. Sure, it's not guaranteed to be accurate by the manufacturer, but RAM "images" are a pretty well known phenomenon. In some cases, the contents of memory can be reconstructed even after the computer has been powered off and removed to a forensic laboratory.

    This is not random at all. In fact, it's more likely to produce an easily exploitable RNG than anything else; I would not be at all surprised if the standard UNIX random number generator provided better security.

    • by tlhIngan ( 30335 ) <slashdot@worf.ERDOSnet minus math_god> on Monday September 10, 2007 @11:54AM (#20540781)

      As an embedded engineer, I've encountered numerous cases where power cycling RAM did not alter the contents.

      In fact, I've seen systems boot and run even after the power was cut for several seconds. Some types of SRAM and SDRAM have the ability to retain an (imperfect) memory image even at very low voltage levels. Sure, it's not guaranteed to be accurate by the manufacturer, but RAM "images" are a pretty well known phenomenon. In some cases, the contents of memory can be reconstructed even after the computer has been powered off and removed to a forensic laboratory.

      This is not random at all. In fact, it's more likely to produce an easily exploitable RNG than anything else; I would not be at all surprised if the standard UNIX random number generator provided better security.


      I've had this bite me, and exploited it.

      It bit me when booting into Windows CE - you'd power cycle the thing, and the OS would boot with the old RAM disk you had - we'd gotten to the point where we'd have the bootloader wipe the kernel memory so the data structures were all corrupted by the time the OS was trying to decide between mounting the RAM disk (object store) and starting fresh. It turns out that the longer an image is unchanged in RAM, the more likely the cells would be biased such that if you cycle the power on them, they're more likely to lean towards the way they were before power was cut.

      The time I exploited it, I didn't have any way of logging. Logging to serial port caused issues (timing-sensitive code), so I logged to memory (and no, I had no filesystem running, so I couldn't log to file). My trick was to simply log to a circular RAM buffer. When it crashed, I would just power cycle and dump the RAM buffer. Even though the data was fresh, it was enough to make out what my debug message was trying to say (almost always perfect). This was readable after a brief power cycle, and was still readable after turning power off for nearly a minute. The characters got corrupted, but since it was regular ASCII, you could still make out the words.
    • by nickovs ( 115935 ) on Monday September 10, 2007 @12:16PM (#20541137)
      There are a couple of things to note here. Firstly, SDRAM and SRAM behave very differently. Synchronous dynamic RAM can retain charge in the capacitors for quite some time after being powered down and there is very little one can do about it, but the paper discusses static RAM. With static RAM there is a difference between being "powered off" and having the Vcc rail clamped to ground. Active clamping of the power line is much more effective at clearing the RAM than even just disconnecting it from the power supply, for reasons which become obvious when you look at a classic six transistor CMOS RAM circuit [wikipedia.org]. Without clamping, bias will remain for exactly the same reason that SRAM doesn't consume much power; current only flows when the data changes.

      As for it being a good RNG; the state of RAM on power-up is probably a lousy "random number generator", but the statistics in the paper suggest it is a fairly good "source of randomness". There's a big difference between bias and unpredictability (think about dice with '1' on five of the sides and '0' on the remaining side). You wouldn't want to use the state without putting it through a compression function first, but it's a much better seed than using clock() [berkeley.edu]!
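      The classic illustration of getting unbiased output from a biased-but-unpredictable source is von Neumann's pairing trick; a small sketch, with the five-sides-out-of-six bias mirroring the dice example above (a real design would use a stronger compression function, as the parent says):

      import random

      def biased_bit() -> int:
          """A die with '1' on five sides and '0' on one: heavily biased, still unpredictable."""
          return 1 if random.randrange(6) < 5 else 0

      def von_neumann_bit() -> int:
          """Look at pairs of source bits and keep the first bit only when the pair differs.
          The output is unbiased as long as the source bits are independent."""
          while True:
              a, b = biased_bit(), biased_bit()
              if a != b:
                  return a

      bits = [von_neumann_bit() for _ in range(10_000)]
      print(sum(bits) / len(bits))  # close to 0.5 despite the 5/6 input bias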
  • This technique is only useful for deeply embedded systems where you have control of the hardware from power-on and are able to fingerprint your SRAM. PCs don't really have user-accessible SRAM (except for on-chip memory in SCSI or Ethernet controllers). Even if this were applicable to DRAM, by the time your OS loads, your DRAM state is already defined. It's a shame Intel's hardware RNG implemented in their firmware hubs (82802Ax) didn't catch on more.
  • by operagost ( 62405 ) on Monday September 10, 2007 @10:43AM (#20539605) Homepage Journal
    You can never be sure [random.org].
  • by Quila ( 201335 ) on Monday September 10, 2007 @10:43AM (#20539615)
    Learn How To Use Capital Letters At The Beginning Of Sentences!
  • by fubob ( 7626 ) on Monday September 10, 2007 @10:44AM (#20539633) Homepage
    We were surprised to suddenly get attention for this paper, but apparently Slashdot readers are watching the security seminar at UMass Amherst.

    Anyhow, we will be answering questions in this thread. So if you have any questions, post them here and Dan Holcomb will get back to you as soon as he can.

    Cheers,
    -Kevin Fu
  • by rpp3po ( 641313 ) on Monday September 10, 2007 @10:55AM (#20539883)
    The original paper is much better than CmdrTaco's quick conclusions.
    The described method is ONLY for SRAM (static RAM), not DRAM, not SDRAM. You can find this on RFID chips and in a CPU's cache, not in main RAM. As there is no way to access a CPU's cache uninitialized, I can't see why this should be useful.
    If you have to modify a CPU first, to allow access to its uninitialized caches (think about all the unwanted implications), it's much cheaper to just give it a thermal diode and a register to poll (as most modern CPUs already have).
    After all, the described method is just another way of collecting thermal noise. As RFIDs are custom designs most of the time, there too it would be cheaper to just use a thermal diode.
    The only application for this would be if you had to develop strong crypto for legacy RFID chips.
    Slashdot stories get worse by the day.
    • It's conceivable that at some point in the future there would be some kind of memory that would become popular on general-purpose PC's for which this technique would work. But even then, there are some other reasons why I'm thinking you wouldn't want to use this technique for a PC (as opposed to an RFID):

      1. Users don't want to power-cycle their machines several times, and modify their BIOSes, so that they can install someone's crypto software.
      2. The code to use this technique would either work or not work for
  • It's an interesting paper, and as a means of getting randomness for RFID processors it probably works well, but I'm not sure it's that cheap for most purposes. If you're having to build a new circuit to generate randomness then using an op-amp to compare the noise out of a pair of zener diodes is likely to be cheaper. If you already have an audio input you're not using you can keep that really cheap. Of course if you want really good randomness you should use an old smoke detector [slashdot.org].
  • by PPH ( 736903 )
    Post a question on 'Ask Slashdot'.
  • If not implemented in hardware, I do see some problems, like how to get access to the page as a normal user. In software there are easier ways (a sketch of combining them follows this list)...

    1. IRQs on the system since it started, for one or more devices
    2. CPU serial number
    3. the number of CPU cycles that have passed since the seed generator started (a bit like chaos theory if you have many processes running)
    4. uptime of the system
    5. local time (does have a little randomness due to variations in the local clock circuit)
    6. serial number of the hard drive
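    A rough sketch of stirring weak sources like these into one seed by hashing them together (the particular sources read here are stand-ins for the list above, the /proc path is Linux-only, and none of this is cryptographically strong on its own):

    import hashlib
    import os
    import time

    def weak_seed() -> bytes:
        """Mix several low-entropy software sources into one seed by hashing."""
        h = hashlib.sha256()
        h.update(str(time.time_ns()).encode())        # local clock (limited granularity)
        h.update(str(time.monotonic_ns()).encode())   # uptime-like counter
        h.update(str(os.getpid()).encode())           # process id
        try:
            with open("/proc/interrupts", "rb") as f:  # IRQ counts since boot (Linux)
                h.update(f.read())
        except OSError:
            pass
        return h.digest()

    print(weak_seed().hex())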
  • Linux already uses hardware entropy sources like hard drive seek times and peripheral input to generate secure random numbers:

    http://en.wikipedia.org/wiki//dev/random [wikipedia.org]

    Another existing method of generating secure random numbers, used by the Java VM, is starting and stopping threads and gathering statistics about how the OS allocates time to those threads, which has been shown to be fairly unpredictable.

    Given the flaws with the method outlined by other posters... sounds like this really doesn't offer anything
  • butterflies and artichokes to me.
  • HotBits (Score:3, Informative)

    by The -e**(i*pi) ( 1150927 ) on Monday September 10, 2007 @01:36PM (#20542457)
    The only way I know of generating truly random numbers (not pseudorandom) is HotBits, which works on the principle of single radioactive atoms decaying after a perfectly random, in every sense of the word, time. http://www.fourmilab.ch/hotbits/ [fourmilab.ch]
    • Re: (Score:2, Insightful)

      by JoelKatz ( 46478 )
      There are quite a few ways of generating truly random numbers. They all basically boil down to three basic mechanisms. One is radioactive decay. Another is thermal noise. The last is direct quantum effects.

      They vary in quality, but it really doesn't matter. With the proper post-processing, they all provide true randomness that is basically as good as randomness is possible to be.

      Radioactive decay tends to be the most expensive and the least practical. Shot noise in a reverse-biased zener diode is generally
  • Just because it is hardware does not mean it is "true" randomness.
    TFA did not say just how random it is.
    The idea that the data is compromised enough that significant portions of it can be used for fingerprinting (which is close to the opposite of true randomness, since it is based on a pattern) sounds to me like the more random components are in fact not so random.
    In other words, the jitter down near the threshold of signal discrimination may look truly random but may actually look a lot like the sa
  • by ironring ( 598705 ) on Monday September 10, 2007 @03:01PM (#20543831)
    This is a bit of old news. I have already authored and been granted several patents in this area.
    6,906,962 Method for defining the initial state of static random access memory
    6,828,561 Apparatus and method for detecting alpha particles
    6,738,294 Electronic fingerprinting of semiconductor integrated circuits
    I have several other ideas for application of this technology and would be happy to discuss if someone is interested.
    Paul
    • (1) please tell me that you're joking, I don't have time to look up patent numbers right now.

      (2) if you're not joking, were you some key figure at NE2 encryption, noted by an AC a few posts up?

  • I've assembled a small SRAM memory circuit as part of a computer and I've observed at least part of the behaviors he describes. In the circuit, both my address counters always initialize to zero. The first byte of the RAM chip is always 11111111, the second always 00000000, and after that they are apparently random. But if you're going to *power-cycle your RAM* to get randomness, wouldn't it be simpler to just use a hardware stream generator?
  • The article makes a short note on metastability [wikipedia.org], but wouldn't this be more appropriately applied to a register array of flip-flops, which are much more susceptible to falling into a metastable state, instead of a ram? If he's using the SRAM (he never clarifies), he's counting on the charge of a single capacitor being randomly dispersed in order to enforce randomness, not the properties of the transistors at all. Using flip-flop circuits by violating their setup/hold times seems like it'd be more effective
