thermal noise?

Posted Dec 13, 2007 6:43 UTC (Thu) by nettings (subscriber, #429)
Parent article: On entropy and randomness

At least for machines equipped with a sound card or onboard sound interface (i.e. almost
everywhere, except some embedded or server machines), couldn't we just record some thermal
circuit noise and get 48k sufficiently random "pieces of entropy" per second to mix into
/dev/random?



thermal noise?

Posted Dec 13, 2007 10:51 UTC (Thu) by ekj (guest, #1524)

There's plenty of randomness for 99.99% of all computers; the remaining ones can easily add
some hardware entropy source, for example one based on the sound card as you suggest.

/dev/urandom is actually a lot STRONGER than plain sha, because it does use whatever real
entropy is available, and while the available entropy is sometimes low enough that /dev/random
would block, it is seldom -zero-.

Predicting the next number coming out of urandom is similar to predicting the next number
coming out of a scheme like this:

do:
   pool = sha(pool)
   output(pool[0])

Which would perhaps be doable if sha was severely broken. But there's an added complication:
Every once in a while, some -real- entropy from whatever source enters the pool via the rough
equivalent of:

pool = sha(pool xor real-random-data)

This should mess things up enough that -even- if sha is severely broken, predicting the
sequence is, essentially, impossible.
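
In rough Python, the whole scheme looks something like this (hashlib's sha1 and os.urandom
are just stand-ins here; the kernel's real pool and mixing code are more involved, so treat
this as a sketch of the shape only):

import hashlib, os

pool = os.urandom(20)  # toy seed; the kernel's pool is much larger

def next_byte(fresh=b''):
    global pool
    if fresh:
        # every once in a while, real entropy arrives and is folded in first
        mixed = bytes(a ^ b for a, b in zip(pool, fresh.ljust(20, b'\0')))
        pool = hashlib.sha1(mixed).digest()
    else:
        pool = hashlib.sha1(pool).digest()
    return pool[0]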

Our editor is right: If you are generating a keypair to use for a decade, by all means, use
real randomness. If you are doing anything less, use urandom and forget about it.

offtopic

Posted Dec 14, 2007 3:34 UTC (Fri) by ikm (subscriber, #493)

> pool = sha(pool)

I'm curious if there exists such a value of pool that the 'pool == sha(pool)' condition would
be true, effectively collapsing the rng's period to 1 (it would repeat that single value
forever). What is the actual period of such a generator, the one based on a sha hash? Is there
some info to read about this? A proof that the period is constant for any initial seed values,
or a proof that it's not? Any pointers would be greatly appreciated.

p.s. I understand that the external entropy would (eventually) shift the generator out of the
pit, but I am more concerned about what happens when there is no external entropy available,
on theoretical grounds.
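
One way to get a feel for this empirically is to truncate the hash so that its cycles become
reachable, then measure them. A rough Python sketch using Floyd's cycle-finding algorithm on
sha1 cut down to 16 bits (purely illustrative; it says nothing about the full 160-bit
function):

import hashlib

def f(x):
    # iterate sha1 truncated to 16 bits, so that cycles are findable
    return hashlib.sha1(x).digest()[:2]

def cycle_length(seed):
    # Floyd's tortoise-and-hare: find the cycle this orbit falls into
    slow, fast = f(seed), f(f(seed))
    while slow != fast:
        slow, fast = f(slow), f(f(fast))
    # slow is now somewhere inside the cycle; walk once around it
    length, probe = 1, f(slow)
    while probe != slow:
        length, probe = length + 1, f(probe)
    return length

print(cycle_length(b'\x00\x00'))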

offtopic

Posted Dec 16, 2007 21:54 UTC (Sun) by dlang (subscriber, #313)

if pool == sha(pool) then the sha hash would be broken as there would be one situation where
knowing the output would tell you what the input was.

offtopic

Posted Dec 17, 2007 19:09 UTC (Mon) by ikm (subscriber, #493)

You're joking, right? Here's another situation for you:

6768033e216468247bd031a0a2d9876d79818f8f == sha( 0000000000000000000000000000000000000000 )

Here, knowing the output (6768..8f8f) tells us what the input was (00000..0000)! There are
2^160 situations like this, not just one or two. As long as the only way to know the input for
some particular output is to try all the inputs, the hash is not broken. What difference would
it make if some unique value were both the argument and the result? I fail to see any.

As far as I understand, no one has ever stated that a crypto hash may never produce the same
output value as its input value. That's why I was asking in the first place.

offtopic

Posted Dec 18, 2007 3:18 UTC (Tue) by dlang (subscriber, #313)

every hash function I am aware of involves doing many iterations of the same function, mixing
the output back into the input for the next stage.

in fact, the only difference between sha1 and sha256 is the number of iterations (as I
understand it anyway)

if you ever have a hash produce its input as its output you end up in a loop where
additional iterations will always produce the same output.

I was incorrect before when I said the problem was knowing the input that caused the output.
the fact that multiple inputs produce the same output prevents that knowledge from being
useful (except in the case where the input is small enough that you can enumerate all of them
to produce a rainbow table that you can use in that specific situation)

offtopic

Posted Dec 18, 2007 4:10 UTC (Tue) by ikm (subscriber, #493)

Well, I personally have no idea how hashes work. As long as they comply with the usual
requirements (i.e. resistance to reversal and collision-finding), I'm quite happy. What I am
concerned about is their use as RNGs, as I can't find any papers on how mathematically sound
it is to feed a hash's output back into its input to build an RNG. What would the period be?
Does it depend on the seed supplied? The usual requirements for hashes don't allow any
conclusions to be drawn on these questions; hashes are simply designed for a different
purpose. Using a hash to compress entropy from another source is probably ok (I doubt it
would make the data less random than it was), but looping it back onto itself seems
questionable to me.

Consider a simple example RNG, Marsaglia's SHR3 (a quick sketch of it follows below). It has
a period of exactly 2^32-1, which means it iterates through all the numbers in [1..2^32-1] in
some random-looking order. You can seed it with any value other than 0 and it will have
exactly the same period (while a seed of zero makes it produce zeroes forever). RNGs usually
have these properties well understood. Has anyone researched SHA1 in that regard? I don't
know, and I would like to. Not that all this stuff matters in real life, but it's still
interesting.

p.s. While I can't make google find any real stuff, I did find posts from other people
concerned with the same question, e.g. this thread. Someone there drew statistical conclusions
about the period based solely on the hash length, but those are statistical arguments, while
the algorithm itself is deterministic and not perfect.
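
p.p.s. For reference, SHR3 itself is tiny. In Python, with the (13, 17, 5) shift constants
from Marsaglia's xorshift paper (his various posts differ on the exact constants, so take
these as one published full-period choice):

def shr3(y):
    # one step of Marsaglia's 32-bit three-shift generator; with these
    # constants the period is 2^32-1 for any nonzero seed, and 0 is a
    # fixed point (a zero seed produces zeroes forever)
    y ^= (y << 13) & 0xFFFFFFFF
    y ^= y >> 17
    y ^= (y << 5) & 0xFFFFFFFF
    return y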

offtopic

Posted Dec 18, 2007 15:11 UTC (Tue) by njs (guest, #40338)

Okay, seriously, last crypto tutorial comment from me for a while, I swear...

There's a lot of research on how to generate good cryptographic PRNGs.  Some approaches use
hashes, some don't.  The analysis in general is utterly different from the analysis of
scientific PRNGs, for the natural reason that you don't care so much about a provably
insane period or provably insane equidistribution; but you do care a huge amount
about e.g. making sure that the internal state remains secret even when people can observe the
PRNG's output, that even if the current state *were* revealed this would not let you "run
the PRNG backwards" and work out what earlier states were, and that you are clever about how
you mix in new entropy from whatever external sources you have.  Of course, you'd *like* a
provably insane period too (to the extent that "period" is even meaningful when you are mixing
in entropy as you go), but it's very hard to get an algorithm that resists all cryptographic
analysis *but* is still amenable to period analysis...
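
To make the "run the PRNG backwards" point concrete, here is the sort of shape such designs
take -- a toy sketch in Python, emphatically not any real design: the output path and the
state update are separated, and the state is ratcheted through the one-way hash after every
output, so capturing today's state reveals nothing about yesterday's outputs.

import hashlib

class ToyCPRNG:
    def __init__(self, seed):
        self.state = hashlib.sha1(seed).digest()

    def next_block(self):
        out = hashlib.sha1(b'out' + self.state).digest()
        # ratchet: derive the new state through the hash, so an attacker
        # who captures self.state cannot recover earlier states
        self.state = hashlib.sha1(b'upd' + self.state).digest()
        return out

    def mix(self, entropy):
        # fold fresh external entropy into the state as it arrives
        self.state = hashlib.sha1(self.state + entropy).digest()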

Random posts on python-list are, unfortunately, pretty useless for grokking this stuff.  If
you really are curious to learn how CPRNGs are designed, one convenient place to start might
be Schneier's various online-accessible writings on the subject:
  http://www.schneier.com/paper-prngs.html
  http://www.schneier.com/paper-yarrow.html
  http://www.schneier.com/blog/archives/2007/11/the_strange...
(Yarrow in particular is a fun and interesting algorithm, though no longer quite considered
state of the art.  The last link is awesome for anyone who enjoys some geeky conspiracy, and
has lots more links.)

offtopic

Posted Dec 19, 2007 3:39 UTC (Wed) by ikm (subscriber, #493)

Yay! Thanks. You're right -- I've never thought much about the use of RNGs outside
Monte-Carlo simulations and such, and in other contexts the properties you've mentioned can
indeed matter a lot more than the typical equidistribution/period requirements.

I'll go dig into the links you've provided. Thanks again, your help is much appreciated.

offtopic

Posted Dec 20, 2007 17:57 UTC (Thu) by njs (guest, #40338)

Cool, glad to help.  Have fun!

(If you really want to get into this, I also highly recommend Ferguson and Schneier's
/Practical Cryptography/.  /Applied Cryptography/ is better known, but it's all like "so in
RSA you use modular arithmetic on primes in the following way" while /Practical Cryptography/
is all like "so if for some reason you want to reinvent SSL, here are the design trade-offs
you have to decide about, and here are the subtle bugs you're going to put in that will break
everything".)

offtopic

Posted Dec 18, 2007 14:43 UTC (Tue) by njs (guest, #40338)

> in fact, the only difference between sha1 and sha256 is the number of iterations (as I
> understand it anyway)

No -- SHA-1 has quite a different design from the SHA-2 family (which includes SHA-224,
SHA-256, SHA-384, and SHA-512).  In fact, SHA-256/224 have fewer rounds than SHA-1.  Not that
this matters to anyone except real cryptography geeks, but hey, in case you were curious.

Except, of course, that it's why the recent attacks against SHA-1 haven't generalized to SHA-2
yet (though the increased bit-length would probably protect them anyway).  It is unclear to
what extent this is coincidence and to what extent it is NSA Sneakiness.

> if you ever have a hash produce its input as its output you end up in a loop where
> additional iterations will always produce the same output.

True (at least for the simplest hash-based CPRNG design), but I'm pretty sure no-one has ever
found such an input/output pair, and finding one is very similar to accomplishing a preimage
attack, so I wouldn't worry about it much in practice.

thermal noise?

Posted Dec 13, 2007 14:08 UTC (Thu) by nix (subscriber, #2304)

The thermal circuit noise isn't as random as all that: it picks up interference from system
circuitry, which is as regular as anything.

It could certainly contribute some randomness, I'd think, but nowhere near 48 kbits per
second.

thermal noise?

Posted Dec 13, 2007 14:26 UTC (Thu) by NRArnot (subscriber, #3033)

48K samples/second, each 16 bits, of which the least significant bit should be random
regardless of how much patterned noise is being picked up from the system's innards. In
practice one would run some verification tests first. One wouldn't be relying on it being a
truly random bitstream, just on it containing a reasonable amount of entropy.
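
(The extraction step itself is trivial; whether those bits actually carry entropy is the open
question, hence the tests. A rough Python sketch, assuming the signed 16-bit samples are
already in a list of ints:)

def lsb_bytes(samples):
    # keep only the least significant bit of each 16-bit sample and
    # pack eight of them into each output byte
    bits = [s & 1 for s in samples]
    return bytes(sum(bit << i for i, bit in enumerate(bits[j:j + 8]))
                 for j in range(0, len(bits) - 7, 8))

One would still hash or otherwise whiten the result, and credit it with far fewer bits of
entropy than its raw length.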

thermal noise?

Posted Dec 13, 2007 16:49 UTC (Thu) by nix (subscriber, #2304)

I've tried it; you don't get anywhere near that much entropy. Quite a lot of it is strongly
correlated, at least with cheap sound cards, even in the least significant bits.

I wish this weren't true, but it is :/ Or at least it was in 2003, when I last looked at it.
(I still don't have any sound cards newer than that: I'm not made of money, and the newer
cards don't offer anything I can use; I don't have enough speakers or a suitably-shaped room
for surround sound...)

thermal noise?

Posted Dec 14, 2007 6:02 UTC (Fri) by bronson (subscriber, #4806)

Electrical engineers learn to be deeply fearful of harmonics.  They show up everywhere!
Thanks to this (the harmonics, not the fear), I think you'll find that /dev/urandom offers far
more random data than anything arriving over cheap hardware.

Well, unless that hardware was explicitly designed to avoid harmonics, like the transistor
noise RNG on some of VIA's chips.

thermal noise?

Posted Dec 25, 2007 12:15 UTC (Tue) by flok (guest, #17768)

What about installing either audio-entropyd or video-entropyd? For the first one, every cheap
sound card will do nicely.
http://www.vanheusden.com/aed/ and http://www.vanheusden.com/ved/

