
Lowering entropy


Posted May 14, 2015 15:07 UTC (Thu) by DigitalBrains (subscriber, #60188)
In reply to: Lowering entropy by dlang
Parent article: Random numbers from CPU execution time jitter

> how do you decide that you need "128 shannons of entropy for my crypto"?

I'm going by the principle that the only secret thing about my crypto application is its key. I'm assuming for the moment that my attacker is able to exhaustively search those 96 shannons, or to reduce the search space far enough that an exhaustive search of what remains becomes feasible.
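To put a rough number on that gap (my own toy arithmetic, not something from the article): dropping from 128 shannons to 96 makes the exhaustive search cheaper by a factor of 2^32.

```python
# Toy illustration: how much easier is a 96-shannon search than a
# 128-shannon one? Each shannon lost halves the search space.
full_space = 2 ** 128      # keys to try with the full 128 shannons
reduced_space = 2 ** 96    # keys to try after 32 shannons are eliminated

speedup = full_space // reduced_space
print(speedup)             # 4294967296, i.e. about 4.3 billion times easier
```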

Because only my key is unknown, I'm assuming the attacker can reproduce the deterministic portions.

When you argue that mixing in bad-quality randomness is not a problem because there's still plenty left, this seems like the bald man's paradox (a form of the sorites paradox). If you have a full set of hair (good-quality randomness), and some individual hairs fall out (bad randomness), you still have a full set of hair. Fine, so mixing in some bad sources doesn't make you go bald. But at some point, if enough individual hairs fall out, you are going bald: producing bad-quality randomness.

So I think that if you suspect this CPU execution-time jitter produces only 0.2 shannons per bit, you should not use it as if it delivers a full shannon per bit. You can still use it, but you shouldn't pretend it has more information content than it does. And if you don't feel confident giving a reliable lower bound on the amount of entropy the method delivers, you might even be better off not using it at all. Better safe than sorry.
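As a back-of-the-envelope sketch of what conservative crediting could look like (the `Pool` class and the 0.2 figure are my own illustration, not the kernel's actual entropy pool): mix every source in, since mixing can never lower the pool's entropy, but only credit each source for the entropy you conservatively believe it delivered.

```python
import hashlib

class Pool:
    """Toy hash-based entropy pool with conservative entropy crediting."""

    def __init__(self):
        self.state = b"\x00" * 32
        self.credited_shannons = 0.0

    def mix(self, data: bytes, shannons_per_bit: float):
        # Mixing in data can never *lower* the entropy of the pool,
        # but we only credit the estimated entropy density of the
        # source, not a full shannon per bit.
        self.state = hashlib.sha256(self.state + data).digest()
        self.credited_shannons += len(data) * 8 * shannons_per_bit

pool = Pool()
pool.mix(b"\x42" * 16, shannons_per_bit=0.2)  # jitter: 128 bits in, credit 25.6
pool.mix(b"\x13" * 16, shannons_per_bit=1.0)  # trusted source: credit 128.0
print(pool.credited_shannons)                 # 153.6
```

The jitter source still contributes whatever real entropy it has; the pool just refuses to claim more than the 0.2-per-bit lower bound on its behalf.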

It's about giving an application what it expects, about quantifying what you mean when you say you are using a certain amount of randomness. A crypto application requesting 128 shannons of entropy does so because its designers decided that is a good amount of entropy to use. There are always margins built in, so it might be safe to give it only 96 shannons. But you're eating away at the built-in margins, and at some point you're going past them.

The main point I tried to make is that I agree with commenters saying that you can't lower the entropy by mixing in determinism, but that this is beside the point. Other than that, I think this is a really complicated subject and I'm not an expert at all, just an interested hobbyist.

> [...] unless the attacker knows/controls the deterministic data that was mixed in, it's still effectively random as far as the attacker is concerned.

I think that when you say something is not a good-quality source of randomness, you're effectively saying you suspect someone could predict part of its output. So yes, then there are attackers who might know the deterministic data to some extent. They can use this knowledge to reduce their search space. It's still only a small reduction; they still have a long way to go.
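To make "a small reduction" concrete (the figures here are hypothetical, chosen only for illustration): suppose the attacker can predict enough of the mixed-in deterministic data to pin down the equivalent of 8 bits of a 128-shannon output.

```python
import math

# Hypothetical figures: 8 of 128 shannons are predictable to the attacker.
total_shannons = 128
predicted_shannons = 8

reduction_factor = 2 ** predicted_shannons             # search is 256x cheaper
remaining_space = 2 ** (total_shannons - predicted_shannons)

print(reduction_factor)            # 256
print(math.log2(remaining_space))  # 120.0 shannons still to search
```

A 256-fold speedup sounds dramatic, but 120 shannons of remaining search space is still far beyond exhaustive search.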




Copyright © 2026, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds