Cryptographic Property of Random Output
In the realm of cryptography, “full entropy” is a property attributed to the output of a random number generator. An output is said to possess full entropy if it is practically indistinguishable from the output of a theoretical, perfect random number source; in practical terms, an n-bit output carries approximately n bits of entropy.
This terminology is not merely academic; it is employed extensively in the standards set forth by the National Institute of Standards and Technology (NIST), specifically in the NIST SP 800-90A and NIST SP 800-90B publications on random number generation. When an output achieves full entropy, the entropy per bit is remarkably close to one, often expressed as $1-\epsilon$. According to NIST, for practical purposes, this $\epsilon$ is a vanishingly small number, typically less than $2^{-32}$.
It’s worth noting that some discussions use the term “full entropy” to describe the absolute ideal of a random bit string, where each bit inherently possesses one bit of entropy. In this more rigorous, theoretical sense, achieving “100% full entropy” in the tangible, real world is, in fact, an impossibility.
Definition
The formal definition of full entropy hinges on a conceptual “distinguishing game.” Imagine an adversary armed with seemingly unlimited computational power. This adversary is presented with two distinct collections of random numbers. Each collection contains W elements, and each element is a bit string of length n. One collection is the “ideal” set, comprising bit strings generated by a theoretically perfect random number generator. The other set, the “real” collection, contains bit strings produced by a practical, real-world random number source, typically after those raw outputs have been processed by a randomness extractor . Full entropy is achieved for specific values of W and a positive parameter $\delta$ if the adversary, despite their immense power, cannot correctly identify which set is the “real” one with a probability exceeding $\frac{1}{2} + \delta$. This $\delta$ represents the adversary’s advantage, and a smaller $\delta$ signifies a higher degree of confidence in the randomness of the real output.
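The distinguishing game can be illustrated with a toy simulation. The set sizes, the per-bit bias, and the mean-count adversary below are illustrative choices, not NIST parameters: a slightly biased “real” source hands even a simple statistical adversary a measurable advantage $\delta$ over random guessing.

```python
import random

def make_set(w, n, p_one, rng):
    """Generate w bit strings of length n, each bit being 1 with probability p_one."""
    return [[1 if rng.random() < p_one else 0 for _ in range(n)] for _ in range(w)]

def guess_real(set_a, set_b):
    """Statistical adversary: pick as 'real' the set whose overall fraction
    of ones deviates more from 1/2 (this exploits the bias of the real source)."""
    def dev(s):
        flat = [b for row in s for b in row]
        return abs(sum(flat) / len(flat) - 0.5)
    return 0 if dev(set_a) > dev(set_b) else 1

rng = random.Random(0)
W, n, trials = 256, 64, 400
wins = 0
for _ in range(trials):
    real = make_set(W, n, 0.52, rng)    # "real" source with slight per-bit bias
    ideal = make_set(W, n, 0.50, rng)   # theoretically perfect source
    # Present the two sets in random order; the adversary must identify the real one.
    if rng.random() < 0.5:
        sets, actual = (real, ideal), 0
    else:
        sets, actual = (ideal, real), 1
    wins += guess_real(*sets) == actual
print(wins / trials)  # well above 1/2, so the advantage delta is large here
```

A real full-entropy claim requires that no adversary, even one checking far subtler statistics than a bit count, can do better than $\frac{1}{2} + \delta$ for a tiny $\delta$.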
Additional Entropy
The practical pathway to attaining this coveted full entropy involves a specific methodology. One begins by sourcing bit strings from an entropy source that are longer than the desired n bits. These longer bit strings are then subjected to a high-quality randomness extractor. This extractor’s function is to distill the raw, potentially imperfect randomness into a refined n-bit result. The “real” set of bit strings is then constructed from these processed results. The ideal elements, by their very nature, inherently possess n bits of entropy.
However, for the “real” set to satisfy the stringent definition of full entropy, the inputs to the conditioning function (the randomness extractor) must possess a higher min-entropy value, denoted as H. The number of “additional bits of entropy,” represented by the difference $H-n$, is not arbitrary. It is directly dependent on the chosen values for W and $\delta$. The following table illustrates some representative values, demonstrating the required additional entropy for different scenarios:
Minimum value of additional entropy ($H-n$) for representative choices of $W$ and $\delta$:

| $W$ | $\delta = 2^{-20}$ | $\delta = 2^{-10}$ |
|---|---|---|
| $2^{32}$ | 67.3 | 47.3 |
| $2^{48}$ | 83.3 | 63.3 |
| $2^{56}$ | 91.3 | 71.3 |
This table clearly indicates that as the adversary’s potential advantage ($\delta$) decreases, or as the size of the test set ($W$) increases, the required additional entropy from the source ($H-n$) climbs. It’s a stark reminder that genuine randomness is a resource that must be carefully managed and amplified.
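The oversample-then-condition pattern described above can be sketched as follows. The min-entropy rate, the 64-bit margin, and the use of `os.urandom` as a stand-in noise source are all illustrative assumptions; SHA-256 stands in for a vetted conditioning function.

```python
import hashlib
import os

N_OUT = 256   # desired output length n in bits
EXTRA = 64    # assumed safety margin H - n (the real value depends on W and delta)
H_RATE = 0.5  # assumed min-entropy per raw bit; measured per source in practice

# Oversample: gather enough raw bits that their total min-entropy is H = n + EXTRA.
raw_bits_needed = int((N_OUT + EXTRA) / H_RATE)   # 640 raw bits for this sketch
raw = os.urandom((raw_bits_needed + 7) // 8)      # stand-in for a real noise source

# Condition: SHA-256 distills the oversampled input into an n-bit result.
conditioned = hashlib.sha256(raw).digest()
print(len(conditioned) * 8)  # 256
```

The key point is that the conditioning input carries more min-entropy than the output length, so the n output bits can plausibly carry close to n bits of entropy each time.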
Randomness Extractor Requirements
It’s crucial to understand that not just any randomness extractor will suffice in this process. Consider, for instance, the Von Neumann extractor. While it commendably produces an unbiased output, it falters in its ability to decorrelate groups of bits. This is a significant drawback because the inputs to extractors often originate from entropy sources that exhibit serial correlation, meaning consecutive bits are not entirely independent. In such cases, the output bits of a simple Von Neumann extractor would also exhibit dependency, undermining the goal of true randomness.
Recognizing this, NIST, in its NIST SP 800-90B standard, meticulously defines “vetted conditioning components.” These are specific algorithms and methods proven to be effective in extracting randomness reliably. Among these are constructs like AES-CBC-MAC, which are engineered to handle the nuances of real-world entropy sources and produce outputs that meet the rigorous standards for cryptographic applications. The selection and implementation of the right extractor are as vital as the quality of the initial entropy source itself.