
Efficient Entropy Accumulation and Random Number Generation Research
An exploration of super-efficient entropy accumulation and the random number generation infrastructure of Windows 10: modeling the entropic inputs, asking whether rotation-based accumulation works, and analyzing which rotation numbers are effective at accumulating entropy from independent 2-monotone sources and uniform distributions.
Presentation Transcript
No Time to Hash: On Super-Efficient Entropy Accumulation
Zhiye Xie (NYU Shanghai), Yevgeniy Dodis (New York University), Siyao Guo (NYU Shanghai), Noah Stephens-Davidowitz (Cornell University)
Win10's Random Number Generation Infrastructure [Fer19]
Step I: Entropy Accumulation (32-bit CPU): 32-bit entropy samples are folded into a 32-bit state.
Step II: Randomness Extraction (32-bit CPU): SHA-512 extracts 64 (true) random bits from the accumulated state.
Win10's Entropy Accumulation: S ← rot_{α,n}(S) ⊕ X
α: rotation number
S_{i+1} = rot_{5,32}(S_i) ⊕ X_i
rot_{5,32}(s_1, s_2, s_3, s_4, s_5, …, s_32) = (s_28, s_29, s_30, s_31, s_32, s_1, …, s_27)
Win10's rotation numbers: rot_{5,32} and rot_{19,64}
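A minimal sketch of this update rule in Python. The bit-ordering convention is an assumption on my part: treating s_1 as the most significant bit, the displayed output of rot_{5,32} corresponds to a right-rotation by 5 positions.

```python
def rotr(x: int, a: int, n: int) -> int:
    """Cyclically rotate the n-bit word x right by a positions."""
    a %= n
    mask = (1 << n) - 1
    return ((x >> a) | (x << (n - a))) & mask

def accumulate(samples, a: int = 5, n: int = 32) -> int:
    """Entropy-accumulation update S <- rot_{a,n}(S) XOR X for each sample X."""
    s = 0
    for x in samples:
        s = rotr(s, a, n) ^ (x & ((1 << n) - 1))
    return s
```

Note how a single set bit keeps moving to fresh positions under repeated rotation, which is what lets low-entropy inputs spread across the whole state.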
Our Questions
S ← rot_{α,n}(S) ⊕ X
Does rotation work? Are Microsoft's choices of rot_{5,32} and rot_{19,64} reasonable? Can we find a better choice?
Modeling of Entropic Inputs
X_i (interrupt timings)
Modeling Assumption: the X_i are independent, and each X_i follows a distribution whose lower-order bits have high entropy.
Modeling of Entropic Inputs
X_i (interrupt timings)
Examples of 2-monotone distributions: the discrete Gaussian distribution and the discrete exponential distribution. "Lower-order bits have high entropy" is captured by the uniform distribution over the first k bits.
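As a numeric illustration of this modeling assumption, the lower-order bits of a discrete exponential (geometric) distribution are nearly uniform even when the distribution itself is far from uniform. The parameter q = 0.01 below is an arbitrary stand-in for an interrupt-timing distribution, not a value from the talk.

```python
import math

q = 0.01   # success probability of a geometric distribution (illustrative choice)
b = 3      # look at the b lowest-order bits
m = 2 ** b

# P(X mod 2^b = r) for X ~ Geometric(q) on {0, 1, 2, ...}:
# sum over j of q*(1-q)^(j*m + r)  =  q*(1-q)^r / (1 - (1-q)^m)
low_bits = [q * (1 - q) ** r / (1 - (1 - q) ** m) for r in range(m)]

# Min-entropy of the low-order bits: close to the maximum of b = 3 bits.
min_entropy = -math.log2(max(low_bits))
```

Here the full distribution has min-entropy −log₂(q) ≈ 6.6 bits spread over a huge support, yet the 3 lowest-order bits alone already carry almost 3 bits of min-entropy.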
Question I: Does Rotation Work?
Our Results I: Rotation Works!
rot_{α,n} accumulates Ω(n) bits of entropy from n independent X_i ~ 2-monotone with (min-)entropy > 1, provided gcd(α, n) = 1.
Remark: rot_{α,n} can be generalized to any cyclic permutation.
Question II: Which Rotation Numbers Are Good?
Our Results II: Comparing Different α
S ← rot_{α,n}(S) ⊕ X
Covering Number C_{α,n}(k): the number of samples needed for rot_{α,n} to accumulate n bits of entropy from the uniform distribution over the first k bits.
Our Results II: Comparing Different α
C_{α,n}(k) samples suffice to accumulate nearly full entropy from 2-monotone sources with (min-)entropy k.
The smaller C_{α,n}(k), the better rot_{α,n}.
Goal: find α such that rot_{α,n} has relatively small C_{α,n}(k) for all k.
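A small sketch of the covering number under one natural reading of the definition: the minimum number of α-shifted copies of the first-k-positions window needed to cover all n bit positions. The shift direction is an assumption (covering is symmetric under reversing it).

```python
def covering_number(alpha: int, n: int, k: int):
    """Minimum t such that the windows {0..k-1} + i*alpha (mod n),
    for i = 0..t-1, together cover all n bit positions."""
    covered = set()
    for t in range(1, n + 1):
        covered |= {(j + (t - 1) * alpha) % n for j in range(k)}
        if len(covered) == n:
            return t
    return None  # e.g. k == 0: the window never covers
```

Under this reading, covering_number(5, 32, 16) evaluates to 5 while covering_number(7, 32, 16) evaluates to 4, consistent with the comparison of α = 5 and α = 7 below.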
Our Results II: Comparing Different α
α = 5 (Microsoft's choice) is generally worse than α = 7.
Our Results II: Comparing Different α
α = 19 (Microsoft's choice) is reasonable.
Question III: A Better Choice?
Our Results III: A New Recommendation
Bit-reversed Rotation: tor_n (n is a power of 2)
Example: tor_8
int:             0   1   2   3   4   5   6   7
binary:          000 001 010 011 100 101 110 111
reversed binary: 000 100 010 110 001 101 011 111
int:             0   4   2   6   1   5   3   7
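The table above can be reproduced with a small bit-reversal helper:

```python
def bit_reverse(i: int, bits: int) -> int:
    """Reverse the binary representation of i, viewed as a bits-bit string."""
    r = 0
    for _ in range(bits):
        r = (r << 1) | (i & 1)
        i >>= 1
    return r

# Bit-reversal permutation on {0, ..., 7}: [0, 4, 2, 6, 1, 5, 3, 7]
table = [bit_reverse(i, 3) for i in range(8)]
```

For instance, 3 = 011 in binary reverses to 110 = 6.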
Our Results III: A New Recommendation
Bit-reversed Rotation: tor_n (n is a power of 2), acting on positions i < n as tor_n(i) = ρ_n(ρ_n(i) + 1 mod n), where ρ_n is the bit-reversal permutation.
Example: tor_8
C_{tor,n}(n/2) = 2 and C_{tor,n}(n/4) = 4; more generally, C_{tor,n}(n/2^j) = 2^j (optimal).
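A sketch checking the claimed covering numbers on the tor_8 example, assuming tor_n acts on bit positions as i ↦ ρ_n(ρ_n(i) + 1 mod n) (my reading of the definition via the bit-reversal permutation):

```python
def bit_reverse(i: int, bits: int) -> int:
    """Reverse the binary representation of i, viewed as a bits-bit string."""
    r = 0
    for _ in range(bits):
        r = (r << 1) | (i & 1)
        i >>= 1
    return r

def tor(i: int, n: int) -> int:
    """Bit-reversed rotation on positions: rho(rho(i) + 1 mod n), n a power of 2."""
    bits = n.bit_length() - 1
    return bit_reverse((bit_reverse(i, bits) + 1) % n, bits)

def covering_number_perm(perm, n: int, k: int):
    """Minimum t such that t successive images of the window {0..k-1}
    under the position permutation perm cover all n positions."""
    covered, window = set(), set(range(k))
    for t in range(1, n + 1):
        covered |= window
        if len(covered) == n:
            return t
        window = {perm(j, n) for j in window}
    return None
```

For n = 8 this gives covering number 2 for the window of size n/2 = 4 and 4 for the window of size n/4 = 2, matching C_{tor,n}(n/2^j) = 2^j.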
Our Results III: A New Recommendation bit-reversed rotation compares favorably with rotations
Proof Sketch: Why Rotation Works?
Proof Sketch: Why Rotation Works?
S ← π(S) ⊕ X_i, X_i ~ D, π = rot_{α,n}
Proof techniques:
D_t: distribution of the state after receiving t independent samples X_i ~ 2-monotone.
D_t = D ∗ π(D) ∗ ⋯ ∗ π^{t−1}(D)
Goal: lower-bound the entropy of D_t over {0,1}^n.
Tool: D̂_t(y) = D̂(y) · D̂(πᵀy) ⋯ D̂((πᵀ)^{t−1}y), where D̂(y) = E_{x~D}[(−1)^{⟨x,y⟩}].
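The Fourier product identity can be checked numerically on a toy instance. The word size n = 8, the rotation amount α = 3, and the geometrically decaying distribution below are arbitrary illustrative choices, not parameters from the talk.

```python
N = 8                      # toy word size
MASK = (1 << N) - 1
ALPHA = 3

def rotl(x, a=ALPHA, n=N):
    return ((x << a) | (x >> (n - a))) & MASK

def rotr(x, a=ALPHA, n=N):  # transpose (= inverse) of the rotation pi
    return ((x >> a) | (x << (n - a))) & MASK

# A toy distribution D over {0,1}^N with geometrically decaying weights.
w = [0.8 ** x for x in range(2 ** N)]
total = sum(w)
D = [p / total for p in w]

def bias(p, y):
    """Fourier coefficient: E_{x~p}[(-1)^{<x,y>}]."""
    return sum(px * (-1) ** bin(x & y).count("1") for x, px in enumerate(p))

# Distribution of S_t under S_t = pi(S_{t-1}) XOR X_t, with S_1 = X_1.
T = 3
Dt = D[:]
for _ in range(T - 1):
    nxt = [0.0] * (2 ** N)
    for s, ps in enumerate(Dt):
        for x, px in enumerate(D):
            nxt[rotl(s) ^ x] += ps * px
    Dt = nxt

# Compare bias(Dt, y) against the product of bias(D, (pi^T)^i y).
max_err = 0.0
for y in range(2 ** N):
    prod, z = 1.0, y
    for _ in range(T):
        prod *= bias(D, z)
        z = rotr(z)
    max_err = max(max_err, abs(bias(Dt, y) - prod))
```

The product structure is what makes the analysis tractable: each step contributes one factor D̂((πᵀ)^i y), so small per-coordinate biases multiply.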
Proof Sketch: Why Rotation Works?
S ← π(S) ⊕ X_i, X_i ~ D, π = rot_{α,n}
Main Lemma: Given a 2-monotone distribution D over {0,1}^n with min-entropy k, and y ∈ {0,1}^n with y_0 = 1, it holds that |D̂(y)| ≤ min{1, 2^{−k+1}}.
Proof Sketch
X_i: 2-monotone distribution with (min-)entropy k bits; D_t: distribution after receiving t independent samples.
D̂_t(y) = D̂(y) · D̂(πᵀy) ⋯ D̂((πᵀ)^{t−1}y)
If gcd(α, n) = 1, then among (y, πᵀy, …, (πᵀ)^{n−1}y) there are exactly |y| vectors whose first bit equals 1 (|y| = Hamming weight of y).
Main Lemma: |D̂(y)| ≤ 2^{−k+1} if y_0 = 1, hence |D̂_n(y)| ≤ 2^{(−k+1)|y|}.
Σ_{y ∈ {0,1}^n} 2^{(−k+1)|y|} = (1 + 2^{−k+1})^n, so H(D_n) ≥ n(1 − log₂(1 + 2^{−k+1})).
Connection With Covering Numbers
(y, πᵀy, …, (πᵀ)^{n−1}y) strongly condensing: |D̂_n(y)| ≤ 2^{−k+1} — n steps.
(y, πᵀy, …) strongly condensing: |D̂_t(y)| ≤ 2^{−n/2} — t = C_{α,n}(n/k) steps.
From empirical data, C_{α,n}(k) steps are already enough for strong condensing.
Conclusion
Rotation works: if gcd(α, n) = 1, rot_{α,n} accumulates a linear amount of entropy from n independent 2-monotone sources with entropy > 1.
Covering numbers characterize rotations: C_{α,n}(k) samples suffice to accumulate nearly full entropy from 2-monotone sources with entropy k.
A better choice: bit-reversed rotation.
Appendix
Efficient Implementation: tor_n = ρ_n ∘ rot_{1,n} ∘ ρ_n (ρ_n: the bit-reversal permutation, ρ_n = ρ_n^{−1}).
Instead of computing S ← tor_n(S) ⊕ X directly, maintain T = ρ_n(S) and update T ← rot_{1,n}(T) ⊕ ρ_n(X); then S = ρ_n(T).
This saves one permutation per update.
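A sketch of this trick (the direction of rot_{1,n} as a left rotation is an assumption): maintaining T = ρ_n(S) replaces the two bit-reversal permutations of a direct update with a single one applied to the input sample.

```python
def bit_reverse(i: int, bits: int) -> int:
    """Reverse the binary representation of i, viewed as a bits-bit string."""
    r = 0
    for _ in range(bits):
        r = (r << 1) | (i & 1)
        i >>= 1
    return r

def rho(x: int, n: int) -> int:
    """Apply the bit-reversal permutation to the positions of an n-bit word."""
    bits = n.bit_length() - 1
    out = 0
    for i in range(n):
        if (x >> i) & 1:
            out |= 1 << bit_reverse(i, bits)
    return out

def rotl1(x: int, n: int) -> int:
    return ((x << 1) | (x >> (n - 1))) & ((1 << n) - 1)

def tor_update(s: int, x: int, n: int) -> int:
    """Direct update S <- tor_n(S) XOR X: two bit-reversal permutations."""
    return rho(rotl1(rho(s, n), n), n) ^ x

def shadow_update(t: int, x: int, n: int) -> int:
    """Update on T = rho(S): one bit-reversal permutation (on the input X)."""
    return rotl1(t, n) ^ rho(x, n)
```

Since ρ_n is an involution and commutes with XOR, ρ_n(T) equals S after every step, so the cheaper shadow state stays in sync with the direct one.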