From: Ivan Matek (libbooze_at_[hidden])
Date: 2024-12-08 19:27:16
Hi everyone,
The implementation uses a uint64_t into which it sums the hashes of the
individual elements (each computed with a "fresh" hasher, so ordering does not
matter); it then hashes that sum and the range size into the originally passed
hasher.
I see no problem with this logic, since it cleverly exploits the fact that
unsigned overflow on uint64_t is well-defined (it wraps modulo 2^64) and that
addition is commutative.
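If I understood the scheme correctly, it could be sketched roughly like this.
This is a minimal self-contained illustration, not the actual implementation:
toy_hash and unordered_range_digest are hypothetical names, and the toy mix
function (the splitmix64 finalizer) stands in for whatever real hash algorithm
is plugged in:

```cpp
#include <cstdint>
#include <vector>

// Toy stand-in for a "fresh" per-element hasher (splitmix64 finalizer).
// Hypothetical; the real code would run a full hash algorithm per element.
static std::uint64_t toy_hash(std::uint64_t x) {
    x += 0x9E3779B97F4A7C15ULL;
    x = (x ^ (x >> 30)) * 0xBF58476D1CE4E5B9ULL;
    x = (x ^ (x >> 27)) * 0x94D049BB133111EBULL;
    return x ^ (x >> 31);
}

// Order-independent digest of a range: sum the per-element hashes (unsigned
// wraparound is well-defined, addition is commutative), then feed the sum
// and the range size into a final "outer" hashing step. The exact way the
// sum and size are combined here is my guess, not the library's.
static std::uint64_t unordered_range_digest(const std::vector<std::uint64_t>& v) {
    std::uint64_t sum = 0;
    for (auto e : v)
        sum += toy_hash(e); // each element hashed from a fresh state
    return toy_hash(sum ^ toy_hash(static_cast<std::uint64_t>(v.size())));
}
```

Permuting the range leaves the sum, and hence the digest, unchanged, which is
the whole point of the construction.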
But I wonder whether this preserves the cryptographic quality of the original
hash.
For example, if we use SHA-2, it seems to me that finding collisions becomes
easier, since the sum is "only" 64 bits and the length is "only" 64 bits (or
32 on some systems), and possibly even less if the attacker knows the
unordered range was not very long, e.g. that the number of items was less
than 10000.
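To make my worry concrete: the outer hash only ever sees the 64-bit sum (plus
the size), so two different ranges whose per-element hash sums coincide
collide no matter how strong the outer hash is, and a birthday search finds
such sums in roughly 2^(w/2) work for a w-bit sum. The sketch below
demonstrates the principle with a hypothetical one-line mix truncated to 16
bits so the search finishes instantly; the real scheme's 64 bits would take
on the order of 2^32 work, far below SHA-2's nominal collision resistance:

```cpp
#include <cstdint>
#include <unordered_map>
#include <utility>

// Hypothetical per-element hash: a one-line 64-bit mix, standing in for a
// real hash function whose output we truncate to `bits` bits.
static std::uint64_t mix(std::uint64_t x) {
    x *= 0x9E3779B97F4A7C15ULL;
    return x ^ (x >> 32);
}

// Birthday search: find two distinct elements whose truncated hashes are
// equal. The single-element ranges {a} and {b} then have identical
// intermediate sums, so the final digests collide regardless of the outer
// hash. Expected work is about 2^(bits/2) trials.
static std::pair<std::uint64_t, std::uint64_t> find_collision(unsigned bits) {
    const std::uint64_t mask = (1ULL << bits) - 1; // assumes bits < 64
    std::unordered_map<std::uint64_t, std::uint64_t> seen;
    for (std::uint64_t x = 0;; ++x) {
        auto [it, inserted] = seen.emplace(mix(x) & mask, x);
        if (!inserted)
            return {it->second, x}; // earlier input with the same truncated hash
    }
}
```

With bits = 16 this returns after a few hundred trials; the same argument
scaled to 64 bits is why I suspect the 64-bit sum caps the collision
resistance of the whole construction.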
Does somebody know more about this particular domain and could point me to
some resource to learn more? I know this scheme was in the original proposal,
but it is possible the original proposal was not focused only on
cryptographically secure algorithms.
I was unable to google any guarantees, maybe because I don't know what
keywords to use. :)
Boost list run by bdawes at acm.org, gregod at cs.rpi.edu, cpdaniel at pacbell.net, john at johnmaddock.co.uk