proof of hash iterations.
if we store every hash generated by some hash-iterating system, we can verify probabilistically whether all the claimed hashes were actually computed. by recomputing a random sample of any given size and comparing it against the stored values, we get a probabilistic estimate of whether every hash in the list was computed correctly.
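as a sketch of this sampling idea (function names and the nonce-based iteration scheme are illustrative assumptions, not part of the original), a verifier could recompute a random subset of a fully stored hash list:

```python
import hashlib
import random

def compute_hash(prefix: bytes, nonce: int) -> bytes:
    # one iteration: hash the work prefix together with the nonce
    return hashlib.sha256(prefix + nonce.to_bytes(4, "big")).digest()

def spot_check(prefix: bytes, stored: list[bytes], samples: int) -> bool:
    # recompute a random subset of the stored hashes; if a
    # fraction f of the list is forged, each check catches it
    # with probability f, so `samples` independent checks miss
    # with probability roughly (1 - f) ** samples
    for i in random.sample(range(len(stored)), samples):
        if compute_hash(prefix, i) != stored[i]:
            return False
    return True

stored = [compute_hash(b"work", i) for i in range(1000)]
print(spot_check(b"work", stored, samples=50))  # True
```

checking 50 of 1000 entries is cheap, yet a list with even a few percent of forged entries is likely to be caught.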
however, storing every generated hash is impractical for high-performance hashing systems due to the extreme data rates involved. for example, 250Mh/s means 250M hashes every second. since each sha256 hash is 32 bytes, that is a data rate of 8GB/s.
we can instead store only part of each generated hash. storing just the first byte of each hash reduces the data rate to 1/32nd of the original, 250MB/s, which is a far more tolerable rate for writing to disk.
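the arithmetic above, written out as a quick check:

```python
HASH_RATE = 250_000_000   # 250Mh/s
SHA256_BYTES = 32         # size of one sha256 digest

full_rate = HASH_RATE * SHA256_BYTES   # bytes/s if every full hash is stored
first_byte_rate = HASH_RATE * 1        # bytes/s if only the first byte is stored

print(full_rate // 10**9)        # 8   -> 8GB/s
print(first_byte_rate // 10**6)  # 250 -> 250MB/s
```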
hashing algorithms produce output that is indistinguishable from random, so every byte position carries the same amount of information. it therefore does not matter which byte is stored; always taking the first byte is enough for validation.
when we validate, for example, the first byte of 1000 hashes, we already have a 1000-byte proof of correctly generated hashes.
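to quantify the confidence this gives: a prover who guesses a stored byte instead of computing the hash matches the real first byte with probability 1/256, so checking m sampled bytes lets fully guessed entries survive with probability (1/256)^m:

```python
def p_fool(m: int) -> float:
    # probability that m independently checked guessed bytes
    # all happen to match the true first bytes
    return (1 / 256) ** m

print(p_fool(4))  # ~2.3e-10: negligible after only four checks
```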
therefore, it is possible to prove that some number of hashes was generated by storing only the first byte of each hash, together with whatever input is needed to regenerate that hash, for example the nonce of each iteration. if the nonce is a 16-bit integer, that adds 2 bytes per hash, increasing the data rate by a factor of 3, to 750MB/s.
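the full scheme can be sketched as follows (a minimal illustration, assuming 16-bit nonces and hypothetical names): the prover emits a 3-byte record per iteration, the nonce plus the first byte of the hash, and the verifier recomputes a random sample of records:

```python
import hashlib
import random

def digest(prefix: bytes, nonce: int) -> bytes:
    # one hash iteration over the work prefix and a 16-bit nonce
    return hashlib.sha256(prefix + nonce.to_bytes(2, "big")).digest()

def build_proof(prefix: bytes, nonces: list[int]) -> bytes:
    # 3 bytes per iteration: 16-bit nonce + first byte of the hash
    proof = bytearray()
    for nonce in nonces:
        proof += nonce.to_bytes(2, "big")
        proof.append(digest(prefix, nonce)[0])
    return bytes(proof)

def verify_sample(prefix: bytes, proof: bytes, samples: int) -> bool:
    # each record is 3 bytes; recompute a random subset and compare
    # the stored first byte (a forged byte matches only 1/256 of the time)
    n = len(proof) // 3
    for i in random.sample(range(n), samples):
        nonce = int.from_bytes(proof[3 * i:3 * i + 2], "big")
        if digest(prefix, nonce)[0] != proof[3 * i + 2]:
            return False
    return True

proof = build_proof(b"work", list(range(1000)))
print(verify_sample(b"work", proof, samples=100))  # True
```

note that if the nonces are simply sequential, a real implementation might store only the starting nonce and infer the rest, shrinking the proof back toward 1 byte per hash; the sketch stores them explicitly to match the rate calculation above.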