Correcting the error in getnetworkhashrateps

Interesting. This works with any “k’th lowest hash”, actually.

If H_{(k)} is the k’th lowest hash seen (the lowest one having k=1), normalized to [0 \ldots 1], then \frac{k-1}{H_{(k)}} is an estimate for the amount of work.

If we model this as n uniformly random samples being taken from [0 \ldots 1] (which is what each hash attempt is), then

H_{(k)} \sim \mathrm{Beta}(k, n-k+1)

For such a distribution

\mathrm{E}\left[\frac{k-1}{H_{(k)}}\right] = n \\ \mathrm{Var}\left[\frac{k-1}{H_{(k)}}\right] = \frac{n(n-k+1)}{k-2} \approx \frac{n^2}{k-2}

which means that using higher k gives a better estimate.
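A quick Monte Carlo sketch of this claim (the values of n, k, and the trial count below are arbitrary, chosen only to keep the simulation fast):

```python
import heapq
import random
import statistics

# Monte Carlo sanity check of the estimator (k-1)/H_(k), where H_(k) is
# the k'th smallest of n uniform samples on [0, 1]. Parameters are
# illustrative, not taken from any real chain.
random.seed(1)
n, k, trials = 5_000, 10, 1_000

estimates = []
for _ in range(trials):
    # k'th lowest "hash": the k'th smallest of n uniforms
    h_k = heapq.nsmallest(k, (random.random() for _ in range(n)))[-1]
    estimates.append((k - 1) / h_k)

mean = statistics.fmean(estimates)
var = statistics.variance(estimates)
print(f"mean: {mean:.0f}  (theory: {n})")
print(f"variance: {var:.3g}  (theory n(n-k+1)/(k-2): {n * (n - k + 1) / (k - 2):.3g})")
```

The empirical mean lands on n and the empirical variance on n(n-k+1)/(k-2), matching the Beta-distribution moments above.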

However, for any small constant k, this variance is still \mathcal{O}(n^2). Using the number of blocks gives a much better estimator. Let b be the number of blocks seen, and w the expected amount of work per block:

w = \frac{2^{256}-1}{\mathrm{target}}
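For a sense of scale, plugging Bitcoin's well-known difficulty-1 target (0xFFFF shifted left by 208 bits) into this formula gives roughly 2^{32} hashes per block:

```python
# Expected work per block at Bitcoin's difficulty-1 target
# (0x00000000FFFF0000...0000, i.e. 0xFFFF << 208).
target = 0xFFFF << 208
w = (2**256 - 1) // target
print(w)  # 4295032833, i.e. just over 2^32 hashes per block
```

At higher difficulty d the target shrinks by a factor of d, so w scales up proportionally.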

Then

b \sim \mathrm{Poisson}\left(\frac{n}{w}\right)

And we can estimate the amount of work as bw:

\mathrm{E}\left[bw\right] = n \\ \mathrm{Var}\left[bw\right] = nw \approx \frac{n^2}{b}

which, for fixed w, has only \mathcal{O}(n) variance.
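The same kind of Monte Carlo sketch works for the block-count estimator, drawing b from the Poisson model above (the parameters are again arbitrary illustrative values):

```python
import math
import random
import statistics

# Monte Carlo check of the block-count estimator b*w, with
# b ~ Poisson(n / w). All parameters are illustrative.
random.seed(1)
n, w, trials = 10_000, 500, 5_000
lam = n / w  # expected number of blocks

def poisson(lam):
    """Knuth's method: count uniforms until their product drops below e^-lam."""
    limit = math.exp(-lam)
    count, prod = 0, 1.0
    while True:
        prod *= random.random()
        if prod <= limit:
            return count
        count += 1

estimates = [poisson(lam) * w for _ in range(trials)]
mean = statistics.fmean(estimates)
var = statistics.variance(estimates)
print(f"mean: {mean:.0f}  (theory: {n})")
print(f"variance: {var:.3g}  (theory n*w: {n * w:.3g})")
```

The empirical variance comes out near nw, confirming the corrected figure rather than n.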

EDIT: I made a mistake in the variance of the bw estimator, it’s nw, not n. The two approaches do seem similar in terms of variance:

  • \frac{k-1}{H_{(k)}} has standard deviation roughly equal to \frac{n}{\sqrt{k-2}}.
  • bw has standard deviation roughly equal to \frac{n}{\sqrt{b}}.
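Putting those two standard deviations side by side for an equal sample count (n is an illustrative amount of total work; s plays the role of both k and b):

```python
import math

# Side-by-side standard deviations of the two estimators when the same
# number of samples s is available: k = s lowest hashes vs b = s blocks.
n = 1_000_000  # illustrative total work
for s in (3, 10, 100, 1000):
    sd_hash = n / math.sqrt(s - 2)   # k'th-lowest-hash estimator, k = s
    sd_blocks = n / math.sqrt(s)     # block-count estimator, b = s
    print(f"s={s:5d}  hash: {sd_hash:10.0f}  blocks: {sd_blocks:10.0f}")
```

For moderate s the two are nearly identical; the hash-based estimator is only noticeably worse for very small k.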