How is mining difficulty defined (formula)?



The result of sha256(s) that miners compute is a 256-bit number (a 64-digit hexadecimal number).

I understand that this 256-bit number must be lower than the target threshold, the target threshold itself being a 256-bit number with some number of leading zeros.

The target threshold is encoded with 4 bytes in the block's header, as 0xEEMMMMMM in hex notation, EE being the 1-byte exponent and MMMMMM the 3-byte mantissa.

How exactly are this exponent and mantissa translated into the actual 256-bit target threshold?


Posted 2017-11-04T10:50:30.753




The nBits field is essentially scientific notation in base 256 (256 = 2^8). As an example, take the value found in the Developer Reference: 0x181bc330 (big-endian order). This splits into two parts: the exponent 0x18 (24 in decimal) and the mantissa 0x1bc330. The mantissa is 3 bytes long, so subtract 3 from the exponent, raise 256 to that power, and multiply by the mantissa, just as in scientific notation:

0x1bc330 × 256 ^ (0x18 - 3)

This gives the target: the mantissa followed by 21 0x00 bytes, i.e. 42 zeroes in hexadecimal.
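The calculation above can be sketched in a few lines of Python. This is a minimal illustration, not consensus-grade code: it ignores the sign bit and the overflow checks that Bitcoin Core's full `SetCompact` logic performs, and the function name `nbits_to_target` is my own.

```python
# Decode the compact nBits field into the full 256-bit target.
# Example value 0x181bc330 is the one from the Developer Reference.

def nbits_to_target(nbits: int) -> int:
    exponent = nbits >> 24         # high byte: 0x18 = 24
    mantissa = nbits & 0x007FFFFF  # low 3 bytes (top mantissa bit is a sign flag, masked off here)
    return mantissa * 256 ** (exponent - 3)

target = nbits_to_target(0x181bc330)
print(f"{target:064x}")  # mantissa 0x1bc330 followed by 42 hex zeros, left-padded to 64 digits
```

Working in base 256 means the exponent simply counts bytes, which is why the result is the mantissa shifted left by whole bytes.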

See this answer for another example.

EDIT: You can also think of it as allocating the number of bytes given by the exponent, in this case 0x18 = 24 bytes, and then filling the first 3 bytes with the mantissa.
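The byte-allocation view in the edit can be sketched the same way. Again a hedged illustration with names of my own choosing, assuming the same example value:

```python
# "Byte allocation" view: reserve `exponent` bytes for the target,
# place the 3 mantissa bytes at the front, and zero-fill the rest.
exponent = 0x18                     # 24 bytes allocated
mantissa = bytes.fromhex("1bc330")  # 3-byte mantissa
raw = mantissa + b"\x00" * (exponent - len(mantissa))
target = int.from_bytes(raw, "big")
print(raw.hex())  # "1bc330" followed by 42 hex zeros
```

Both views produce the same number; multiplying by 256^(exponent − 3) and appending (exponent − 3) zero bytes are the same operation.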



I like the explanation in the edit. – croraf – 2017-11-07T09:05:31.837