Bit-length or bit width is the number of binary digits, called bits, necessary to represent an unsigned integer[1] as a binary number. Formally, the bit-length of a natural number $n$ is

$$\ell(n) = \lceil \log_2(n + 1) \rceil,$$

where $\log_2$ is the binary logarithm and $\lceil\,\cdot\,\rceil$ is the ceiling function.

Bit lengths (ε denotes the empty string)

decimal:     0   1   2   3    4    5    6    7     8     9    10
binary:      ε   1  10  11  100  101  110  111  1000  1001  1010
bit length:  0   1   2   2    3    3    3    3     4     4     4
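
The definition can be checked directly in code. The following is a minimal sketch in Python (the function name bit_length and the range of test values are chosen here for illustration); it counts the digits of the binary expansion and cross-checks the result against the ceiling formula and Python's built-in int.bit_length(), reproducing the table above.

```python
import math

def bit_length(n: int) -> int:
    """Bit-length of a natural number n, i.e. ceil(log2(n + 1))."""
    if n < 0:
        raise ValueError("bit-length is defined here for natural numbers only")
    if n == 0:
        return 0  # 0 is written as the empty string, so its bit length is 0
    # Counting the digits of the binary expansion avoids floating-point log2,
    # which can round incorrectly for very large n.
    return len(bin(n)) - 2  # strip the '0b' prefix

# Cross-check against the table above and Python's built-in int.bit_length().
for n in range(11):
    assert bit_length(n) == n.bit_length() == math.ceil(math.log2(n + 1))
    print(n, bin(n)[2:] if n else "ε", bit_length(n))
```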

At their most fundamental level, digital computers and telecommunications devices (as opposed to analog devices) process data encoded in binary. Binary encoding expresses data as a series, of arbitrary length, of values that each take one of two states: yes/no, 1/0, true/false, etc., all of which can be realized electronically as on/off. For information technology applications, the amount of information being processed is an important design consideration, and the term bit-length is technical shorthand for this measure.

For example, computer processors are often designed to process data grouped into words of a given length of bits (8-bit, 16-bit, 32-bit, 64-bit, etc.). The bit-length of each word determines, among other things, how many memory locations the processor can address independently. In cryptography, the key size of an algorithm is the bit-length of the keys used by that algorithm, and it is an important factor in the algorithm's strength.
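
The underlying count in both cases is the same: a value of $n$ bits can take $2^n$ distinct values. The short sketch below is only an illustration of that arithmetic (the word sizes and the 128-bit key size are illustrative, not tied to any particular processor or cipher).

```python
# An n-bit word or key can take 2**n distinct values, so an n-bit address
# distinguishes 2**n memory locations and an n-bit key space holds 2**n keys.
for word_bits in (8, 16, 32, 64):
    print(f"{word_bits:>2}-bit word: {2**word_bits:,} distinct values")

key_bits = 128  # illustrative key size, not tied to any particular algorithm
print(f"{key_bits}-bit key: 2**{key_bits} = {2**key_bits:,} possible keys")
```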

References

  1. ^ "Wolfram Mathematica 8 Documentation". Retrieved 10 Jan 2012.