That depends on the character code used:
It takes 8 bits to form 1 byte, which equals 1 character (256 variants are possible).
True, but some languages (eg Japanese) which use Kanji characters require more than one byte.
=============
It depends on the encoding you're using. With the ASCII encoding (which is, more or less, the "standard" in most English-speaking countries, such as the United States), each character requires 1 byte.
However, another character set, known as "Unicode", has become increasingly prominent, even in English-speaking countries. In the common UTF-16 encoding, a Unicode character requires 2 bytes (4 bytes for characters outside the Basic Multilingual Plane). Some programming languages are even beginning to make Unicode the default encoding, or include features that enable Unicode throughout a single project. Currently, the "System.String" class within the .NET Framework is made up purely of UTF-16 characters, with no capacity to change this without implementing some kind of a 'hack'.
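As a quick sanity check of those byte counts, here is a short Python sketch (the specific encodings shown are illustrative; '€' is just an arbitrary example of a character that doesn't fit in one byte):

```python
# Byte counts for a single character under different encodings.
print(len("A".encode("ascii")))      # ASCII: 1 byte per character
print(len("A".encode("utf-16-le")))  # UTF-16 (as .NET strings use): 2 bytes
print(len("€".encode("utf-8")))      # UTF-8: some characters take 3 (or 4) bytes
```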
If you mean "64 different characters", you need log2(64) = 6 different bits.
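The same formula generalizes: to distinguish n different characters you need ceil(log2(n)) bits. A minimal Python sketch (the helper name bits_needed is my own):

```python
import math

def bits_needed(n_values: int) -> int:
    """Minimum number of bits that can distinguish n_values different characters."""
    return math.ceil(math.log2(n_values))

print(bits_needed(64))   # 6
print(bits_needed(45))   # 6 (2**5 = 32 is too few; 2**6 = 64 suffices)
print(bits_needed(200))  # 8
```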
45 in binary is 101101, so you need at least 6 bits to represent 45 different characters (2^5 = 32 is too few; 2^6 = 64 suffices).
8 bits if unsigned, 9 bits if signed
A byte is a collection of eight bits that can represent a single character.
4
how many bits are needed to represent decimal values ranging from 0 to 12,500?
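For the 0-to-12,500 question above: 2^13 = 8192 is too small, while 2^14 - 1 = 16383 covers the whole range, so 14 bits are needed. A one-line Python check:

```python
# bit_length() gives the number of bits needed for an unsigned integer.
print((12500).bit_length())  # 14, since 2**13 = 8192 < 12500 <= 2**14 - 1
```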
"recommended setting" There are 19 characters including the space between the two words. If the old convention of using 1 byte to represent a character, then we would need (19 x 8) which is 152 bits. If we use unicode as most modern computers use (to accommodate all the languages in the world) then 2 bytes will represent each character and so the number of bits would be 304.
A nibble (also known as a nybble or nyble) can represent half a character (two nibbles are needed for a valid ASCII character). A nibble is made up of 4 bits, and those 4 bits are usually represented by a single hexadecimal digit. 4 bits only allow for 16 combinations; 8 bits allow for 256. An ASCII character is represented by two hexadecimal digits, which is the same as 8 bits or two nibbles.
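A short Python illustration of the byte/nibble/hex relationship (the letter 'A' is an arbitrary example character):

```python
code = ord("A")                    # 65, the ASCII code for 'A'
print(format(code, "08b"))         # '01000001': one byte, i.e. two nibbles
print(format(code, "02x"))         # '41': one hex digit per nibble
high, low = code >> 4, code & 0xF  # split the byte into its two nibbles
print(high, low)                   # 4 1
```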
5
log2 200 = ln 200 ÷ ln 2 ≈ 7.6, so you need 8 bits. If a signed number is being stored, then 9 bits would be needed, as one bit is needed to indicate the sign of the number.
1200
23 can be represented in binary as 10111 and would therefore require 5 bits to represent.