Verified answer

That depends on the character code used (the totals below assume a 64-character text):

  • Baudot - 5 bits per character - 320 bits
  • FIELDATA - 6 bits per character - 384 bits
  • BCDIC - 6 bits per character - 384 bits
  • ASCII - 7 bits per character - 448 bits
  • extended ASCII - 8 bits per character - 512 bits
  • EBCDIC - 8 bits per character - 512 bits
  • Univac 1100 ASCII - 9 bits per character - 576 bits
  • Unicode UTF-8 - variable (8 to 32) bits per character - depends on the characters in the text
  • Unicode UTF-32 - 32 bits per character - 2048 bits
  • Huffman coding - variable bits per character - depends on the characters in the text
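The fixed-width totals above all correspond to a 64-character text (for example, 320 bits ÷ 5 bits per Baudot character = 64). A short Python sketch can reproduce them; the bits-per-character figures are taken from the list, and the 64-character length is inferred from it:

```python
# Bits needed to store a 64-character text under the fixed-width
# character codes listed above (bits-per-character values from the answer).
TEXT_LENGTH = 64  # inferred: 320 bits / 5 bits per Baudot character

bits_per_char = {
    "Baudot": 5,
    "FIELDATA": 6,
    "BCDIC": 6,
    "ASCII": 7,
    "extended ASCII": 8,
    "EBCDIC": 8,
    "Univac 1100 ASCII": 9,
    "Unicode UTF-32": 32,
}

for code, bits in bits_per_char.items():
    print(f"{code}: {bits * TEXT_LENGTH} bits")
```

UTF-8 and Huffman coding are omitted from the sketch because their totals depend on the actual characters in the text.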
Wiki User
8y ago

Jason Rohrer (Lvl 1, 1y ago):
Well done 👍
More answers

Wiki User

12y ago

It takes 8 bits to form 1 byte, which equals 1 character (256 variants are possible).

True, but some languages (e.g. Japanese), which use Kanji characters, require more than one byte per character.

=============

It depends on the encoding you're using. With the ASCII encoding (which is, more or less, "standard" in most English-speaking countries, such as the United States), each character requires 1 byte.

However, Unicode has become increasingly prominent, even in English-speaking countries. Strictly speaking, Unicode is a character set rather than an encoding; the common "2 bytes per character" figure refers to UTF-16, which stores each character in the Basic Multilingual Plane as 2 bytes. Some programming languages now make Unicode the default encoding, or include features that enable Unicode throughout a project. The "System.String" class in the .NET Framework, for example, is made up purely of UTF-16 characters, with no way to change this without implementing some kind of a 'hack'.
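The "2 bytes per character" convention can be checked directly in Python; the sample string here is arbitrary, and UTF-16 stands in for the two-byte Unicode scheme the answer describes:

```python
# Byte counts for the same text under different encodings.
text = "Hello"

print(len(text.encode("ascii")))      # 1 byte per character -> 5 bytes
print(len(text.encode("utf-8")))      # also 1 byte each for ASCII text -> 5 bytes
print(len(text.encode("utf-16-le")))  # 2 bytes per BMP character -> 10 bytes
print(len(text.encode("utf-32-le")))  # 4 bytes per character -> 20 bytes
```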


Wiki User

6y ago

If you mean 64 different characters, you need log2(64) = 6 bits, since 2^6 = 64.
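The same calculation works for any alphabet size. A small sketch (the helper name is mine) that avoids floating-point log2:

```python
def bits_needed(n: int) -> int:
    """Smallest number of bits that can distinguish n different characters."""
    # (n - 1).bit_length() equals ceil(log2(n)) for n >= 2,
    # with no floating-point rounding issues.
    return (n - 1).bit_length() if n > 1 else 1

print(bits_needed(64))  # 6
```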

Related questions

How many bits would be needed to represent a 45 character set?

You need at least 6 bits: 2^5 = 32 is too few for 45 different characters, while 2^6 = 64 is enough.


A set of bits that represent a single character?

A byte.


How many bits are needed to represent decimal 200?

8 bits if unsigned, 9 bits if signed


What is a byte?

A byte is a collection of eight bits that can represent a single character.


How many bits are required to represent character in ascii?

7 bits for standard ASCII (2^7 = 128 characters); extended ASCII uses 8 bits.


How many bits are needed to represent decimal value ranging from 0 to 12500?

14 bits: there are 12,501 values from 0 to 12,500, and 2^13 = 8192 < 12,501 ≤ 16,384 = 2^14.


How many bits are required to represent the phrase recommended setting?

"recommended setting" There are 19 characters including the space between the two words. If the old convention of using 1 byte to represent a character, then we would need (19 x 8) which is 152 bits. If we use unicode as most modern computers use (to accommodate all the languages in the world) then 2 bytes will represent each character and so the number of bits would be 304.


How many characters are in a nibble?

A nibble (also known as a nybble or nyble) can represent half a character: two nibbles are needed for one ASCII character. A nibble is made up of 4 bits, and those 4 bits are usually represented by a single hexadecimal digit. 4 bits allow for only 16 combinations; 8 bits allow for 256. An ASCII character is represented by two hexadecimal digits, which is the same as 8 bits or two nibbles.
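The nibble/hex-digit correspondence is easy to see in code; 'A' here is just an arbitrary ASCII character:

```python
# An 8-bit ASCII character splits into two 4-bit nibbles,
# each of which prints as one hexadecimal digit.
code = ord("A")                    # 65
high, low = code >> 4, code & 0xF  # the two nibbles: 4 and 1
print(f"{code:02x}")               # one hex digit per nibble
assert f"{code:02x}" == f"{high:x}{low:x}"
```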


How many binary bits are needed to represent decimal number 21?

5 (21 = 10101 in binary).


How many bits are needed to represent the decimal number 200?

log2 200 = ln 200 ÷ ln 2 = 7.6... → need 8 bits. If a signed number is being stored, then 9 bits would be needed as one would be needed to indicate the sign of the number.


How many bits are needed to represent decimal values ranging from 0 to 12,500?

14 bits, since 2^14 = 16,384 is the smallest power of two that covers all 12,501 values.


How many bits to represent twenty-six?

26 can be represented in binary as 11010 and would therefore require 5 bits to represent.