A byte is the unit computers use to measure how much data a document occupies. Eight bits make one byte, and in a plain ASCII encoding one letter of the alphabet takes one byte. So if you have a 500-word paper, counting all of its characters (including spaces and punctuation) gives you roughly its size in bytes.
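The character-counting idea above can be sketched in a few lines of Python. This assumes the text is plain English stored in an ASCII-compatible encoding such as UTF-8, where each Latin letter is exactly one byte:

```python
# In an ASCII/UTF-8 encoding, each plain Latin character occupies one byte,
# so the byte size of a short English text equals its character count.
text = "Hello world"
data = text.encode("utf-8")  # convert the string to raw bytes
print(len(text), len(data))  # 11 characters -> 11 bytes
```

For accented letters or non-Latin scripts the two counts diverge, as a later answer on this page explains.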
I think that you are referring to the technical terms of character, byte, and word rather than the common usage. These are all terms used in the computer field and refer to different sizes of data, all of which are based on the bit.
To explain, the bit is the most basic unit of data inside a computer. "Bit" is actually a contraction of the phrase "binary digit". It is a single value of either 1 or 0. Moving up the scale, next is the nibble. This is composed of four bits and, therefore, can have a value anywhere between 0000 and 1111 (0 and 15 in decimal notation). Beyond this is the byte, which is eight bits and can have a value anywhere between 00000000 and 11111111 (0 and 255 in decimal). In some programming languages, a byte is also referred to as a character. For example, in Pascal, one can refer to a 15-byte string as an array of 15 values of type Char (short for character). Generally, this refers to printable characters such as letters or numerals, but technically it can be any ASCII value. At the top end of what you mentioned is the word. This is a series of sixteen bits (the equivalent of two bytes) and can have a value between 0000000000000000 and 1111111111111111 (0 and 65535 in decimal).
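The unsigned ranges quoted above all follow from the same rule: n bits can hold values from 0 up to 2^n - 1. A quick Python sketch of that arithmetic:

```python
# Each unsigned range is 0 .. 2**bits - 1, computed from the bit width.
for name, bits in [("nibble", 4), ("byte", 8), ("word", 16)]:
    low, high = 0, 2 ** bits - 1
    print(f"{name}: {bits} bits, range {low}..{high}")
# nibble: 4 bits, range 0..15
# byte: 8 bits, range 0..255
# word: 16 bits, range 0..65535
```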
A modern computer can use any of these in the appropriate context. Once again I will use Pascal as an example. In Pascal, the Integer type is a signed 16-bit value with a range of -32768 to 32767, while the unsigned Word type covers 0 to 65535. Either one is stored in memory as a word, since sixteen bits are enough to represent its full range of values. By comparison, the signed ShortInt type ranges from -128 to 127 and the unsigned Byte type covers 0 to 255; both are stored in memory as a single byte, since all of their values fit in eight bits.
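The same fixed-width behaviour can be demonstrated outside Pascal. As a sketch, Python's ctypes module provides types with the same sizes: c_int16 matches the 16-bit signed Integer, c_uint16 the unsigned Word, c_int8 the ShortInt, and c_uint8 the Byte. Values past the range limit simply wrap around:

```python
import ctypes

# Fixed-width types matching the Pascal ranges described above.
assert ctypes.sizeof(ctypes.c_int16) == 2  # stored as a 16-bit word
assert ctypes.sizeof(ctypes.c_int8) == 1   # stored as a single byte

# Exceeding the range wraps around, exposing the bit-width limits:
print(ctypes.c_int16(32768).value)  # wraps to -32768
print(ctypes.c_uint8(256).value)    # wraps to 0
```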
I hope this helps. Let me know if you have any other questions.
"Another name for a BYTE is WORD." Sorry, but that is not correct. Computers do have a "word length", which is described as a number of either bits or bytes. Most modern computers have a 16-bit or 32-bit word length, and some a 64-bit one. This "word" has nothing to do with word processing or a typed document; it relates to the computer's internal architecture. Some people use byte and character interchangeably, but this again is not quite technically accurate. A byte is always 8 bits by definition, whereas a character (e.g. the letter "A") may vary in the number of bits required to store it. In most modern systems each character requires 8 bits, which just happens to be one byte, but some older systems used 5-, 6- or 7-bit characters. A more precise technical term for an 8-bit unit is "octet".
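As a hedged illustration of the word-length idea, the size of a native pointer is a common (though not universal) proxy for a machine's word length. In Python this can be checked with ctypes:

```python
import ctypes

# The native pointer size is a common proxy for the machine's word length
# (8 bytes = 64 bits on most modern machines, 4 bytes = 32 bits on older ones).
pointer_bytes = ctypes.sizeof(ctypes.c_void_p)
print(f"native word length: {pointer_bytes * 8} bits")
```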
A billionth of a byte. However, since a byte is the number of binary digits used to encode a single character, I do not see how it is possible to have a billionth of a byte - unless you have developed something way more advanced than quantum computers.
byte
No, Mitchell.
A Byte is a collection of eight bits that can represent a single character.
A byte order mark is a character indicating the endianness (byte order) of a string of text.
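A short sketch of how the byte order mark works in practice: the character U+FEFF serializes to different byte sequences depending on endianness, and that difference is what lets a reader detect the byte order. Python's codecs module exposes both patterns as constants:

```python
import codecs

# U+FEFF serializes differently under each byte order:
assert codecs.BOM_UTF16_LE == b"\xff\xfe"  # little-endian marker
assert codecs.BOM_UTF16_BE == b"\xfe\xff"  # big-endian marker

# The generic utf-16 codec prepends a BOM automatically; which one
# appears depends on the platform's native byte order.
data = "A".encode("utf-16")
print(data[:2] in (codecs.BOM_UTF16_LE, codecs.BOM_UTF16_BE))  # True
```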
In byte stuffing (or character stuffing), a special byte is added to the data section of the frame whenever a data byte has the same pattern as the flag. The data section is stuffed with an extra byte. This byte is usually called the escape character (ESC), which has a predefined bit pattern. Whenever the receiver encounters the ESC character, it removes it from the data section and treats the next character as data, not as a delimiting flag. An ESC byte that occurs in the data itself must likewise be preceded by an extra ESC, so the receiver never mistakes it for a real escape.
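The scheme above can be sketched in a few lines of Python. The FLAG and ESC values here are illustrative (borrowed from HDLC-style framing), not mandated by the technique itself:

```python
FLAG = 0x7E  # hypothetical flag byte delimiting a frame
ESC = 0x7D   # hypothetical escape byte

def stuff(data: bytes) -> bytes:
    """Insert ESC before any data byte that matches FLAG or ESC."""
    out = bytearray()
    for b in data:
        if b in (FLAG, ESC):
            out.append(ESC)
        out.append(b)
    return bytes(out)

def unstuff(stuffed: bytes) -> bytes:
    """Remove ESC bytes; the byte after each ESC is taken literally."""
    out = bytearray()
    it = iter(stuffed)
    for b in it:
        if b == ESC:
            b = next(it)  # treat the next byte as data, not a flag
        out.append(b)
    return bytes(out)

payload = bytes([0x01, FLAG, 0x02, ESC, 0x03])
assert unstuff(stuff(payload)) == payload  # round-trips cleanly
```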
No.
A bit is 1/8th of a byte.
Giga- and mega- are prefixes for larger units: a megabyte and a gigabyte are both multiples of the byte. (The spelling is "byte", not "bite".)
The number of bytes used by a character varies from language to language. Java uses a 16-bit (two-byte) character so that it can represent many non-Latin characters in the Unicode character set.
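This variation is easy to demonstrate. As a sketch in Python (rather than Java), the same idea shows up when encoding one character under different encodings:

```python
# The same character can need a different number of bytes per encoding.
assert len("A".encode("ascii")) == 1      # 7-bit ASCII fits in one byte
assert len("A".encode("utf-16-be")) == 2  # Java-style 16-bit character
assert len("é".encode("utf-8")) == 2      # multi-byte even in UTF-8
print("ok")
```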
1 byte, and 1 byte = 8 bits.