In computer terms a nibble = 4 bits = 1/2 byte.
You can further define the data segment as:
Crumb = 2 bits
Nibble = 4 bits
Byte = 8 bits
Word = 16 bits
Double Word = 32 bits
The jury is still out on 64 bits and "Sentence".
In keeping with the spelling of "byte", "lick" and "nibble" are sometimes spelled "lyck" and "nybble".
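The unit sizes in the list above are easy to see in action with bit masks. A minimal Python sketch (the variable names are just for illustration):

```python
# Pulling each unit out of a 32-bit "double word" with bit masks.
value = 0xDEADBEEF                # a 32-bit double word

low_word   = value & 0xFFFF       # 16 bits
low_byte   = value & 0xFF         # 8 bits
low_nibble = value & 0xF          # 4 bits
low_crumb  = value & 0x3          # 2 bits

print(hex(low_word), hex(low_byte), hex(low_nibble), low_crumb)
# 0xbeef 0xef 0xf 3
```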
-------------------
A nibble is half a byte, but believe it or not, a byte does not necessarily have to have eight bits. I don't know of any computer platform that uses anything but 8-bit bytes these days, but in computer science terms, a byte is generally the smallest addressable element, and the size needed to store a character. You also might be surprised to know that not all machines use ASCII (a 7-bit code usually stored in 8-bit bytes) for characters. IBM mainframes still use EBCDIC under their traditional operating systems; it stands for Extended Binary Coded Decimal Interchange Code, and it accounted for the lion's share of data until a few decades ago. It's an extended version of BCD, which uses 4 bits to express each decimal digit, and there's no technical reason that a BCD-based machine couldn't have 4-bit bytes.
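To illustrate the BCD idea mentioned above, here is a rough Python sketch that packs each decimal digit of a number into its own 4-bit group (the `to_bcd` helper is invented for this example, not part of any library):

```python
def to_bcd(n):
    """Pack the decimal digits of a non-negative integer into
    4-bit groups, one digit per group (plain BCD)."""
    result, shift = 0, 0
    while True:
        result |= (n % 10) << shift   # place the lowest digit in the next nibble
        n //= 10
        shift += 4
        if n == 0:
            return result

print(hex(to_bcd(1234)))   # 0x1234 -- each hex digit is one decimal digit
```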
It's unlikely that you will ever encounter a computer that doesn't use eight bit bytes, but you may encounter people who studied computer science back in the 1970s.
Back in the "old" days (the 1960s), when computers didn't have operating systems or high-level programming languages, you always dealt with the byte. On some machines the byte was 8 bits, and on others it was 8 bits plus a parity bit, for 9 bits. There was "even" parity and "odd" parity: with even parity you set the parity bit so that the total number of 1 bits (data plus parity) came out even, and with odd parity you set it so the total came out odd.

The "word" was originally set to be the size of a register (everything was done through a set of registers). The registers were used to assemble the current instruction that you wanted the computer to execute: what kind of action (move a byte, add, subtract, etc.), plus the length of your data, which determined how many cycles the computer had to go through to execute your instruction, plus where the data was coming from and going to. The "word" length was pegged to the length of the register, meaning that in treating the computer like a book, each register was a word. Since the first computers were totally byte oriented, a word was 8 bits. When 16-bit registers were implemented, words became 16 bits, then 32 bits, and now 64 bits. Some computers today even have 128-bit words. So a "word" is the length of the registers in whatever computer you are using. It is also the biggest chunk of bits that the computer can process at one time.
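The parity scheme described above can be sketched in a few lines of Python (`parity_bit` is a made-up helper for illustration):

```python
def parity_bit(data, even=True):
    """Compute the parity bit for an 8-bit value.

    Even parity: choose the bit so the total count of 1 bits
    (data plus parity) is even; odd parity makes the total odd.
    """
    ones = bin(data & 0xFF).count("1")
    return ones % 2 if even else 1 - (ones % 2)

print(parity_bit(0b10110000))              # 1 -- three 1 bits, so add a 1 to make four
print(parity_bit(0b10110000, even=False))  # 0 -- three is already odd
```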
The word "nibble" was invented to specify the high-order 4 bits in a byte or the low-order 4 bits in a byte (like eating a nibble from a cookie, instead of the whole cookie). Since a decimal digit can be specified in 4 bits, you only needed a "nibble" to store a digit. So, if you had a field that was all numbers, you could write it out in "nibbles", using half the space you would have used if it was in bytes. Back in those days, space counted. The first "mainframe" computers had 4K of memory (no, that really is 4K), so you didn't have any space to waste if you were doing something like payroll or inventory management.
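The high/low-nibble idea, and the two-digits-per-byte space saving described above, can be sketched like this (both helpers are invented for the example):

```python
def split_nibbles(byte):
    """Return the high-order and low-order nibbles of a byte."""
    return (byte >> 4) & 0xF, byte & 0xF

def pack_pair(d1, d2):
    """Store two decimal digits (0-9) in one byte, one per nibble."""
    return (d1 << 4) | d2

b = pack_pair(4, 2)
print(hex(b), split_nibbles(b))   # 0x42 (4, 2)
```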
In some cases, individual bits within bytes are used to store flags (yes or no for a given attribute) and, in at least one IBM manual, these were referred to as tidbits. IBM was not known for a sense of humor, but the term never became a generally accepted abbreviation.
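Bit flags of the kind described here are still common. A small Python sketch (the flag names are made up for illustration):

```python
# Storing several yes/no attributes as single bits within one byte.
ACTIVE   = 0b0000_0001   # hypothetical flag names, one bit each
ADMIN    = 0b0000_0010
VERIFIED = 0b0000_0100

flags = 0
flags |= ACTIVE | VERIFIED        # set two flags
flags &= ~ADMIN & 0xFF            # clear a flag (a no-op here, it was never set)

print(bool(flags & ACTIVE), bool(flags & ADMIN))   # True False
```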
SCSI is an acronym for Small Computer System Interface.
A computer is nothing but an electronic device capable of doing all types of calculations.
For a cheese burger!
Flynn's classification is in terms of instruction and data streams, but Feng's classification is in terms of bit parallelism and word parallelism.
Baby lambs drink milk from their mothers and nibble on grass.
A nibble can refer to malware or trojans, "nibble" referring to a virus. But you should rephrase your question; it doesn't make sense to me.
It's spelled "nibble." Here is the definition as it is in terms of amount of information that a computer can send. You can have 1 bit (which is a 1 or a 0), 4 bits make a nibble, 8 bits make a byte, 1024 bytes make a kilobyte and so on.
Network, Nibble, NAND (gate), Notebook (computer)
A nibble is half of an octet (commonly known as a byte). A nibble, therefore, is a set of four binary digits. The numeric value of a nibble is commonly presented in binary form, or in the form of a single hexadecimal digit.
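The one-hex-digit-per-nibble correspondence is easy to demonstrate in Python:

```python
value = 0b1011_0101                            # one byte = two nibbles
print(hex(value))                              # 0xb5: 'b' is the high nibble, '5' the low
print(f"{value >> 4:04b} {value & 0xF:04b}")   # 1011 0101
```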
Half a byte is 4 bits, which is actually known in the computer world as a nibble.
I gently began to nibble on the suspicious-looking cake. The rabbit had a nibble on the carrot. Go on, try it. Have a nibble.
What do you mean by "the bullet" in computer terms?
In electronics, a nibble is equal to 4 bits: nibble = 4 bits.
Computer-geek-speak-wise, I would say that if 8 bits are one byte, and half a byte is a nibble (4 bits), then half a nibble would be equal to two bits, or similar to two peas in a pod. So the technical answer would be a "pod" (place of double): a two-bit grouping.
Noun: one nibble, two nibbles Verb: nibble, nibbles, nibbling, nibbled
When I look at you, all I want to do is nibble on your ear. The mouse found a piece of cheese to nibble on.
A nibble is a half of a byte, 4 bits.