First, it is necessary to understand what a bit and a byte are.
In computer storage, a bit is the smallest unit of data: it can hold one of two states, such as 0/1, on/off, or yes/no.
A byte is eight bits.
So a gigabit, using the decimal prefix, is 1,000,000,000 bits (one billion).
A byte being eight bits then makes a gigabyte 1,000,000,000 bytes, or 8,000,000,000 bits.
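These decimal definitions can be checked with a few lines of Python (a quick sketch of the arithmetic above; the variable names are my own):

```python
BITS_PER_BYTE = 8

# One gigabit, using the decimal (SI) prefix: giga = 10**9
gigabit_in_bits = 10**9

# One gigabyte is 10**9 bytes, i.e. 8 * 10**9 bits
gigabyte_in_bits = 10**9 * BITS_PER_BYTE

print(gigabit_in_bits)   # 1000000000
print(gigabyte_in_bits)  # 8000000000
```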
Byte (B) and gigabyte (GB) are different units of measure for binary data storage.
There is a direct conversion between the two. Working backwards from GB:
1 GB = 1024 MB (megabytes)
1 MB = 1024 KB (kilobytes)
1 KB = 1024 B (bytes)
therefore 1 GB = 1024 * 1024 * 1024 B,
which yields 1 GB = 1,073,741,824 bytes.
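The 1024-based chain can be verified directly (a sketch; strictly speaking, these 1024-based units are the binary prefixes KiB/MiB/GiB):

```python
KB = 1024        # bytes per kilobyte (binary convention)
MB = 1024 * KB   # bytes per megabyte
GB = 1024 * MB   # bytes per gigabyte

print(GB)  # 1073741824
```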
A byte is 8 bits, so 1 gigabit (Gb) = 0.125 gigabyte (GB).
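Since a byte is 8 bits, converting gigabits to gigabytes is just a division by 8. A minimal sketch (the helper name is hypothetical):

```python
def gigabits_to_gigabytes(gigabits: float) -> float:
    """Convert gigabits (Gb) to gigabytes (GB): 8 bits per byte."""
    return gigabits / 8

print(gigabits_to_gigabytes(1))  # 0.125
print(gigabits_to_gigabytes(4))  # 0.5
```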
One terabyte is equivalent to 1,000 gigabytes. One gigabyte is equivalent to 1,000 megabytes (and one gigabit to 1,000 megabits). One megabyte is equivalent to 1,000 kilobytes. One kilobyte is equivalent to 1,000 bytes.
32 megabits = 0.032 gigabits.
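Using decimal prefixes, the megabit-to-gigabit conversion is a division by 1,000 (a sketch of the line above):

```python
MEGABITS_PER_GIGABIT = 1000  # decimal (SI) prefixes

megabits = 32
gigabits = megabits / MEGABITS_PER_GIGABIT
print(gigabits)  # 0.032
```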
The answer is 0, because a gigabit (Gb) is 1,000,000,000 bits.
There is no difference; they are the same.
Gigabit Ethernet is the term used to describe the transmission of Ethernet frames at a rate of one gigabit per second. It came into use in the late 1990s.
On an iPod, gigabyte is abbreviated to GB in the usage section under General settings.
If the phone reads a 4 gigabyte SD card as only 1 gigabyte of memory, it is likely because the card needs to be reformatted.
Is there a difference between the Xbox 360 20 GB and the Xbox 360 60 GB besides the 40 GB of storage?
16 GB
512,000 megabytes (MB) is equal to 500 gigabytes (GB).
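This answer uses the 1024 MB-per-GB (binary) convention, which can be checked directly (a sketch):

```python
MB_PER_GB = 1024  # megabytes per gigabyte, binary convention

megabytes = 512000
gigabytes = megabytes / MB_PER_GB
print(gigabytes)  # 500.0
```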