A gigabyte is approximately equal to a billion bytes of data.
Gigabyte (GB) is the current standard unit for most removable storage media, including DVDs, flash memory, and many forms of external storage devices. It is also commonly used to measure a computer’s RAM. Smaller hard drives and solid-state drives (SSDs) are also measured in gigabytes, though many newer drives have capacities exceeding a terabyte (1,000 gigabytes).
A gigabyte is 1,000 times larger than a megabyte and 1,000,000 times larger than a kilobyte.
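These decimal conversions are easy to verify with a quick calculation. The sketch below (Python, purely illustrative) just encodes the factors of 1,000 described above:

```python
# Decimal (SI) units: each prefix step is a factor of 1,000.
KILOBYTE = 1_000            # 10^3 bytes
MEGABYTE = 1_000_000        # 10^6 bytes
GIGABYTE = 1_000_000_000    # 10^9 bytes

print(GIGABYTE // MEGABYTE)  # 1,000 megabytes per gigabyte
print(GIGABYTE // KILOBYTE)  # 1,000,000 kilobytes per gigabyte
```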
Frequently Asked Questions
How does a gigabyte compare to a gigabit?
A byte is a collection of 8 bits, meaning a gigabyte is 8 times larger than a gigabit. If your networking device or Internet connection claims it has a 1 Gbps bandwidth, that means it can transfer 1 gigabit of data every second. If you are trying to transfer a 1 gigabyte file, it will take 8 seconds at that speed. Gigabyte is traditionally used when referring to storage or file size, while gigabit is used when referring to data transfer speeds. Technically, gigabytes should always be abbreviated as GB, while gigabits should be Gb, but this convention is not always upheld. If you’re not sure, refer to the usage instead.
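The 8-seconds figure above follows from multiplying the file size in gigabytes by 8 to get gigabits, then dividing by the link speed. A minimal sketch of that arithmetic, assuming an idealized link with no protocol overhead (the function name is illustrative, not a real API):

```python
BITS_PER_BYTE = 8

def transfer_seconds(file_gigabytes: float, link_gbps: float) -> float:
    """Seconds to move a file over a link, ignoring overhead and congestion."""
    file_gigabits = file_gigabytes * BITS_PER_BYTE
    return file_gigabits / link_gbps

print(transfer_seconds(1, 1))  # 8.0 seconds for a 1 GB file at 1 Gbps
```

Real-world transfers are slower than this ideal figure because of protocol overhead and network conditions.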
How much can a gigabyte store?
Depending on the quality of the media you are using, a gigabyte of storage can hold anywhere from several hours of video to just a few minutes. At standard TV quality, it can store approximately an hour of video. However, at HDTV quality, it will only store approximately 7 minutes. This is why a DVD can store an entire movie on either a 4.7 GB disc or an 8.5 GB dual-layered disc, while a Blu-ray needs 25 – 50 GB to store the same film. Likewise, you can store slightly more than 100 minutes of CD-quality audio using 1 GB of storage, but when compressed to MP3 format, you can store hundreds of minutes in the same amount of space. The average book takes up approximately 1 megabyte of storage, so a gigabyte of storage can hold approximately 1,000 books.
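The book estimate above is simple division, and the same back-of-the-envelope approach works for compressed audio. A quick sketch, assuming the text’s 1 MB-per-book figure and a hypothetical 128 kbps MP3 bitrate (both are rough assumptions, not measured values):

```python
GB = 1_000_000_000  # bytes in a decimal gigabyte

# Average book ≈ 1 MB (approximation from the text above).
print(GB // 1_000_000)  # about 1,000 books per gigabyte

# MP3 at an assumed 128 kbps: 128,000 bits/s = 16,000 bytes/s.
mp3_bytes_per_minute = 128_000 // 8 * 60
print(GB // mp3_bytes_per_minute)  # about 1,041 minutes of compressed audio
```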
Isn’t saying a gigabyte is one thousand megabytes an approximation?
This depends on how it is used. A gigabyte is equal to 1,000 megabytes, or 1 billion bytes. However, if you’re referring to computer memory, capacity is usually increased in powers of 2, because computers are based on a binary system. So rather than being 1,000 times larger than a megabyte, a binary gigabyte is 2^10, or 1,024, times larger. Technically this binary unit is called a “gibibyte” (GiB), but most people just say gigabyte. For most of us, the difference is insignificant, and a Gig is just a Gig.
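The gap between the two definitions is small but measurable, which is why a drive advertised as “1 TB” appears smaller when an operating system reports it in binary units. A quick comparison of the two counts:

```python
# Decimal gigabyte (GB) vs binary gibibyte (GiB).
GB = 10**9    # 1,000,000,000 bytes
GiB = 2**30   # 1,073,741,824 bytes

print(GiB - GB)  # 73,741,824 bytes difference
print(GiB / GB)  # ~1.074: a gibibyte is about 7.4% larger
```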