A megabyte is equal to one million bytes or eight million bits. Megabytes are used to describe things like the size of a file, the capacity of a computer hard drive, and the amount of random access memory installed in a computer.
Digital information is measured first in bits, and then in bytes, with each byte containing 8 bits of information. Rarely are digital files small enough to be meaningfully measured in bits or bytes. Instead, they are typically measured in kilobytes, megabytes, and gigabytes, which are equivalent to one thousand, one million, and one billion bytes respectively.
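These conversions are simple division. The sketch below is an illustrative helper (the function name is our own, not from any standard library) that converts a raw byte count into the decimal units described above:

```python
# Illustrative helper using the decimal (SI) definitions:
# 1 kilobyte = 1,000 bytes; 1 megabyte = 1,000,000 bytes.
def to_megabytes(num_bytes: int) -> float:
    """Convert a byte count to megabytes (decimal definition)."""
    return num_bytes / 1_000_000

print(to_megabytes(3_500_000))  # a typical MP3 song: 3.5
```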
Megabytes are commonly used to describe the size of files. A few examples of typical file sizes are:
- Word Processing Document: 10 to 100 kilobytes or 0.01 to 0.1 megabytes.
- Low Resolution Image (e.g. 300 x 200 pixels): approximately 20 kilobytes or 0.02 megabytes.
- High Resolution Image (e.g. 8000 x 6000 pixels): 5 to 10 megabytes.
- E-Book: 1 to 5 megabytes.
- MP3 Song File: approximately 3.5 megabytes.
- CD-ROM Capacity: typically 700 megabytes.
- DVD-Quality Movie File: approximately 4 gigabytes or 4000 megabytes.
- HD (1080p) Movie: 8 to 15 gigabytes or 8000 to 15000 megabytes.
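Sizes like these make everyday capacity arithmetic easy. As a quick sketch using figures from the list above (decimal units assumed throughout), here is how many typical MP3 songs would fit on a standard 700-megabyte CD-ROM:

```python
# How many ~3.5 MB MP3 files fit on a 700 MB CD-ROM?
# (Decimal SI units assumed; real-world capacities vary slightly.)
cd_capacity_mb = 700
mp3_size_mb = 3.5

songs = int(cd_capacity_mb // mp3_size_mb)
print(songs)  # 200
```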
In the past, the megabyte was a large enough unit of measure to be commonly applied to computer hard drives and the amount of random access memory installed in a computer. Today, however, the gigabyte is a more useful measure of storage capacity and computer memory.
Frequently Asked Questions
Why are some things, such as data transfer rates, measured in bits, while others, such as file size, are measured in bytes?
Bits rarely exist in isolation. In almost all cases, bits are lumped together into groups of 8 bits or 1 byte. The byte is the smallest unit of addressable storage in most computer systems. Because of this, the natural way to refer to data on a computer system is not the bit (b) but the byte (B), and multiples thereof such as megabyte (MB) and gigabyte (GB). However, when data is transferred over a network connection it is broken into individual bits. As a result, in most cases we talk about bytes and multiples thereof, but when talking about data transfer over the Internet we describe the transfer rate in bits.
Is a megabyte always equal to 1,000,000 bytes?
By its standard (SI) definition, a megabyte is equal to 1,000,000 bytes. However, because computers address memory in powers of two, capacities naturally fall on binary boundaries, and some might say that a kilobyte is really equal to 1,024 bytes (2^10) and that a megabyte is equal to 1,048,576 bytes (2^20). To clarify the situation, the terms kibibyte and mebibyte were established to represent the binary method of counting bytes, leaving kilobyte and megabyte to refer to exactly 1,000 and 1,000,000 bytes. However, it is still common for operating systems and memory manufacturers to use the kilo and mega prefixes to refer to capacities of 2^10 and 2^20 bytes.
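The gap between the two conventions is small at the megabyte scale but grows with each prefix. A minimal sketch of the arithmetic:

```python
# Decimal (SI) megabyte versus binary (IEC) mebibyte.
MB = 10**6    # megabyte: 1,000,000 bytes
MiB = 2**20   # mebibyte: 1,048,576 bytes

print(MiB - MB)   # 48576 bytes of difference per "megabyte"
print(MiB / MB)   # 1.048576, i.e. about a 4.9% gap
```

This ~4.9% discrepancy per prefix step is why a drive sold as a round decimal capacity appears smaller when an operating system reports it in binary units.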