What are Megabytes?

Article Details
  • Written By: James Doehring
  • Edited By: Lauren Fritsky
  • Last Modified Date: 14 June 2017
  • Copyright Protected:
    Conjecture Corporation

Megabytes (MBs) are collections of digital information. The term commonly refers to two different numbers of bytes, where each byte contains eight bits. The first definition of megabyte, used mainly in the context of computer memory, denotes 1,048,576, or 2^20, bytes. The other definition, used in most networking and computer storage applications, means 1,000,000, or 10^6, bytes. Using either definition, a megabyte is roughly the file size of a 500-page e-book.
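The e-book comparison can be checked with rough arithmetic. The per-page figure below is an assumption, not from the article: about 2,000 characters per page, stored as one byte each in plain text.

```python
# Assumed figures: ~2,000 plain-text characters per page, one byte each.
chars_per_page = 2000
pages = 500
total_bytes = chars_per_page * pages      # 1,000,000 bytes

decimal_mb = total_bytes / 10**6          # storage/networking definition
binary_mb = total_bytes / 2**20           # memory definition

print(f"{total_bytes:,} bytes")           # 1,000,000 bytes
print(f"{decimal_mb:.2f} decimal MB")     # 1.00 decimal MB
print(f"{binary_mb:.2f} binary MB")       # 0.95 binary MB
```

Either way, a 500-page plain-text book lands at roughly one megabyte.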

The word bit comes from binary digit. A bit is a "yes or no" answer to an unambiguous question, stored in a computer as either a zero or a one. Whether a door is locked is an example of a question that can be answered with a single bit. This is fundamentally how all digital information is stored in telecommunications and computing.

Computers can use multiple bits together to store complex information. It takes at least five bits to designate one of the 26 letters of the Latin alphabet (2×2×2×2×2 = 32). The way a computer could store a letter goes something like this: bit number one designates whether the letter is in the first half of the alphabet, bit number two narrows the remaining letters into two halves, and so on. After five bits, any letter can be identified.
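The halving scheme described above can be sketched in a few lines of Python. This is an illustration of the idea, not how any particular computer encodes text; the function name is made up for the example.

```python
def encode_letter(ch):
    """Identify a letter 'a'-'z' with five bits by repeated halving.

    Each bit answers: is the letter in the upper half of the range
    that is still in play? A 32-slot range (2**5 = 32) covers all 26
    letters with room to spare.
    """
    lo, hi = 0, 32
    index = ord(ch) - ord('a')   # 'a' -> 0, 'z' -> 25
    bits = []
    for _ in range(5):
        mid = (lo + hi) // 2
        if index >= mid:         # letter is in the upper half
            bits.append(1)
            lo = mid
        else:                    # letter is in the lower half
            bits.append(0)
            hi = mid
    return bits

print(encode_letter('a'))   # [0, 0, 0, 0, 0]
print(encode_letter('z'))   # [1, 1, 0, 0, 1]
```

Notice that the five answers, read in order, are exactly the letter's position written in binary, which is why five yes/no questions suffice for 26 letters.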


Today, a byte most commonly means a collection of eight bits, although the number has varied in the past. Eight bits is a convenient number of bits for storing a single character; in addition to the letter itself, format data such as capitalization is commonly stored. Information is rarely stored in bits alone — the byte is the standard unit of information addressed in computer architectures.
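As a concrete illustration of one character per byte, the ASCII encoding fits each letter in the low seven bits of a byte, and capitalization is literally a single bit of the pattern:

```python
# ASCII stores one character per byte; upper- and lowercase versions
# of a letter differ in exactly one bit (value 32).
a = ord('A')                 # 65
b = ord('a')                 # 97
print(format(a, '08b'))      # 01000001
print(format(b, '08b'))      # 01100001
print(a ^ b)                 # 32 -- the single differing bit
```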

The confusion regarding megabytes comes from the way bytes are multiplied. Computers use binary number systems rather than decimal, or base-10, number systems. From this standpoint, it makes more sense to group bytes based on powers of two; hence the 2^20 definition of a megabyte. Recently, there has been a movement to use prefix definitions consistent with the metric measurement system, which favors the 10^6 definition for a megabyte; the International Electrotechnical Commission's binary prefixes, such as mebibyte (MiB) for 2^20 bytes, were introduced to remove the ambiguity.
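The gap between the two definitions is small at the megabyte scale but grows with each prefix, which is why the ambiguity matters for large drives:

```python
# Compare the two megabyte definitions, then the same gap one prefix up.
binary_mb = 2**20            # 1,048,576 bytes -- memory usage
decimal_mb = 10**6           # 1,000,000 bytes -- storage/networking usage

gap_mb = (binary_mb - decimal_mb) / decimal_mb
print(f"megabyte gap: {gap_mb:.1%}")     # megabyte gap: 4.9%

gap_gb = (2**30 - 10**9) / 10**9
print(f"gigabyte gap: {gap_gb:.1%}")     # gigabyte gap: 7.4%
```

This is why a drive sold as "500 GB" (decimal) reports a smaller number when an operating system measures it in binary units.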

Megabytes are common file sizes in personal computing and business transactions. Text documents are often less than one megabyte in size. Images, especially from high-resolution digital cameras, commonly take up several megabytes of storage space. Movie-length video files can be larger than 1,000 megabytes. Compact disc read-only memories (CD-ROMs) generally can hold about 700 megabytes, while digital video discs (DVDs) often have capacities of several thousand megabytes.


Discuss this Article

Post 2

@MrMoody - That’s a good point. It should also be noted that the bigger the hard drive, the longer it takes to defragment. As to where this is all headed, however, I don’t even think terabytes are the final destination.

Just keep using a multiplier of 1,000 to find out. You convert gigabytes to terabytes by multiplying by 1,000. Multiply terabytes by 1,000 and you get petabytes; multiply that by 1,000 and you get exabytes, and then next up are zettabytes, also by a factor of 1,000. This is all a bit of trivia I learned from my computer science class.
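The ladder of prefixes in that comment is easy to tabulate with the decimal factor of 1,000 per step:

```python
# Each decimal prefix is 1,000x the one before it.
units = ["megabyte", "gigabyte", "terabyte", "petabyte",
         "exabyte", "zettabyte"]
size = 10**6                          # one decimal megabyte
for name in units:
    print(f"1 {name} = {size:,} bytes")
    size *= 1000
```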

Will it stop with zettabytes? I don’t know, but I think the greater capacity will be justified. I think by that time computers will be used to store a lot more information than we’re putting in them now.

Post 1

I remember when personal computers first came out and megabytes seemed like a big deal. One of my first hard drives was a ten megabyte drive and everyone thought it was huge. Of course, for the standards of the day, it was huge.

Then we made the leap from megabytes to gigabytes, which are a thousand times more, and now we have drives that are terabytes in size.

It seems that there is no end in sight for hard drive capacity, and while this seems like a good thing for consumers (more bang for the buck), it has the unfortunate effect of making software companies less efficient in their development projects.

After all, why strive to optimize your software application’s size (and make the code leaner) when you know the end user will have a super-huge hard drive to run your software?
