What is a Gigabyte?

Article Details
  • Written By: G. Wiesen
  • Edited By: Heather Bailey
  • Last Modified Date: 11 November 2019
  • Copyright Protected:
    Conjecture Corporation
  • Print this Article

A gigabyte is a unit that indicates a definite quantity of data, usually with regard to storage capacity or content. It refers to an amount of something, typically digital data, and most often means 1 billion bytes. A gigabyte can potentially be confused with a gibibyte, which is also a unit of storage but is based on a binary, or base-two, system.

The easiest way to understand the term “gigabyte” is to separate it into its two base parts: “giga” and “byte.” A byte is a piece of data, usually considered the smallest amount of data used to represent a single character in computer code. In other words, bytes are the individual building blocks of computer code. Each byte is made up of a number of bits, usually eight, and each bit is a piece of data with one of two possible values, usually represented as 1 or 0.
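The relationship between characters, bytes, and bits can be seen in a short Python sketch (Python is used here purely for illustration; it is not part of the original article):

```python
# The character "A" is stored as one byte: the number 65,
# whose eight bits are 01000001.
value = ord("A")             # numeric value of the character "A"
bits = format(value, "08b")  # the same value written as eight bits
print(value)                 # 65
print(bits)                  # 01000001
print(len(bits))             # 8 bits in one byte
```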


Bits are the individual parts of binary code that are grouped together, eight at a time, to create a single byte, and bytes in turn make up larger pieces of data. Computer programs are therefore made up of bytes, and so the size of a program is expressed in bytes. Just as a wall made from 100 bricks is larger than a wall made from 10 of the same bricks, a program of 100 bytes is larger than one of 10 bytes. Rather than express the sizes of large programs in thousands or millions, and now billions and trillions, of bytes, prefixes are used to indicate orders of magnitude.

These prefixes follow the established notation of the International System of Units (SI), the same notation used in metric measurements. Therefore 1,000 bytes is a kilobyte, 1 million bytes is a megabyte, 1 billion bytes is a gigabyte, and 1 trillion bytes is a terabyte. Each prefix indicates an order of magnitude by which the bytes are increased, and each roughly corresponds to a binary prefix with similar terminology. It is this similarity that leads a gigabyte to sometimes be confused with a gibibyte, which is close in size but not the same.
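The decimal prefixes above can be written out directly; this Python sketch (an illustration, not part of the original article) shows that each step is a factor of 1,000:

```python
# Decimal (SI) prefixes: each step multiplies by 1,000.
kilobyte = 1_000            # 1 thousand bytes
megabyte = 1_000 ** 2       # 1 million bytes
gigabyte = 1_000 ** 3       # 1 billion bytes
terabyte = 1_000 ** 4       # 1 trillion bytes
print(gigabyte)             # 1000000000
```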

A gibibyte is a unit most often used to refer to storage or processing capacity for memory, such as random access memory (RAM). It is based on a binary, or base-two, system, in which each order of magnitude is a power of two whose exponent increases in steps of 10. In other words, 2^10 bytes is a kibibyte, 2^20 bytes is a mebibyte, and 2^30 bytes is a gibibyte. While this is close to a gigabyte, it is not exactly the same: a gibibyte is 1,073,741,824 bytes, which has led to confusion with regard to actual storage sizes on hard drives and similar memory devices.
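The gap between the two units is easy to compute; this Python sketch (again, an illustration rather than part of the article) compares a gibibyte with a gigabyte:

```python
# Binary prefixes: each step multiplies by 2**10, or 1,024.
kibibyte = 2 ** 10          # 1,024 bytes
mebibyte = 2 ** 20          # 1,048,576 bytes
gibibyte = 2 ** 30          # 1,073,741,824 bytes
gigabyte = 10 ** 9          # 1,000,000,000 bytes

print(gibibyte)             # 1073741824
print(gibibyte - gigabyte)  # 73741824 -- a gibibyte is about 7% larger
```

This roughly 7% difference is why a drive sold as "1 TB" (decimal) reports less capacity in an operating system that measures in binary units.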

