Everyone who has ever purchased a hard drive finds out the hard way that there are two ways to define a gigabyte.
When you buy a "500 Gigabyte" hard drive, the vendor defines it using the decimal powers of ten definition of the "Giga" prefix.
500 * 109 bytes = 500,000,000,000 = 500 Gigabytes
But the operating system determines the size of the drive using the binary, powers-of-two definition of the "Giga" prefix:
465 * 2^30 bytes = 499,289,948,160 = 465 Gigabytes
If you're wondering where 35 Gigabytes of your 500 Gigabyte drive just disappeared to, you're not alone. It's an old trick perpetuated by hard drive makers-- they intentionally use the official SI definitions of the Giga prefix so they can inflate the sizes of their hard drives, at least on paper. This was always an annoyance, but it's much harder to ignore now, because the discrepancy grows right along with today's enormous hard drives. When is a Terabyte hard drive not a Terabyte? When it's 931 GB.
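To see the gap concretely, here's a minimal Python sketch of the conversion (the os_reported_gb helper is my own illustration, not anything vendors or operating systems actually ship):

    DECIMAL_GB = 10**9   # "Giga" as hard drive vendors define it
    BINARY_GB = 2**30    # "Giga" as the operating system defines it

    def os_reported_gb(vendor_gb):
        """Convert a vendor-advertised size into the size the OS reports."""
        return vendor_gb * DECIMAL_GB / BINARY_GB

    print(os_reported_gb(500))    # ~465.66 -- the "500 GB" drive
    print(os_reported_gb(1000))   # ~931.32 -- the "1 TB" drive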
As Ned Batchelder notes, the hard drive manufacturers are technically conforming to the letter of the SI prefix definitions. It's us computer science types who are abusing the official prefix designations:
I guess I always just thought that the missing 6 GB was space being used by my partitioning, or something.
Looking back at it, that doesn't make much sense, does it? 6 GB is way too much for partitioning overhead.
Now I know the truth: the 6 GB isn't being used-- it just doesn't exist.
Same here. The calculation I've always used is based on the fact that 1 GB as used by vendors is 1,000,000,000 bytes, when in fact it is 1,073,741,824 bytes (as seen by Windows, and exactly how it should be calculated). So the actual size is 80 GB divided by 1.073741824 = 74.50580596923828125 GB!
The calculation is derived from 1 KB actually being 1,024 bytes, 1 MB being 1024 * 1024 bytes (= 1,048,576 bytes), and 1 GB being 1024 * 1024 * 1024 bytes (or 1,073,741,824 bytes, as above).
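That divide-by-1.073741824 shortcut works because the ratio between the binary and decimal definitions of "Giga" is 2^30 / 10^9 = 1.073741824. A quick Python check (my own verification of the commenter's arithmetic, not part of the original comment) confirms it:

    ratio = 2**30 / 10**9       # 1.073741824: binary Giga / decimal Giga

    # The commenter's 80 GB example, via the shortcut...
    print(80 / ratio)           # ~74.505805969238 GB

    # ...and the same result computed directly from raw bytes
    print(80 * 10**9 / 2**30)   # ~74.505805969238 GB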