Many users face an unpleasant surprise: you buy a hard disk labelled, say, 500 GB, plug it in and find that only about 465 GB are available. The reason is that hard disk manufacturers count a gigabyte as 1000 megabytes, while Windows is sure it is 1024. Who is right? How many megabytes are there in a gigabyte?
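The arithmetic behind that "missing" space is simple. Here is a minimal sketch in Python, assuming the operating system divides the raw byte count by 1024³ (as Windows does), while the manufacturer counted in powers of ten:

```python
# A rough sanity check: the same "500 GB" counted by the manufacturer
# (decimal gigabytes) and by the operating system (binary gigabytes).
advertised_bytes = 500 * 10**9          # manufacturer: 1 GB = 10^9 bytes
shown_gb = advertised_bytes / 1024**3   # OS: 1 GB = 1024^3 bytes

print(f"{shown_gb:.2f} GB")  # -> 465.66 GB, reported as roughly 465 GB
```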

The main snag is that the amount of information can be measured in both binary and decimal units. On top of that, there are several competing standards for these units. This causes confusion and makes it hard to say exactly how many megabytes are in a gigabyte.

Strictly speaking, the prefixes "kilo-", "mega-", "giga-" and so on belong to the International System of Units (SI) and denote powers of ten. So, reasoning logically, one gigabyte should contain 1000 megabytes. Why, then, is your operating system convinced that there are 1024 of them?

The reason is that the developers of many modern operating systems follow the JEDEC 100B.01 memory standard (JEDEC is the Joint Electron Device Engineering Council), which allows the SI prefixes to denote not powers of ten but powers of two (that is, the standard uses binary rather than decimal multiples). Therefore, one gigabyte under the JEDEC standard equals 1024 megabytes.

Manufacturers of hard drives and flash drives, however, follow the standards adopted by the International Electrotechnical Commission (IEC). Under these standards the SI prefixes denote powers of ten, so one gigabyte should contain exactly 1000 megabytes and not a megabyte more. GOST 8.417-2002, which regulates the names of units of measurement in Russia, takes the same view.
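To see how far apart the two readings are, here is a small illustrative comparison in Python; the byte counts follow directly from the two conventions just described:

```python
# Two readings of "one gigabyte", expressed in bytes.
gb_jedec = 1024**3   # JEDEC 100B.01: 1 GB = 1024 MB = 1,073,741,824 bytes
gb_iec = 1000**3     # IEC / SI / GOST 8.417-2002: 1 GB = 1000 MB = 10^9 bytes

print(gb_jedec - gb_iec)  # 73741824 "extra" bytes per gigabyte
```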

If it is important to emphasize that the binary system is meant, you should use not decimal prefixes but binary ones. In that case 1024 bytes make one kibibyte, 1024 kibibytes make one mebibyte, and 1024 mebibytes make one gibibyte. These binary prefixes are the ones adopted in the IEC standard.
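The binary ladder is easy to write out explicitly; a short sketch:

```python
# The IEC binary prefixes, built up step by step from 1024.
KiB = 1024        # kibibyte
MiB = 1024 * KiB  # mebibyte = 1,048,576 bytes
GiB = 1024 * MiB  # gibibyte = 1,073,741,824 bytes

print(KiB, MiB, GiB)
```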

The problem is that the binary prefixes, although correct, are hardly ever used in practice. First, historically the decimal prefixes have been used to name binary units of information. Second, the binary prefixes simply do not sound very good.

So the average user is unlikely to run into a unit such as the "gibibyte", because almost nobody uses it. How, then, do you tell whether a given gigabyte is 1000 or 1024 megabytes? Pay attention to how the unit is written.

According to the IEC proposal, if binary kilobytes/megabytes/gigabytes are meant, the designation should begin with a capital letter: for example, GB or GByte. Such a designation indicates that the gigabyte in question contains 1024 megabytes. If the first letter of the designation is lowercase (gb, gbyte), it refers to decimal ("commercial") gigabytes of 1000 MB.

As you can see, the decimal SI prefixes are firmly rooted in the names of units of information and are used even where the IEC's binary prefixes would be appropriate. That is why a gigabyte turns out to contain sometimes 1000 and sometimes 1024 megabytes.

The easiest rule of thumb: manufacturers of hard drives and flash drives use "proper", decimal gigabytes. Manufacturers of RAM, video memory and CDs, as well as Microsoft and Apple (the developers of Windows and Mac OS X, respectively) and other software developers, use binary gigabytes of 1024 megabytes (which would more correctly be called gibibytes, made up of mebibytes).
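As a practical takeaway, a tiny helper (hypothetical, purely for illustration) can report one and the same byte count in both conventions:

```python
def describe(num_bytes: int) -> str:
    """Report a byte count in decimal gigabytes and in binary gigabytes (gibibytes)."""
    decimal_gb = num_bytes / 1000**3
    binary_gb = num_bytes / 1024**3
    return f"{decimal_gb:.2f} GB (decimal) = {binary_gb:.2f} GB (binary, i.e. gibibytes)"

print(describe(500 * 10**9))  # 500.00 GB (decimal) = 465.66 GB (binary, i.e. gibibytes)
```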