I have an external hard drive which is 640 GB, yet there is only 596 GB free.

Comments

  • Rich_L Posts: 6,110
    Forum Member
    Esot-eric wrote: »
    Perhaps because it's actually a 640GB drive and not 600.

    But it will never format to 640 gig, however you add up your bits and bytes, will it?
  • Maxatoria Posts: 17,980
    Forum Member
    Rich_L wrote: »
    But it will never format to 640 gig, however you add up your bits and bytes, will it?

    They always advertise it as unformatted capacity
  • ibatten Posts: 418
    Forum Member
    d'@ve wrote: »
    All valid points, however, the point I am trying to make is that because the decimal GB is now more common (overall) both in public-facing industry and in International standards bodies, those who want to stick with what is overall now the minority method should be distinguishing it by using the GiB abbreviation or if they don't like that, something similar but not GB/MB etc.

    I'm not sure that it is more common, actually. On a machine with some RAM, a disk, a video card and a copy of Windows (which is a fairly common set of components), binary Ms and Gs will be used by all of that bar the disk. The only people to consistently use G == 10^9 are disk manufacturers (and that only relatively recently) and, even more recently, Apple in reporting disks. Telecoms usage relates almost exclusively to bits per second or, with yet more subtle opportunities for misunderstanding, baud, so there are yet other conversion factors and opportunities for confusion there (and Mbps versus MBps is always fun).
    You'd think that would be obvious, even to Microsoft (and those who I suspect use Microsoft as an excuse not to bother).

    I think overall it would be better if Microsoft reported disk capacities in the units that disk manufacturers use, yes. But there's more opportunity for fun there. An old SD card I keep in my wallet is labelled "512MB", but Mountain Lion reports it as "513.2 MB (513,236,992 Bytes)" --- 1,002,416 bytes per MB, according to the manufacturer. Now you, and I, know why that is, but it's certainly not a universal cure for confusion to just standardise on 10^6.
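ibatten's SD-card numbers check out: the byte count Mountain Lion reports, divided by the labelled 512 "MB", gives that odd 1,002,416 factor, and divided by 10^6 gives the 513.2 MB shown on screen. A quick sketch (variable names invented for illustration):

```python
label_mb = 512                 # capacity printed on the card
actual_bytes = 513_236_992     # byte count reported by Mountain Lion

print(actual_bytes / 10**6)        # 513.236992 -> shown as "513.2 MB"
print(actual_bytes // label_mb)    # 1002416 bytes per labelled "MB"
```

Neither 10^6 nor 2^20 divides the card's capacity evenly, which is ibatten's point: standardising on 10^6 alone doesn't make labels and reported sizes agree.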
  • the sandman Posts: 621
    Forum Member
    Helmut10 wrote: »
    That does not show anything; formatting and the filing system take up space on the hard drive.

    Then follow many posts about how HDD manufacturers and MS round numbers, and how much disk space various filing systems and Windows's extra peculiarities take...

    Yo ho ho....press Start to begin..... ;)
    So that's why.
  • LION8TIGER Posts: 8,484
    Forum Member
    If you divide the actual size shown by the advertised size
    596 ÷ 640 = 0.93125

    If you do the same with my 160GB hard drive reporting 149GB
    149 ÷ 160 = 0.93125

    At least it is consistent.
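The constant ratio LION8TIGER observes is exactly 10^9 / 2^30: the advertised figure counts decimal gigabytes (10^9 bytes) while Windows reports binary gibibytes (2^30 bytes). A minimal Python sketch of the conversion (the helper name is made up for illustration):

```python
# Convert an advertised decimal-GB capacity into the binary "GB"
# (really GiB) figure that Windows-style tools report.
def advertised_to_reported(gb: float) -> float:
    return gb * 10**9 / 2**30  # to bytes, then divide by bytes-per-GiB

print(round(advertised_to_reported(640)))  # 596
print(round(advertised_to_reported(160)))  # 149
print(10**9 / 2**30)                       # 0.931322..., the constant ratio
```

The 596 and 149 in the posts above are rounded results, which is why both divisions happen to land on the same 0.93125 rather than the exact 0.93132... factor.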
  • Loobster Posts: 11,680
    Forum Member
    Ulysses777 wrote: »
    At the basic level, computers do not work in decimal. Defining a kilobyte as 1000 bytes is worthless. Notice that a 1GB RAM module is not exactly 1,000,000,000 bytes? And there are no RAM manufacturers using the GiB term, as it's worthless to them.

    Of course a 1GB RAM module is not 1,000,000,000 bytes. The number of bytes is a power of 2. That's how computers work, as you said.

    We understand what the issue is, you don't need to argue that point. It's not like we don't understand it. What you don't seem to understand (or don't want to understand) is that the original notation which began to be used decades ago was replaced with an official, [IEEE] industry standard more than 10 years ago. If manufacturing companies refuse to accept the new international standard, that's up to them. But it doesn't mean that the new international standard has gone away, or that manufacturers can continue to use the 'old' standard indefinitely and still be considered correct.

    The only point of debate is what can correctly be called a 'Gigabyte'. If you want to stick to something whose meaning was changed more than 10 years ago, then that's up to you. But your interpretation is wrong because the meaning of the notation was changed a long time ago.

    End of.
  • ibatten Posts: 418
    Forum Member
    Loobster wrote: »
    official, [IEEE] industry standard.

    IEEE standards aren't ISO standards, however, and "official" and "industry" aren't the same thing. But in this case the Mebibyte (etc) notation was established by the IEC, who are intergovernmental and carry a lot more weight. But the main complaint in this thread is that Microsoft-GBs aren't the same as Seagate/WD-GBs, which is true, but I suspect that Microsoft don't pay a lot of attention to the IEC (or ISO, or the ITU).

    On measurement as with many other things, the USA isn't terribly influenced by standards bodies, because their governments tend to see it as part of a conspiracy of communist metricators wanting to mess with how they measure their precious bodily fluids. But I think on consideration that the US (and UK, and a few other countries') decision to tell the ITU to **** off over Internet governance was broadly A Good Thing, and had we listened to ISO and the ITU in the 1980s we'd never have had a global Internet or anything that looked like it (1). Standards bodies are good servants but poor masters, and although in this particular case I think Microsoft should probably get with the programme, I don't think the bald statement "there's an international standard" is enough. OSI should have proved that beyond any possible argument.

    (1) Remember, up until the early 1990s European academic networks were expressly forbidden from using TCP/IP because it would "slow OSI adoption". The ITU resisted the Internet tooth and nail, because it threatened the commercial interests of their members.
  • Ulysses777 Posts: 741
    Forum Member
    Loobster wrote: »
    Of course a 1GB RAM module is not 1,000,000,000 bytes. The number of bytes is a power of 2. That's how computers work, as you said.

    We understand what the issue is, you don't need to argue that point. It's not like we don't understand it. What you don't seem to understand (or don't want to understand) is that the original notation which began to be used decades ago was replaced with an official, [IEEE] industry standard more than 10 years ago. If manufacturing companies refuse to accept the new international standard, that's up to them. But it doesn't mean that the new international standard has gone away, or that manufacturers can continue to use the 'old' standard indefinitely and still be considered correct.

    The only point of debate is what can correctly be called a 'Gigabyte'. If you want to stick to something whose meaning was changed more than 10 years ago, then that's up to you. But your interpretation is wrong because the meaning of the notation was changed a long time ago.

    End of.

    I actually understand very well, thank you. What you don't understand is, I don't care.

    It was introduced only after the HDD companies started muddying the waters. Despite using the 'correct' terms, as you put it, both Western Digital and Seagate have been sued over this misrepresentation and forced to make settlements.

    There is no such thing as correct and incorrect standards. This 'so-called standard' you refer to is not binding whatsoever. Apple and the storage companies stand alone on this. Everyone else in the industry disregards it.
  • d'@ve Posts: 45,531
    Forum Member
    ibatten wrote: »
    I'm not sure that it is more common, actually. On a machine with some RAM, a disk, a video card and a copy of Windows (which is a fairly common set of components), binary Ms and Gs will be used by all of that bar the disk. The only people to consistently use G == 10^9 are disk manufacturers (and that only relatively recently) and, even more recently, Apple in reporting disks. Telecoms usage relates almost exclusively to bits per second or, with yet more subtle opportunities for misunderstanding, baud, so there are yet other conversion factors and opportunities for confusion there (and Mbps versus MBps is always fun).
    I think overall it would be better if Microsoft reported disk capacities in the units that disk manufacturers use, yes. But there's more opportunity for fun there. An old SD card I keep in my wallet is labelled "512MB", but Mountain Lion reports it as "513.2 MB (513,236,992 Bytes)" --- 1,002,416 bytes per MB, according to the manufacturer. Now you, and I, know why that is, but it's certainly not a universal cure for confusion to just standardise on 10^6.

    I'm not suggesting that they standardise on anything other than the abbreviation/prefix; they can use base 24 billions if they like, but they shouldn't still call them GBs. That's all I'm saying: Microsoft & Co. should just stop reporting GiBs as GBs!

    That's it.
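d'@ve's request is straightforward to honour in code: report whichever figure you like, but attach the matching prefix. A hypothetical formatter (function names invented for illustration) using the SI and IEC prefixes:

```python
def fmt_decimal(nbytes: int) -> str:
    return f"{nbytes / 10**9:.1f} GB"   # SI gigabytes (10^9 bytes)

def fmt_binary(nbytes: int) -> str:
    return f"{nbytes / 2**30:.1f} GiB"  # IEC gibibytes (2^30 bytes)

size = 640 * 10**9  # a "640 GB" drive as advertised
print(fmt_decimal(size))  # 640.0 GB
print(fmt_binary(size))   # 596.0 GiB
```

Both figures describe the same number of bytes; only the suffix tells the reader which divisor was used.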