Why does my 128GB Micro SD Card
#1
Forum Member
Join Date: Nov 2013
Posts: 2,804
Why does my 128GB Micro SD Card only show 119GB available memory on my Samsung Galaxy Tab 4?

I got a SanDisk 128GB one today as I want to add lots of video files to watch on my tablet, but it's showing me 9GB less than what I paid for. Is this right, or is the card faulty? Many thanks
#2
Forum Member
Join Date: Dec 2002
Posts: 3,851
Quote:
Why does my 128GB Micro SD Card only show 119GB available memory on my Samsung Galaxy Tab 4? I got a SanDisk 128GB one today as I want to add lots of video files to watch on my tablet, but it's showing me 9GB less than what I paid for. Is this right, or is the card faulty? Many thanks

You do not have a problem; this is a classic marketing con trick.

The marketing people use kilo as 1,000, mega as 1,000,000 and giga as 1,000,000,000. A gigabyte proper is 1024*1024*1024 bytes; a gigabyte in marketing is 1000*1000*1000 bytes. Thus they call 128,000,000,000 bytes 128 gigabytes. Your computer works in proper gigabytes, so 128,000,000,000 / (1024*1024*1024) = approx 119 (real) gigabytes.

To differentiate, a real gigabyte is now termed a gibibyte (GiB), i.e. 1024*1024*1024 bytes, while a gigabyte (GB) is 1000*1000*1000 bytes, so strictly speaking the marketing con trick is correct and all computers are incorrect! The computer should really report either 119 GiB or 128 GB (where 1 GiB = 1.074 GB), but it actually reports 119 GB, which strictly is incorrect. All computers strictly report incorrectly according to the formal conventions, but that is not surprising, as the convention came later; you just have to know that when a computer says GB it actually means GiB. The same is true for KB, MB, TB, etc.
#3
Forum Member
Join Date: Nov 2013
Posts: 2,804
Quote:
You do not have a problem; this is a classic marketing con trick. The marketing people use kilo as 1,000, mega as 1,000,000 and giga as 1,000,000,000. A gigabyte proper is 1024*1024*1024 bytes; a gigabyte in marketing is 1000*1000*1000 bytes. Thus they call 128,000,000,000 bytes 128 gigabytes. Your computer works in proper gigabytes, so 128,000,000,000 / (1024*1024*1024) = approx 119 (real) gigabytes. To differentiate, a real gigabyte is now termed a gibibyte (GiB), i.e. 1024*1024*1024 bytes, and a gigabyte is 1000*1000*1000 bytes, so strictly the marketing con trick is correct and all computers are incorrect! The computer should either report 119 GiB or 128 GB (where 1 GiB = 1.074 GB), but actually reports 119 GB, which strictly is incorrect.

Thanks, I think I can just about understand that haha. That is indeed shady and false advertising to make it appear better.
#4
Forum Member
Join Date: Dec 2002
Posts: 3,851
Quote:
Thanks, I think I can just about understand that haha. That is indeed shady and false advertising to make it appear better.

Incidentally, broadband speeds are the same, but even worse, as they use bits rather than bytes (1 byte = 8 bits), which makes speeds sound much more impressive than they are.
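The bits-versus-bytes point amounts to dividing the advertised figure by eight. A small illustrative Python helper (the function name is my own, not from any library):

```python
BITS_PER_BYTE = 8

def mbps_to_megabytes_per_sec(mbps: float) -> float:
    """Convert an advertised megabits-per-second figure
    to megabytes per second of actual data."""
    return mbps / BITS_PER_BYTE

# An "80 Mb/s" connection moves at most 10 MB of data per second.
print(mbps_to_megabytes_per_sec(80))  # 10.0
```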
#5
Forum Member
Join Date: Apr 2011
Posts: 10,733
Computers work using binary, i.e. powers of 2, at the low level, but what the marketing people discovered was that the people making the purchasing decisions normally had no idea, and thus would buy stuff thinking it was better than it was...

Also, it should be said that raw capacity and formatted capacity are not the same. An A4 sheet of blank paper has the potential for a certain amount of writing, but a formatted (lined) sheet holds less, unless you write one word per page, Hulk-style.
#6
Forum Member
Join Date: Mar 2011
Location: Hertfordshire
Posts: 2,935
Quote:
The computer should really report either 119 GiB or 128 GB (where 1 GiB = 1.074 GB), but actually reports 119 GB, which strictly is incorrect (all computers strictly report incorrectly according to the formal conventions, but that is not surprising, as the convention came later; you just have to know that when a computer says GB it actually means GiB. The same is true for KB, MB, TB, etc.).

It's the other way round, surely? Giga- is first documented to have been used to denote 10^9 in the scientific community in 1947, and it was formally recognised as an international standard in 1960, significantly predating its use in the computing industry. Giga- was adopted by the computing industry to denote 2^30 because it was the closest fit to the already-established 10^9 definition, and hence all this confusion began!

Also, since Snow Leopard in 2009, OS X has reported file and storage sizes in correct SI units, i.e. 1GB = 1 billion bytes.
#7
Forum Member
Join Date: Oct 2003
Location: the wild world web
Posts: 28,132
There must be near enough two of everything.

We have 2 different-size MBs, we have 2 different gallons and 2 different trillions! Heck, we even have 4G and 3G identical!
#8
Forum Member
Join Date: Dec 2002
Posts: 3,851
Quote:
It's the other way round, surely? Giga- is first documented to have been used to denote 10^9 in the scientific community in 1947, and it was formally recognised as an international standard in 1960, significantly predating its use in the computing industry. Giga- was adopted by the computing industry to denote 2^30 because it was the closest fit to the already-established 10^9 definition, and hence all this confusion began! Also, since Snow Leopard in 2009, OS X has reported file and storage sizes in correct SI units, i.e. 1GB = 1 billion bytes.

Most computers still say GB when they mean GiB, of course. Although OS X may be strictly correct, it will still confuse people when they see a drive reported as 128 GB on an Apple machine but only 119 GB on a tablet or Windows PC!
#9
Forum Member
Join Date: Mar 2006
Location: Newcastle-upon-Tyne
Posts: 8,175
In computer usage, the correct terminology should always be binary, regardless of the incorrect amount that OS X may report. CPUs are always going to work in binary for the foreseeable future, and that is what RAM sizes will be based upon. When a computer reports that something has 200GB free, it should mean 200 multiples of what counts as a gigabyte, each of which is 1024 megabytes, each of those being 1024 kilobytes, each of which is 1024 bytes.

In an ideal world, hard-drive sizes would stick to that convention, but in order to deceive customers, hard-drive manufacturers have almost always used decimal sizes, meaning drives originally measured in megabytes gained about 5% over what they would otherwise have been, then gigabytes gained more than 7%, and now terabytes gain around 10%.

It's no better now with SSDs, despite those being based on binary chips, as they have to use part of those chips as reserve capacity for failed cells: a 120GB SSD might have a true 128GB of flash storage inside it, but only about 112GB of it is made available at any time (which translates to the inflated 120GB it is sold as).

Given that computers are everywhere now, it would make sense for kilo, mega, giga and tera to mean the binary versions, and for all computers to report them as such. Do you really want a computer with 32GB of RAM to report that it has been fitted with 34.36GB of RAM? Do OS X machines report the amount of RAM in decimal amounts?
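A quick Python check of the RAM and SSD figures above (a sketch only; the capacities are the post's own examples):

```python
GB = 1000 ** 3   # decimal gigabyte
GiB = 1024 ** 3  # binary gibibyte

# 32 GiB of RAM expressed in decimal gigabytes:
print(f"32 GiB of RAM = {32 * GiB / GB:.2f} GB")  # 34.36 GB

# A "120GB" SSD: 128 GiB of raw flash, part of it held back
# as reserve capacity for failed cells.
raw_flash = 128 * GiB
sold_as = 120 * GB
print(f"{sold_as / GiB:.1f} GiB usable of "
      f"{raw_flash / GiB:.0f} GiB raw flash")  # ≈ 111.8 of 128
```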