In computing, the correct units have always been binary, regardless of the decimal amounts that OS X may report.
CPUs will work in binary for the foreseeable future, and that is what RAM sizes will continue to be based on.
When a computer reports that something has 200GB free, it should mean 200 gigabytes of 1024 megabytes each, with each megabyte being 1024 kilobytes and each kilobyte 1024 bytes.
In an ideal world, hard-drive sizes would follow that convention, but in order to deceive customers, hard-drive manufacturers have almost always used decimal sizes: drives measured originally in megabytes gained roughly 5% over the binary figure, then gigabytes gained more than 7%, and now terabytes gain around 10%.
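Those percentages fall straight out of the ratio between the two definitions of each prefix, which a quick Python sketch makes plain:

```python
# How much larger each binary prefix is than its decimal counterpart:
# binary units are powers of 1024, decimal units are powers of 1000.
for power, name in enumerate(["kilo", "mega", "giga", "tera"], start=1):
    binary = 1024 ** power
    decimal = 1000 ** power
    extra = (binary / decimal - 1) * 100
    print(f"{name}byte: binary is {extra:.1f}% larger than decimal")
```

This prints about 2.4% for kilobytes, 4.9% for megabytes, 7.4% for gigabytes, and 10.0% for terabytes, matching the figures above.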
It's no better now with SSDs, despite their being built from binary-sized chips, because part of each chip must be held in reserve to replace failed cells: a 120GB SSD might contain a true 128GB of flash storage, of which only about 112GB is made available at any time (which works out to the inflated decimal 120GB it is sold as).
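The SSD arithmetic above can be checked in a few lines of Python (the 128GB raw and 120GB advertised figures are the ones from the text; the actual reserve fraction varies by manufacturer):

```python
GIB = 1024 ** 3  # binary gigabyte, in bytes
GB = 1000 ** 3   # decimal gigabyte, in bytes

raw_flash = 128 * GIB    # actual flash chips inside the drive
advertised = 120 * GB    # decimal capacity printed on the box

usable_binary = advertised / GIB        # what the user gets, in binary GB
reserve = (raw_flash - advertised) / GIB  # held back for failed cells

print(f"usable: {usable_binary:.1f} binary GB")  # the ~112GB figure
print(f"reserve: {reserve:.1f} binary GB")
```

The advertised 120 decimal gigabytes come out to roughly 111.8 binary gigabytes, with around 16 binary gigabytes of the raw flash kept in reserve.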
Given that computers are everywhere now, it would make sense for kilo, mega, giga, and tera to mean the binary versions, and for all computers to report them as such. Do you really want a machine with 32GB of RAM to report that it has been fitted with 34.36GB of RAM? And do OS X machines report the amount of RAM in decimal amounts?