Originally Posted by gomezz:
“Which just goes to show you have no idea how this was actually coded.”
You are talking complete and utter nonsense.
Please stop before you embarrass yourself further.
The code I included is pseudo code that represents what is happening, not how it was written.
Quote:
“ The algorithmic approach is not the only way to skin this particular cat.”
That's nothing more than purest gobbledegook.
It's a digital computer. By definition anything that it does must be representable by an algorithm.
Quote:
“Do you even know if it was coded in a high-level language and if so what effect changes would have on memory usage for this hardware architecture?”
This is also meaningless gibberish.
It doesn't matter how it was written. Eventually it will end up as machine code that the processor obeys.
And whatever method was used to write it, simplifying the algorithm is not going to increase memory usage.
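To make the point concrete, here is a hypothetical illustration (my own sketch, not the code being discussed): two algorithms that compute the same result, where the simplified version does strictly less work and allocates no additional memory.

```python
# Illustrative only: a made-up example, not the program under discussion.
# Both functions compute the sum 1 + 2 + ... + n; the simplified version
# replaces the loop with a closed-form formula, so it cannot possibly
# use more memory than the original.

def triangle_sum_loop(n: int) -> int:
    """Straightforward algorithm: accumulate 1..n in a loop."""
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def triangle_sum_closed(n: int) -> int:
    """Simplified algorithm: Gauss's closed-form formula."""
    return n * (n + 1) // 2

if __name__ == "__main__":
    # The two algorithms agree on every input; only the amount of
    # work (and, at most, the memory) differs.
    for n in (0, 1, 100):
        assert triangle_sum_loop(n) == triangle_sum_closed(n)
    print(triangle_sum_closed(100))  # prints 5050
```

Whether the programmer wrote this in assembler, C, or a scripting language, the compiler or interpreter reduces it to machine code either way, and swapping the loop for the formula does not add a single byte of storage.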