Can someone clarify a dumb point for me?
If I write a 16-bit integer literal like &h7FE0 in the IDE, why does the debugger show it as the decimal value 32736?
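(To be clear, the positional conversion itself makes sense to me: &h7FE0 = 7×16^3 + 15×16^2 + 14×16 + 0 = 28672 + 3840 + 224 + 0 = 32736. It's the byte order I'm unsure about.)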
Is the notation big- or little-endian? If the IDE is running on a little-endian machine (as virtually all desktop machines are these days), which sequence of bytes in memory does &h7FE0 represent?
Byte position:  0   1
                7F  E0

or

Byte position:  0   1
                E0  7F
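I can't write a test for this off the top of my head in this dialect, so here is the equivalent check in C, which is just a sketch of what I have in mind (the uint16_t type and this little harness are my assumption of how the 16-bit value is stored, not anything from the IDE):

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        uint16_t v = 0x7FE0;  /* the C equivalent of the literal &h7FE0 */
        unsigned char *p = (unsigned char *)&v;

        /* Print each byte at its position in memory.
           On a little-endian machine I'd expect: byte 0 = E0, byte 1 = 7F */
        printf("byte 0 = %02X, byte 1 = %02X\n", p[0], p[1]);
        return 0;
    }

In other words: the digits of the literal read most-significant-first on screen, but I'd expect the low byte E0 to sit at the lower address in memory.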
I had assumed that literals are written in big-endian order regardless of the machine's byte order; is that correct?