It just seems off to me that we'd assume that 4 hex digits are an integer, 8 hex digits are a long, and anything larger is an integer64.
If we're counting in int64, using &H, here's what we get:
Print &H7FFF 'prints 32767
Print &H8000 'prints -32768
Print &HFFFF 'prints -1
Print &H10000 'prints 65536
Print &HFFFFFFFF 'prints -1
So we instantly say that up to 4 hex digits is an integer, 5 to 8 is a long, and more than 8 is an integer64.
Without any sort of suffix on the value, why wouldn't we always count it as an integer64, since that's our greatest variable type? If we want to make it an integer, all we'd have to do is assign that value to an integer. As it stands, our hex values aren't going to represent what we'd think they represent over half the time.
When dealing with longs, what value would you expect &HFFFF to represent? I'd expect it to be 65535... Instead, it's negative one!
&HFFFF = &HFFFFFFFF. And that's supposed to be right somehow??
(And even if we write the value as &H0000FFFF, to make certain that we hold 8 digits for a long value, those leading zeros get truncated and we still end up with the -1 that represents an integer value.)
If this behavior is the way it's supposed to be, then it's something which needs to be well documented in the wiki somewhere.