a) *unsigned* integers wrap around; that has never been promised for
signed ones and, as a matter of fact, it wasn't true for a bunch of
implementations, going all the way back to the late 70s.
b) pointer + integer wraparounds are even worse - all sorts of fun
things happened in a lot of implementations; e.g. there's nothing to
stop the hardware from using the upper bits to store extra information
of any kind (e.g. an ASN) and having the pointer + int instruction
mangle those bits instead of wrapping around, with the comparison
instruction trapping on ASN mismatch. And that doesn't take anything
too exotic - e.g. a 56-bit address space, an 8-bit ASN in the upper
bits, a 32-bit int. Get a carry into bit 56 and both > and < will
trigger a trap.
The point is, you are not relying on low-level details - you are assuming
that all the world's a VAX (OK, x86 these days), and insisting that all
implementations reproduce the exact details of (undefined) behaviour, no
matter how much of a PITA it would be. BTW, the addition itself can trap
on wraparound - and does on real-world architectures.
So you can do that validation in a sane way (compare the untrusted index
with the maximal valid value) or, if you want to rely on the details of
pointer implementation *and* be perverse, play with casting to uintptr_t
and use the fact that operations on _that_ are well-defined. That will
still break on architectures that do not match your idea of how pointers
are represented, but at least you won't be feeding the data to be
validated into operations that might do all sorts of interesting things,
then trying to guess from the outcome whether those things have happened.
There's implementation-defined and there's undefined. That kind of
wraparound is firmly in the latter class, regardless of any optimizations.
C doesn't try to codify _how_ any particular construct with undefined
behaviour will screw you - it doesn't even promise to try to have it
act consistently on a given target.