int* x; // x is an int-pointer
int *y; // dereferencing y gives an int
int * z; // int multiplied by z
I'm being silly, but floating the asterisk between the type and the identifier gives me the same feeling as the "array indices start at 0.5" compromise mentioned earlier.
(For the record, the second way is the universal and objective truth.)