
Charles B. answered 06/15/19
Passionate Professional Programmer
That is actually a convention stemming largely from C. Since the C family of languages is so popular (compared with, say, TI-BASIC or Lisp), 0x has become one of the more common representations, so the best person to ask 'why' would be Dennis Ritchie.

As far as conventions go, though, it's certainly not universal. Other languages use different prefixes: Lisp uses #x (and #16r, to my recollection), the TI calculator I had represented hex with 0h (which makes more sense in my mind than 0x, but ah well), and my first language (some flavor of BASIC) used &h.

Even the use of ABCDEF to represent 10-15 wasn't universal (some old systems used other letters, including X and H, for hexadecimal), and that might have influenced the choice of letters depending on what hardware the language designers were familiar with. But that's just wild speculation.