You can Google for an "ASCII table" to find those. Like this one:
**broken link removed**
ALL signals are binary: 0s and 1s. Every single signal. Every last one. The only difference is how you interpret them. ASCII is one way to interpret a binary signal. Hexadecimal is just another way to write a number, like decimal; they are different counting systems used to represent the same value. Hexadecimal (and octal) are shorter, more convenient ways to write binary. They are used for this purpose instead of decimal because converting between hex/octal and binary is easy and direct: each hex digit corresponds to exactly four bits, and each octal digit to exactly three.
It tends to be faster (and less error-prone) to read FF (hex) than 11111111 (binary), because binary's small base means numbers need many digits, and it is easy to lose your place in all of them. When you type a number into the compiler, you have to tell the compiler which number system you are using.
For example, some compilers recognize a number written 0x______ as hexadecimal and 0b________ or %________ as binary, and assume any number written plainly as ____ is decimal.
For example,
0b11111111 (11111111 in binary) = 0xFF (FF in Hex) = 255 (in decimal)