That is not exactly correct: with the standard 8N1 settings the USART always adds a logic-0 start bit and a logic-1 stop bit, so every "byte sent" is always 10 bits on the wire.
Note! Because the bytes are sent LSB first, each byte appears bit-reversed on the wire:
So to make the 10-bit string 0101010101 you send the ASCII char 0x55 (01010101), and freq = baudrate / 2,
and to make the 10-bit string 0000011111 you send the ASCII char 0xF0 (11110000), and freq = baudrate / 10 (the full frame is the repeating period).
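To make the framing concrete, here is a small Python sketch (the function name is mine, not from the thread) that builds the on-wire 8N1 frame for one byte: a 0 start bit, eight data bits LSB first, then a 1 stop bit.

```python
def uart_frame(byte):
    """Return the on-wire 8N1 frame for one byte as a bit string.

    Frame = logic-0 start bit, 8 data bits sent LSB first,
    logic-1 stop bit -> always 10 bits per byte.
    """
    data_bits = "".join(str((byte >> i) & 1) for i in range(8))  # LSB first
    return "0" + data_bits + "1"

# 0x55 = 01010101 -> the frame alternates on every bit
print(uart_frame(0x55))  # 0101010101
# 0xF0 = 11110000 -> LSB first gives four 0s, then four 1s
print(uart_frame(0xF0))  # 0000011111
```

This also shows why 0xF0 (not 0x0F) produces the 0000011111 pattern: the low nibble goes out first, right after the 0 start bit.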
(edit) As vlad777 pointed out, I originally had this wrong because the data is sent LSBit first! I have corrected the mistake, so the above is now right (thanks vlad777).
The bit string in my post #15 has 10 bits (including start and stop).
I send 0xF0, NOT 0x0F, because of LSB-first order (after the 0 start bit).
But yes I see something wrong now...
EDIT:
(So now I am sending a 10 MB chunk at once with WriteFile(), with everything else in the loop commented out.
The whole buffer is initialized to the same byte.)
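For reference, the sending side can be as simple as writing one big buffer of identical bytes in a single call. A minimal Python sketch of the same idea (the thread uses WriteFile() on Windows; the pyserial usage and port name here are my assumptions, for illustration only):

```python
def make_buffer(byte, size=10 * 1024 * 1024):
    """Build a buffer of `size` copies of the same byte (10 MB by default)."""
    return bytes([byte]) * size

def send(port_name, byte, baud=9600):
    """Write the whole buffer in one call, like WriteFile() in the thread.

    Requires pyserial (pip install pyserial); port_name (e.g. "COM3")
    is a hypothetical example, not from the thread.
    """
    import serial
    with serial.Serial(port_name, baud) as port:
        port.write(make_buffer(byte))

# e.g. send("COM3", 0x55)  # should give a baud/2 square wave on TX
```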
For char 0x55 I get 4.8 kHz (baud = freq * 2). (Not 0xAA but 0x55, because of LSB first.)
For char 0xF0 I get 0.9 kHz (baud = freq * 10; one full period of the waveform is 10 bit times).
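These measurements line up with the framing: sending the same byte back to back just repeats its 10-bit frame, so the output frequency is the baud rate divided by the fundamental period (in bits) of the repeating pattern. A quick sanity check, assuming 9600 baud (the function name is mine):

```python
def expected_freq(baud, frame):
    """Frequency of the waveform produced by repeating `frame` forever.

    Finds the smallest repeating unit of the frame, in bits;
    freq = baud / period.
    """
    n = len(frame)
    for p in range(1, n + 1):
        if n % p == 0 and frame == frame[:p] * (n // p):
            return baud / p
    return baud / n

# 0x55 -> frame 0101010101, period 2 bits
print(expected_freq(9600, "0101010101"))  # 4800.0 Hz, measured as 4.8 kHz
# 0xF0 -> frame 0000011111, period 10 bits
print(expected_freq(9600, "0000011111"))  # 960.0 Hz, measured as 0.9 kHz
```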