I don't get it.
In a past project, I sent 7 bytes of data from my computer to my circuit at 38 kbps over a radio link. The circuit then sent back a response (also 7 bytes) over the same radio module, link, and speed. Occasionally I had to send the same data twice (never more) for the remote to receive it.
Now I'm trying the same thing, but this time with 16 bytes, because my application needs to transfer much more data in a shorter period of time. However, reliability is significantly worse.
After some experimentation, it seems that one byte is always missing from the 16 bytes I send, and I'm not sure whether it's because my computer is acting up or because the radio link itself is the problem.
As for the receiving device, I ran its code in a simulator and verified that the same 16-byte packet I'm sending out is recognized as valid by the device.
This is where things get stranger. I previously ran a simple serial connection test: I hooked two PCs (each with at least a 1 GHz processor) directly together over a serial port, ran terminal programs on both, and configured both ends for exactly the same speed. Characters were lost in that setup as well.
If I recall correctly, the PC chokes on serial operations after sending or receiving about 14 bytes because its internal buffer fills up and isn't flushed. Also, the device I'm dealing with (yes, an 8051) has only a 1-byte receive buffer, so I'm wondering whether that could be the reason data isn't coming through intact.
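To make my 1-byte-buffer suspicion concrete, here's a toy model I put together. It isn't my actual 8051 code, just a sketch: each byte is assumed to take about 260 µs on the wire (10 bits per byte at 38.4 kbps), and the ISR service time is a made-up number to show what happens when emptying the buffer takes longer than one byte time.

```python
def simulate_overrun(n_bytes, byte_time_us, isr_time_us):
    """Toy model of a 1-byte UART receive buffer (like the 8051's SBUF).

    Bytes arrive back-to-back, one every byte_time_us. The receiver needs
    isr_time_us to read the buffer after a byte lands. If the next byte
    finishes arriving while the receiver is still busy, it overwrites the
    buffer (an overrun) and the unread byte is lost.
    Returns the number of bytes lost out of n_bytes.
    """
    received = 0
    busy_until = 0.0  # time at which the receiver is free again
    for i in range(n_bytes):
        arrival = (i + 1) * byte_time_us  # byte fully received at this time
        if arrival >= busy_until:
            # Receiver had emptied the buffer in time: byte is kept.
            received += 1
            busy_until = arrival + isr_time_us
        # else: overrun -- the previous byte was still in the buffer,
        # so this byte (or the old one, depending on hardware) is lost.
    return n_bytes - received

# ISR faster than one byte time (~260 us at 38.4 kbps): nothing lost.
print(simulate_overrun(16, 260, 250))  # -> 0
# ISR slower than one byte time: half of a 16-byte packet is lost.
print(simulate_overrun(16, 260, 300))  # -> 8
```

If this model is anywhere near what's happening, then slowing down the sender (inter-byte gaps) or tightening the receive ISR would change the loss pattern, which would at least tell me whether the buffer, and not the radio, is at fault.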
Anyone have any ideas how I can solve this?