Electronics4you
Member
Hi there,
I have an oscilloscope capture of a serial communication (NMEA sentences from a GPS). I need to decode the signal into ASCII characters without running the stream through any hardware. I have attached a snippet of the signal. How should I set up the decoding?
The GPS I'm using is a Linx RXM-GPS-SR (https://www.electro-tech-online.com/custompdfs/2013/01/RXM-GPS-SR_Data_Guide-19336.pdf), and the default settings are baud=9600, data=8 bits, stop=1 bit and no parity.
Should the decoding begin at the falling edge, then step ahead in 1/9600 s intervals starting from the start bit, reading the level in each interval? Or should I be looking at the direction of each edge instead?
According to the datasheet, the first characters should be $GPGGA..., but when I sample the level of each interval I only recover the $. Can anyone spot the problem? Thanks
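In case it helps, here is a minimal sketch of the approach I have in mind, in Python. It assumes an idealized waveform (idle high, clean edges, 8N1 framing, LSB first) rather than my actual scope samples, and the function names are just my own placeholders:

```python
def encode_8n1(text, samples_per_bit):
    """Build an idealized 8N1 waveform (idle high) for test purposes."""
    samples = [1] * samples_per_bit  # leading idle line
    for ch in text:
        # start bit (0), 8 data bits LSB first, stop bit (1)
        bits = [0] + [(ord(ch) >> k) & 1 for k in range(8)] + [1]
        for b in bits:
            samples.extend([b] * samples_per_bit)
    samples.extend([1] * samples_per_bit)  # trailing idle line
    return samples

def decode_8n1(samples, samples_per_bit):
    """Decode by re-syncing on each start bit's falling edge and
    sampling in the middle of every bit interval."""
    chars = []
    i = 1
    while i < len(samples):
        if samples[i - 1] == 1 and samples[i] == 0:  # falling edge: start bit
            byte = 0
            for k in range(8):
                # centre of data bit k: 1.5 bit periods past the edge, then +1 per bit
                centre = i + int((k + 1.5) * samples_per_bit)
                byte |= samples[centre] << k
            chars.append(chr(byte))
            i += 10 * samples_per_bit  # skip start + 8 data + stop bits
        else:
            i += 1
    return "".join(chars)
```

With e.g. `decode_8n1(encode_8n1("$GPGGA", 16), 16)` the round trip should give back `$GPGGA`. The two details I am unsure about on the real trace are sampling mid-bit (edge + 1.5 bit periods for the first data bit) rather than at the bit boundaries, and re-synchronizing on every start-bit falling edge instead of free-running from the first one.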