I currently have a PIC16F18326 programmed as a slave device and an Arduino Leonardo as the master. Everything works fine if I print each received character on the Arduino, but the data becomes garbage if the prints are removed. Replacing the print with a 4 µs delay fixes the problem again. The PIC is running at 32 MHz and uses the SSP interrupt to process everything. I don't understand why these delays are necessary.
void __interrupt() inter(void){
    static char buffCount;
    char temp;

    if(SSP1IF && SSP1IE){
        if(buffCount >= 32){            // stop buffer overflowing
            temp = SSP1BUF;             // throw away
        }else{
            temp = SSP1BUF;             // read received byte
            SSP1BUF = buff[buffCount];  // load next byte to send back
            buff[buffCount++] = temp;   // store received byte
        }
        SSP1IF = 0;
    }

    if(IOCAF2){                         // A2 is Slave Select input
        IOCAF2 = 0;                     // interrupt only on falling edge
        buffCount = 0;
        dummy = SSP1BUF;                // prepare to receive data
        SSP1IF = 0;                     // clear pending interrupt
        SSP1IE = 1;                     // allow SSP to interrupt
    }
}
Note that RA2 (Slave Select) is set up to interrupt on rising edges only, so I can make sure the buffer index stays in sync. I first thought the problem might affect only the first byte, because of the RA2 interrupt, but it happens with all bytes. Also note that the PIC receives all data correctly with or without the delays.
I've sped up the PIC code by allowing the buffer to overflow (I made it bigger). I slowed the Arduino down to a 250 kHz SPI clock but still need the small delays. With the delays in place, I can speed the Arduino up to 2 MHz without any problem. It's as though it's using some kind of callback to fill the buffer. I guess I'll just use it as-is with the delays.