
SPIRIT1 CSMA understanding problem?

Question asked by karam.karam on Aug 10, 2015
Latest reply on Aug 25, 2015 by karam.karam
I am performing some tests on the CSMA feature, specifically on the non-persistent CCA mode.

I have a test process whereby, every 10 s, the transmitting module has its TX FIFO flushed, the FIFO is then filled with 20 bytes of data, and the TX command (0x60) is issued. The transmission uses the basic packet protocol with a fixed payload length of 20, GFSK1, 38.4 kbps data rate, in the 868 MHz band. I believe the other radio and protocol parameters are at least reasonably correct, since the receiver is receiving the packets just fine, although it is missing perhaps 10% of them at present (even at short range). That is something I still have to figure out, perhaps in terms of the quality acceptance filters, but for the moment I also thought I would try to test the CSMA feature on the transmitting unit.
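For reference, this is roughly the sequence I run every 10 s. spirit_cmd() and spirit_write_fifo() are just my own SPI helpers, and the FLUSHTXFIFO strobe value (0x72) is the one I read from my copy of the datasheet, so please double-check it:

    #include <stdint.h>

    #define CMD_FLUSH_TX_FIFO  0x72  /* flush TX FIFO strobe (value from my datasheet copy) */
    #define CMD_TX             0x60  /* TX command strobe */

    extern void spirit_cmd(uint8_t strobe);                       /* my SPI command helper */
    extern void spirit_write_fifo(const uint8_t *buf, uint8_t n); /* my FIFO write helper  */

    void send_test_packet(void)
    {
        static const uint8_t payload[20] = { 0 };   /* 20-byte fixed payload */

        spirit_cmd(CMD_FLUSH_TX_FIFO);              /* 1. flush the TX FIFO          */
        spirit_write_fifo(payload, sizeof payload); /* 2. load 20 bytes of test data */
        spirit_cmd(CMD_TX);                         /* 3. issue the TX command 0x60  */
    }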

So I have enabled non-persistent CSMA (register 0x51 = 0x05), set up Tcca and Tlisten values and the other CSMA parameters (registers 0x67 = 0x25 and 0x66 = 0x04), and set CS_mode to static RSSI sensing (register 0x21 = 0xE3).
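Concretely, the CSMA-related configuration I am writing is just this (spirit_write_reg() is my own SPI register-write helper, and the comments reflect my reading of the bit fields, so they may well be part of the problem):

    #include <stdint.h>

    extern void spirit_write_reg(uint8_t addr, uint8_t value);  /* my SPI register-write helper */

    void configure_csma(void)
    {
        spirit_write_reg(0x51, 0x05);  /* CSMA enabled, non-persistent mode          */
        spirit_write_reg(0x67, 0x25);  /* CSMA timing: Tcca and related parameters   */
        spirit_write_reg(0x66, 0x04);  /* CSMA timing: Tlisten and related parameters */
        spirit_write_reg(0x21, 0xE3);  /* CS_mode = static RSSI (carrier) sensing    */
        /* RSSI_TH is written separately; 0x21 and 0x30 are the two values I compare below. */
    }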

Now a strange thing seems to happen:

If I set RSSI_TH to some relatively low value like 0x21, the transmissions seem to be made OK (received at the receiver unit), and the interrupt information from the transmitter just after transmission shows that the RSSI threshold was exceeded and that TX data was sent. If I now increase RSSI_TH to something like 0x30 (which I consider to be significantly above the general noise floor I am seeing on this channel) and try the transmission process again, the transmitter appears to remain stuck in what I am guessing is carrier-sense mode. I say this because I observe no received packet on the receiver side, and when I query the MC_STATE registers [1],[0] on the transmitter unit they return 02, 67, which suggests it is sitting in RX mode. Also, if I read the interrupt flags, the first time around I can see some undocumented bits set, and the second time they are all 00 (which is to be expected if no activity is taking place, due to auto-clearing after the first read).
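This is the little debug dump I use to take those readings. The MC_STATE and IRQ_STATUS addresses (0xC0/0xC1 and 0xFA..0xFD) are from the datasheet revision I have, so please verify them, and spirit_read_reg() is my own SPI read helper:

    #include <stdint.h>
    #include <stdio.h>

    extern uint8_t spirit_read_reg(uint8_t addr);  /* my SPI register-read helper */

    void dump_radio_state(void)
    {
        uint8_t mc_state1 = spirit_read_reg(0xC0);  /* MC_STATE[1] */
        uint8_t mc_state0 = spirit_read_reg(0xC1);  /* MC_STATE[0] */
        printf("MC_STATE = %02X %02X\n", mc_state1, mc_state0);  /* I see 02 67 here */

        /* IRQ_STATUS is clear-on-read, so dump all four bytes in one go */
        for (uint8_t addr = 0xFA; addr <= 0xFD; addr++)
            printf("IRQ_STATUS[0x%02X] = %02X\n", addr, spirit_read_reg(addr));
    }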

So my question is: am I really understanding what the CSMA process is doing? Or is there yet another mistake I am making with some parameter which could account for these observations?

My understanding of CCA is that if you raise RSSI_TH you should get a higher likelihood of transmission, not a lower one, because it implies that only strong signals on the channel will be seen as "channel not clear". Whereas a very low RSSI_TH should pretty much result in failure to transmit at all times, since the channel would always be considered not clear due to the background noise being above RSSI_TH.
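In other words, the per-Tcca decision I expect the radio to make is roughly the following (this is my reading of the CCA description, not library code, and the dBm conversion RSSI_dBm = reg/2 - 130 is also my assumption from the datasheet):

    #include <stdint.h>
    #include <stdbool.h>

    /* My expectation: the channel is judged busy only when the measured RSSI
       reaches the threshold, so a HIGHER RSSI_TH should make a "clear" verdict
       (and hence a transmission) MORE likely, not less.                       */
    static bool channel_is_clear(uint8_t rssi_level, uint8_t rssi_th)
    {
        return rssi_level < rssi_th;
    }

    /* With the two thresholds I tried (assuming RSSI_dBm = reg/2 - 130):
       RSSI_TH = 0x21  ->  about -113.5 dBm, close to or below the noise floor
       RSSI_TH = 0x30  ->  about -106 dBm, which I expect to be above it       */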

Any suggestions?
