2026-03-26 10:01 AM
Hello,
I need to implement an SSI (Synchronous Serial Interface) to output a measured value. Since SSI is in principle just a bitstream clocked out by the serial clock, I wanted to use an SPI peripheral in slave mode, as the clock comes from the SSI master.
This works in principle, but from time to time I get a bit shift: the received data is delayed by one bit, which effectively divides the value by two.
Checking the SSI data clock, I didn't find the unstable clock with additional edges I had expected. Instead, from time to time the SSI master pauses after the first falling edge before it starts clocking. Could it be that this pause disturbs the SPI interface? I would have expected that the SPI in slave mode only reacts to the clock edges and that the time between the edges isn't critical.
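Since SSI has no select line, the only framing information available to the slave is the idle gap the master inserts between frames (the monoflop time). A minimal sketch of detecting that gap, assuming a free-running microsecond timer and a readable clock-edge flag; the names spi_clock_edge_seen(), timer_us() and SSI_IDLE_GAP_US are hypothetical placeholders, not a specific vendor API:

#include <stdbool.h>
#include <stdint.h>

#define SSI_IDLE_GAP_US  20u   /* assumed monoflop time; check your master's spec */

extern bool     spi_clock_edge_seen(void);  /* hypothetical: edge flag, clears on read */
extern uint32_t timer_us(void);             /* hypothetical: free-running microsecond timer */

/* Returns true once the clock has been idle long enough
 * to treat the silence as an inter-frame gap. */
bool ssi_frame_ended(void)
{
    static uint32_t last_edge_us;

    if (spi_clock_edge_seen())
        last_edge_us = timer_us();

    return (timer_us() - last_edge_us) > SSI_IDLE_GAP_US;
}

Note that a pause inside a frame, as described above, would also trip this detector if it exceeds the threshold, so the gap threshold has to be chosen larger than any intra-frame pause the master produces.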
2026-03-31 12:36 AM
Hello,
If a clock edge is received while the SPI is disabled, I wouldn't expect a bit shift at all.
The problem is that, since SSI does not frame the data packets with a select signal, the integrity of a data packet relies only on the clock. If the µC starts while the master is already clocking, or if a clock edge is missing, only part of the data is clocked out. If the microcontroller then only writes a new value when it detects the end of a transmission, I would expect the new data to be stored in the FIFO, because the data register was not clocked out completely. Wouldn't the controller then shift out the remaining data bits first and only reload the register from the FIFO at the next transfer?
My expectation then would be a permanent bit shift. To get rid of it, I could check the TXC bit at the end of the detected frame to verify that the data was clocked out completely, and only then disable the peripheral to reset the data registers; alternatively, I could skip the status check and always reset the SPI block.
Or is there another way to reset the transmit registers to make sure no leftover bits remain?
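To make the recovery idea concrete, a sketch of what the end-of-frame handling could look like, again with hypothetical names (spi_txc(), spi_disable(), spi_enable(), spi_write()) standing in for whatever the actual peripheral provides; it assumes the peripheral resets its shift register and TX FIFO when disabled, which many do but which has to be confirmed in the datasheet:

#include <stdbool.h>
#include <stdint.h>

extern bool spi_txc(void);              /* hypothetical: transmit-complete flag */
extern void spi_disable(void);          /* hypothetical: assumed to flush shifter and FIFO */
extern void spi_enable(void);
extern void spi_write(uint16_t value);  /* hypothetical: load the TX data register */

/* Call once per detected inter-frame gap. */
void ssi_load_next(uint16_t measured_value)
{
    if (!spi_txc()) {
        /* The last frame was cut short: leftover bits sit in the shift
         * register, so every following frame would come out shifted.
         * Disabling and re-enabling the block flushes them (assuming
         * the peripheral resets its transmitter on disable). */
        spi_disable();
        spi_enable();
    }
    spi_write(measured_value);  /* fresh value for the next SSI frame */
}

Always resetting the block, as suggested above, would simply drop the TXC check and perform the disable/enable unconditionally, at the cost of a short dead time after each frame.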