
Send 32bit data on SPI protocol

parisa
Senior

Hello,

To communicate with a sensor I need to clock out 32 bits. As you can see, with 8-bit data transmission an interruption (gap) occurs between the 4 bytes, which is unacceptable for my communication.

I wrote a software SPI (bit-bang) driven by a timer, but even for low-speed communication (200 kHz) it slows my MCU down, because the timer interrupt has to be triggered every 2 µs. What can I do to remove the small interruption in the hardware SPI by means of a software trick?

S.Ma
Principal

Do you mean that your sensor specifically needs a perfect square wave with a 2 µs period?

Usually, since the SCK clock signal is sent to the slave, it does not need to be "perfect".

Your code is being interrupted by other (critical?) interrupts such as SysTick, USB, etc.

If you block interrupts, the linked peripherals may malfunction when those interrupts fire too late.

If this is not a problem in your specific application, and depending on the STM32 used (you should at least give the family name: F1, L1, F4, L4, F7, etc.):

  1. Check the fastest SPI speed the sensor can accept.
  2. An interrupt will take 24+ cycles.
  3. Don't use a Timer if the frequency is above 1 MHz; use a SW delay. Say your core clock is 48 MHz: 1 µs will be about 48 NOPs.
  4. Then, if the complete SPI transaction is short enough (<500 µs?), you might disable interrupts by wrapping the code like this (a rough bit-bang sketch follows right after the snippet):
  uint32_t tmp;
  tmp = __get_PRIMASK();           /* save the current interrupt mask */
  __set_PRIMASK(tmp | 1);          /* set PRIMASK: block interrupts */
  MyCodeWithInterruptsBlocked();   /* the time-critical transfer */
  __set_PRIMASK(tmp);              /* restore the previous mask */
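
As a rough illustration only (not a definitive implementation), MyCodeWithInterruptsBlocked() could be a bit-bang transfer like the sketch below, assuming an STM32F1 with CMSIS headers, illustrative pin choices (PA5 = SCK, PA7 = MOSI, PA6 = MISO, already configured as GPIO) and an uncalibrated NOP half-bit delay (see the NOP caveat below):

#include "stm32f10x.h"   /* CMSIS device header, assuming an STM32F1 */

/* Illustrative pin choices: PA5 = SCK, PA7 = MOSI, PA6 = MISO. */
#define SCK_PIN   (1u << 5)
#define MOSI_PIN  (1u << 7)
#define MISO_PIN  (1u << 6)

static void half_bit_delay(void)
{
    /* Crude busy-wait; the count must be calibrated for the core clock
       in use - this is not cycle-exact. */
    for (volatile uint32_t i = 0; i < 24; i++) {
        __NOP();
    }
}

/* Clock out and read back 32 bits, MSB first (SPI mode 0). */
static uint32_t MyCodeWithInterruptsBlocked(uint32_t out)
{
    uint32_t in = 0;

    for (int bit = 31; bit >= 0; bit--) {
        if (out & (1u << bit))
            GPIOA->BSRR = MOSI_PIN;   /* set MOSI */
        else
            GPIOA->BRR  = MOSI_PIN;   /* clear MOSI */

        half_bit_delay();
        GPIOA->BSRR = SCK_PIN;        /* rising edge: slave samples */

        in <<= 1;
        if (GPIOA->IDR & MISO_PIN)
            in |= 1u;

        half_bit_delay();
        GPIOA->BRR = SCK_PIN;         /* falling edge */
    }
    return in;
}

Call it between the __set_PRIMASK() lines above, and calibrate the half-bit delay count against the real core clock before relying on the bit rate.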

Read ARM documentation - NOP is not a timing instruction.

https://www.pabigot.com/arm-cortex-m/the-effect-of-the-arm-cortex-m-nop-instruction/

And dual-issue Cortex-M7 can screw up such timings completely!

S.Ma
Principal

For typical single-core microcontrollers without cache, the SW delay loop has been used more than occasionally.

A one-time post-compilation calibration against the 1 ms SysTick will get rid of most side effects.
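
For example, one way to do such a one-shot calibration of a SW delay loop against SysTick (here used free-running as a cycle reference rather than the 1 ms tick, but the same idea) - a sketch assuming CMSIS register access and a valid SystemCoreClock; the iteration count is illustrative:

#include "stm32f10x.h"   /* CMSIS device header, assuming an STM32F1 */

static uint32_t cycles_per_loop = 4u;    /* rough guess until calibrated */

/* Measure how many core cycles one empty delay-loop iteration costs,
   using the SysTick down-counter (clocked from the core) as reference. */
static void calibrate_delay_loop(void)
{
    const uint32_t test_iterations = 10000u;
    uint32_t start, end, ticks;

    SysTick->LOAD = 0x00FFFFFFu;         /* maximum 24-bit reload */
    SysTick->VAL  = 0u;
    SysTick->CTRL = SysTick_CTRL_CLKSOURCE_Msk | SysTick_CTRL_ENABLE_Msk;

    start = SysTick->VAL;
    for (volatile uint32_t i = 0u; i < test_iterations; i++) { }
    end = SysTick->VAL;

    ticks = (start - end) & 0x00FFFFFFu; /* SysTick counts down */
    cycles_per_loop = (ticks + test_iterations / 2u) / test_iterations;
}

/* Approximate microsecond delay scaled by the calibrated loop cost. */
static void delay_us(uint32_t us)
{
    uint32_t loops = (SystemCoreClock / 1000000u) * us / cycles_per_loop;
    for (volatile uint32_t i = 0u; i < loops; i++) { }
}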

Actually, it might be fun to set all the peripheral prescalers by binary search against a precise SysTick reference... ask for 115200 bps and let the MCU find it out by streaming data with the GPIO disconnected. 😉

@parisa - the STM SPI implementation is quite weak, but it is possible to coerce it into DMA-only activity and certainly to avoid bit-banging. See:

https://community.st.com/s/question/0D50X00009q4MRT/stmf4-external-interrupt-dma-20bit-spi-xfer-completion-interrupt-to-isr?t=1567621155733

parisa
Senior

I really appreciate your responses.

Actually, I want to communicate with a 32-bit sensor, and my signal shape is acceptable. Today I managed to remove the little delay between bytes (I send 16 bits at a time using the 16-bit SPI data frame mode), as you can see in the attachment.

0690X00000AQswuQAD.png

And here is my sensor output, which is completely correct.

0690X00000AQswzQAD.png
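
(For reference, switching the SPL init to a 16-bit data frame looks roughly like this - a simplified sketch, not the exact init; the SPI mode and prescaler are placeholders for whatever the sensor needs, and GPIO/clock setup is omitted:)

SPI_InitTypeDef spi;

SPI_StructInit(&spi);
spi.SPI_Direction         = SPI_Direction_2Lines_FullDuplex;
spi.SPI_Mode              = SPI_Mode_Master;
spi.SPI_DataSize          = SPI_DataSize_16b;   /* one frame = 16 bits, no inter-byte gap */
spi.SPI_CPOL              = SPI_CPOL_Low;       /* adjust to the sensor's SPI mode */
spi.SPI_CPHA              = SPI_CPHA_1Edge;
spi.SPI_NSS               = SPI_NSS_Soft;
spi.SPI_BaudRatePrescaler = SPI_BaudRatePrescaler_64;
spi.SPI_FirstBit          = SPI_FirstBit_MSB;
SPI_Init(SPI1, &spi);
SPI_Cmd(SPI1, ENABLE);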

But I don't receive the right value from my sensor. I would like to know: does SPI_I2S_ReceiveData return the two bytes of each 16-bit transmission at once, or byte by byte?

unsigned long Data1;
char Flag;

void SPI1_IRQHandler(void)
{
    unsigned long Input;

    if (SPI_I2S_GetITStatus(SPI1, SPI_I2S_IT_RXNE) == SET)
    {
        /* Read the received 16-bit frame from the data register. */
        Input = SPI_I2S_ReceiveData(SPI1);

        if (Flag == 1) { Data1 = Input; }                                /* first half-word */
        if (Flag == 2) { Data1 = Data1 << 16; Data1 = Data1 | Input; }   /* combine into 32 bits */
        if (Flag < 1)  { SPI_I2S_ITConfig(SPI1, SPI_I2S_IT_TXE, ENABLE); }
    }

    if (SPI_I2S_GetITStatus(SPI1, SPI_I2S_IT_TXE) == SET)
    {
        /* Disarm TXE and queue a dummy frame to generate the clock. */
        SPI_I2S_ITConfig(SPI1, SPI_I2S_IT_TXE, DISABLE);
        SPI_I2S_SendData(SPI1, 0xffff);
        Flag = Flag + 1;
    }
}

S.Ma
Principal

Ah, so now it's no longer bit-bang SPI... which STM32 is used?

parisa
Senior

Thank you for your support,

I use the STM32F10x series. Is there any way to implement 34-bit (instead of 32-bit) transfers under DMA?

S.Ma
Principal

Probably too exotic to be implemented by SPIv1.

When using SPI master, use the transmit-receive functions. And if there is DMA, take the interrupt only on the RX one.
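
For example, on the F1 with the SPL, a 2 x 16-bit (= 32-bit) transfer can be run entirely under DMA with an interrupt only on the RX channel. A rough sketch only - SPI1 is assumed to be already configured as a 16-bit master, RCC and NVIC setup is omitted, and the buffer names are illustrative:

#include "stm32f10x.h"

/* Illustrative buffers: two 16-bit frames = one 32-bit transaction. */
static uint16_t tx_buf[2] = { 0xFFFF, 0xFFFF };
static uint16_t rx_buf[2];
static volatile uint8_t xfer_done;

/* SPI1 on the F1: RX = DMA1 Channel 2, TX = DMA1 Channel 3. */
static void spi1_start_dma_xfer(void)
{
    DMA_InitTypeDef dma;

    DMA_DeInit(DMA1_Channel2);
    DMA_DeInit(DMA1_Channel3);

    DMA_StructInit(&dma);
    dma.DMA_PeripheralBaseAddr = (uint32_t)&SPI1->DR;
    dma.DMA_PeripheralDataSize = DMA_PeripheralDataSize_HalfWord;
    dma.DMA_MemoryDataSize     = DMA_MemoryDataSize_HalfWord;
    dma.DMA_MemoryInc          = DMA_MemoryInc_Enable;
    dma.DMA_BufferSize         = 2;

    /* RX channel: peripheral -> memory, transfer-complete interrupt enabled. */
    dma.DMA_DIR            = DMA_DIR_PeripheralSRC;
    dma.DMA_MemoryBaseAddr = (uint32_t)rx_buf;
    DMA_Init(DMA1_Channel2, &dma);
    DMA_ITConfig(DMA1_Channel2, DMA_IT_TC, ENABLE);

    /* TX channel: memory -> peripheral, no interrupt needed. */
    dma.DMA_DIR            = DMA_DIR_PeripheralDST;
    dma.DMA_MemoryBaseAddr = (uint32_t)tx_buf;
    DMA_Init(DMA1_Channel3, &dma);

    xfer_done = 0;
    DMA_Cmd(DMA1_Channel2, ENABLE);   /* enable RX first */
    DMA_Cmd(DMA1_Channel3, ENABLE);
    SPI_I2S_DMACmd(SPI1, SPI_I2S_DMAReq_Rx | SPI_I2S_DMAReq_Tx, ENABLE);
}

/* Interrupt only on the RX channel: when RX completes, both frames are in. */
void DMA1_Channel2_IRQHandler(void)
{
    if (DMA_GetITStatus(DMA1_IT_TC2) == SET)
    {
        DMA_ClearITPendingBit(DMA1_IT_TC2);
        SPI_I2S_DMACmd(SPI1, SPI_I2S_DMAReq_Rx | SPI_I2S_DMAReq_Tx, DISABLE);
        DMA_Cmd(DMA1_Channel2, DISABLE);
        DMA_Cmd(DMA1_Channel3, DISABLE);
        xfer_done = 1;   /* 32-bit result: ((uint32_t)rx_buf[0] << 16) | rx_buf[1] */
    }
}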

Hello, could you please tell us how you removed the delay between bytes?