2025-12-02 8:06 AM
Hi,
I am using an STM32G474RE and am trying to read out an LTC2440 over the SPI interface. I have configured the LTC2440 as the SPI master (its datasheet calls this mode of operation "Internal SCK, 2-Wire I/O, Continuous Conversion"). The LTC2440 provides a BUSY signal which indicates the state of the conversion, and it outputs 4 bytes of data on its own (a burst approx. 880 times per second).
I have configured the SPI interface of my STM32G4 as a half-duplex slave to read these bursts.
static void MX_SPI3_Init(void)
{
  hspi3.Instance = SPI3;
  hspi3.Init.Mode = SPI_MODE_SLAVE;
  hspi3.Init.Direction = SPI_DIRECTION_1LINE;
  hspi3.Init.DataSize = SPI_DATASIZE_8BIT;
  hspi3.Init.CLKPolarity = SPI_POLARITY_HIGH;
  hspi3.Init.CLKPhase = SPI_PHASE_1EDGE;
  hspi3.Init.NSS = SPI_NSS_SOFT;
  hspi3.Init.FirstBit = SPI_FIRSTBIT_MSB;
  hspi3.Init.TIMode = SPI_TIMODE_DISABLE;
  hspi3.Init.CRCCalculation = SPI_CRCCALCULATION_DISABLE;
  hspi3.Init.CRCPolynomial = 7;
  hspi3.Init.CRCLength = SPI_CRC_LENGTH_DATASIZE;
  hspi3.Init.NSSPMode = SPI_NSS_PULSE_DISABLE;
  if (HAL_SPI_Init(&hspi3) != HAL_OK)
  {
    Error_Handler();
  }
}
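The DMA channel itself is linked to SPI3_RX in the CubeMX-generated HAL_SPI_MspInit(). Trimmed to the DMA part (and written with the handle name spi_dma_rx_handle that I use further below), it looks roughly like this; the individual settings are the CubeMX defaults:
void HAL_SPI_MspInit(SPI_HandleTypeDef *spiHandle)
{
  if (spiHandle->Instance == SPI3)
  {
    /* Clock and GPIO setup omitted here; only the RX DMA channel is shown. */
    spi_dma_rx_handle.Instance                 = DMA1_Channel3;
    spi_dma_rx_handle.Init.Request             = DMA_REQUEST_SPI3_RX;   /* DMAMUX request */
    spi_dma_rx_handle.Init.Direction           = DMA_PERIPH_TO_MEMORY;
    spi_dma_rx_handle.Init.PeriphInc           = DMA_PINC_DISABLE;
    spi_dma_rx_handle.Init.MemInc              = DMA_MINC_ENABLE;
    spi_dma_rx_handle.Init.PeriphDataAlignment = DMA_PDATAALIGN_BYTE;
    spi_dma_rx_handle.Init.MemDataAlignment    = DMA_MDATAALIGN_BYTE;
    spi_dma_rx_handle.Init.Mode                = DMA_NORMAL;
    spi_dma_rx_handle.Init.Priority            = DMA_PRIORITY_LOW;
    if (HAL_DMA_Init(&spi_dma_rx_handle) != HAL_OK)
    {
      Error_Handler();
    }
    __HAL_LINKDMA(spiHandle, hdmarx, spi_dma_rx_handle);
  }
}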
static void MX_DMA_Init(void)
{
  __HAL_RCC_DMAMUX1_CLK_ENABLE();
  __HAL_RCC_DMA1_CLK_ENABLE();
  HAL_NVIC_SetPriority(DMA1_Channel3_IRQn, 0, 0);
  HAL_NVIC_EnableIRQ(DMA1_Channel3_IRQn);
}

I start the SPI data reception in a callback on the rising edge of the BUSY signal, and in the same place I disable the "half-complete" interrupt. TIMDBG sets a debug line to HIGH level, which we will see on the oscilloscope later. LTC2440_FRAMESIZE is defined as 4.
void HAL_GPIO_EXTI_Callback(uint16_t GPIO_Pin)
{
  TIMDBG_START;
  HAL_SPI_Receive_DMA(&spi_handle, spi_data, LTC2440_FRAMESIZE);
  __HAL_DMA_DISABLE_IT(&spi_dma_rx_handle, DMA_IT_HT);
}

For now I have not implemented anything there other than a macro that shows the transfer has completed.
void HAL_SPI_RxCpltCallback(SPI_HandleTypeDef *hspi)
{
  TIMDBG_STOP;
}
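(Eventually the four received bytes will just be packed MSB-first into the 32-bit output word described in the LTC2440 data sheet, along the lines of the sketch below, but that part is not implemented yet and is not the problem here.)
/* Sketch: pack a received 4-byte frame (MSB first) into the LTC2440's 32-bit output word. */
static uint32_t ltc2440_pack(const uint8_t frame[4])
{
  return ((uint32_t)frame[0] << 24) |
         ((uint32_t)frame[1] << 16) |
         ((uint32_t)frame[2] << 8)  |
          (uint32_t)frame[3];
  /* The status bits and the conversion result are then extracted per the data sheet. */
}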
So what I would expect is that the MCU waits for the rising edge of the BUSY signal, then waits for the data burst from the ADC, and after 4 bytes have been received I get a callback. However, I get this callback earlier than that.
I can observe the following with my scope:
The first picture shows the clock signal of the burst and the BUSY signal. As expected, BUSY is pulled LOW before the data burst and pushed HIGH after.
The second picture shows the clock signal of the burst and the TIMDBG signal. As expected, the TIMDBG signal is pushed HIGH right after the data burst (caused by the rising edge of the BUSY signal). However, TIMDBG does not stay high for the entire data frame; it goes LOW again after a few bits. This means my receive-complete callback is called before the complete frame has been received, so I have to assume that my data is corrupted. Why is that? Do you have any troubleshooting tips for finding the reason for this early callback? Are there some other interrupts I have to disable?
P.S.: One might think the observed behaviour results from the short time between the end of the data burst and the rising edge of the BUSY signal. I already tried to start the reception not on the rising edge of the BUSY signal but from a timer or some other resource. Unfortunately, this had no effect on when my callback routine is called.
2025-12-02 9:10 AM
We're lacking information on what TIMDBG_START and TIMDBG_STOP are doing. Which one sets the debug line high and which one sets it low?
It looks like TIMDBG_START happens due to the EXTI, but we are also missing information on how this EXTI is configured and what is triggering it.
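In the meantime, one quick check: capture the return value of HAL_SPI_Receive_DMA and snapshot the SPI error code and the received bytes when the receive-complete callback fires. An untested sketch, reusing the names from your snippets:
#include <string.h>

volatile HAL_StatusTypeDef dbg_start_status;             /* HAL_BUSY => previous reception still running */
volatile uint32_t          dbg_spi_error;                /* e.g. HAL_SPI_ERROR_OVR */
volatile uint8_t           dbg_frame[LTC2440_FRAMESIZE]; /* what was actually clocked in */

void HAL_GPIO_EXTI_Callback(uint16_t GPIO_Pin)
{
  TIMDBG_START;
  dbg_start_status = HAL_SPI_Receive_DMA(&spi_handle, spi_data, LTC2440_FRAMESIZE);
  __HAL_DMA_DISABLE_IT(&spi_dma_rx_handle, DMA_IT_HT);
}

void HAL_SPI_RxCpltCallback(SPI_HandleTypeDef *hspi)
{
  TIMDBG_STOP;
  dbg_spi_error = hspi->ErrorCode;
  memcpy((void *)dbg_frame, spi_data, LTC2440_FRAMESIZE);
}
HAL_BUSY from the start call would mean the previous reception was still running when the EXTI fired, and inspecting dbg_frame in the debugger shows whether the early frames contain plausible data.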