2016-03-22 12:10 PM
I'd like to use an STM32L4 in a project where I am interfacing with a 1-MSPS, 16-bit SPI ADC (LTC2378-16). Obviously the SPI clock has to be > 16 MHz to run this at full speed. However, for best performance the clocking out of each sample should be confined to roughly half of the overall sample period, so that SCK isn't toggling while a conversion is in progress. Thus, I really need to run SPI close to the theoretical maximum of 40 MHz.
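To put numbers on it (my own arithmetic, not from either datasheet):

    16 bits x 1 MSPS = 16 Mbit/s, so SCK must exceed 16 MHz just to keep up
    16 bits in half of the 1 us sample period: 16 / 500 ns = 32 MHz minimum

And since the SPI baud-rate divider only offers PCLK/2, PCLK/4, PCLK/8, etc. from the 80 MHz APB clock, PCLK/2 = 40 MHz is the only available setting above 32 MHz.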
The reference manual doesn't say much about clock speeds. The master-mode clock control has a setting for PCLK/2, which would allow 40 MHz with the max APBx clock of 80 MHz. But the datasheet paints a more complex picture: it lists a top speed of 40 MHz for ''master transmitter'' or ''slave receiver.'' For ''master receive / full duplex'' it says the max is 13 or 24 MHz depending on supply voltage. Does anyone know why the max reception speed would be different in master vs. slave modes? I initially thought receive might be slower due to synchronizers on the input pins, but that doesn't explain why master receive and slave receive would be different.

If I have to, I could probably figure out a complicated timer setup to generate exactly 16 clocks after every ADC READY indication, pipe that clock into both the ADC and SPI_CK, and run the SPI peripheral in slave mode. I just don't see why that should allow me almost 2x the speed vs. having the SPI peripheral generate the clock itself in master mode. I mean, everything is basically the same in those two scenarios, except where the clock gets generated. Anyone have more insight? Is the datasheet really correct?
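For concreteness, this is roughly the master-mode setup I mean: a minimal sketch with CMSIS register names, untested, assuming SPI1 on an 80 MHz APB2, with the RCC clock enable and GPIO alternate-function setup omitted:

#include "stm32l4xx.h"                      /* CMSIS device header */

static void spi1_master_40mhz(void)
{
    SPI1->CR1 = SPI_CR1_MSTR                /* master mode */
              | SPI_CR1_SSM | SPI_CR1_SSI;  /* software NSS, held high */
                                            /* BR[2:0] = 000 -> fPCLK/2 = 40 MHz */
    SPI1->CR2 = SPI_CR2_DS_0 | SPI_CR2_DS_1
              | SPI_CR2_DS_2 | SPI_CR2_DS_3;/* DS = 1111 -> 16-bit frames */
    SPI1->CR1 |= SPI_CR1_SPE;               /* enable the peripheral */
}

That is, BR is simply left at its reset value of 000 to get the divide-by-2.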
2016-03-23 08:33 AM
So the receive path through the MOSI pin (in slave mode) and through the MISO pin (in master mode) may differ inside the peripheral... could be, I suppose. It would be useful to hear from anyone who has pushed the chip to its limits and can say for certain.
My application requires the DFSDM, so I think I am stuck with the STM32L4. It's a really nice part too; shame about the SPI limitations. I guess I will plan on running SPI in slave mode, using a timer as the master clock. Does anyone know if this will work? That is: use one or more timers to generate a burst of 16 clock cycles on an output pin every time a falling edge appears on a trigger input pin. (The trigger comes from the ADC data-ready signal.) Then both the ADC and the SPI peripheral run in slave mode, and the timer takes the role of ''master'' in that it generates the SPI clock and paces the transfers. If this works it will actually make the DMA configuration easier. A rough sketch of what I'm picturing is below.
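Completely untested, CMSIS register names, pin and alternate-function setup omitted. I'm assuming here that the ADC's data-ready line is routed to TIM1_CH2 (TI2) and that TIM1_CH1 drives SCK of both the ADC and the SPI peripheral; TIM1 is picked because the advanced timers have a repetition counter:

#include "stm32l4xx.h"                      /* CMSIS device header */

static void tim1_burst16(void)
{
    TIM1->ARR   = 1;                        /* 2-tick period: 40 MHz from an 80 MHz timer clock */
    TIM1->CCR1  = 1;                        /* 50% duty on CH1 */
    TIM1->RCR   = 15;                       /* 16 periods per update event */
    TIM1->CCMR1 = (6u << TIM_CCMR1_OC1M_Pos)/* CH1: PWM mode 1 */
                | TIM_CCMR1_CC2S_0;         /* CH2: input, mapped to TI2 */
    TIM1->CCER  = TIM_CCER_CC1E             /* CH1 output enabled */
                | TIM_CCER_CC2P;            /* TI2 sensitive to falling edge */
    TIM1->SMCR  = TIM_SMCR_TS_2 | TIM_SMCR_TS_1    /* trigger source = TI2FP2 */
                | TIM_SMCR_SMS_2 | TIM_SMCR_SMS_1; /* slave mode = trigger (start on edge) */
    TIM1->CR1   = TIM_CR1_OPM;              /* one-pulse: stop at the update event */
    TIM1->BDTR  = TIM_BDTR_MOE;             /* advanced timer: main output enable */
}

The trick is one-pulse mode combined with RCR = 15: each falling edge on TI2 starts the counter, CH1 puts out exactly 16 PWM periods, and the update event (which only fires once the repetition counter runs out) stops the counter again until the next trigger. I'd still put a scope on it to check the latency from the ready edge to the first SCK edge, since the trigger input goes through a couple of timer-clock cycles of synchronization.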