2012-01-19 07:54 PM
I've seen figures quoting speeds of over 7.2 MSample/sec when using triple-ADC interleaved mode, a 36 MHz ADC clock, and DMA. However, at those speeds (by my calculations) the effect of RADC on the input sample circuit means you lose about 7 bits of accuracy. Stretching the sample period from 3 up to 6 clock cycles still only gives 10-bit resolution. Is that right?
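Here's the back-of-envelope calculation I'm using (a sketch, assuming the datasheet's worst-case RADC = 6 kOhm and CADC = 4 pF; the 6-cycle case is hypothetical, the hardware only offers 3 or 15):

/* Settling error of the sample-and-hold vs. sampling time.
 * Assumes worst-case RADC = 6 kOhm, CADC = 4 pF, 36 MHz ADC clock. */
#include <math.h>
#include <stdio.h>

int main(void)
{
    const double f_adc = 36e6;           /* ADC clock */
    const double rc    = 6000.0 * 4e-12; /* RADC * CADC = 24 ns */

    for (int cycles = 3; cycles <= 15; cycles += 3) {
        double t   = cycles / f_adc;        /* sampling window */
        double err = 4095.0 * exp(-t / rc); /* worst-case error in LSBs */
        printf("%2d cycles: error = %8.4f LSB (~%.1f bits lost)\n",
               cycles, err, err > 1.0 ? log2(err) : 0.0);
    }
    return 0;
}

At 3 cycles that prints ~127 LSB (7 bits lost); at 6 cycles ~4 LSB, i.e. 10-bit resolution.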
My target is a Software Defined Radio (SDR), and the higher the sample rate the better. Has anyone got a circuit where a fast ADC simply pumps data into a GPIO port? Considering the ADC limitations, should I be looking at another controller altogether? #stm32f4-adc-speed-resolution #adc-stm32f4-resolution-speed
2012-01-20 05:17 AM
Hello,
I don't use the ADC in triple interleaved mode myself. You can only use the following sampling times (RM0090, page 238):

000: 3 cycles
001: 15 cycles
010: 28 cycles
...

When you choose 15 cycles you will get the full resolution; see equation 1 in the STM32F407xx datasheet. On the other hand, your sampling rate will drop to <= 4 MHz: 3 * 1/((15+12) * 1/36 MHz).
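In the Standard Peripheral Library the sampling time is set per channel; a sketch (untested, channel and rank are just placeholders for your real setup):

#include "stm32f4xx_adc.h"

/* Untested sketch: 15-cycle sampling time on all three ADCs.
 * ADC_Channel_1 / rank 1 are placeholders. */
void set_sampling_time(void)
{
    ADC_RegularChannelConfig(ADC1, ADC_Channel_1, 1, ADC_SampleTime_15Cycles);
    ADC_RegularChannelConfig(ADC2, ADC_Channel_1, 1, ADC_SampleTime_15Cycles);
    ADC_RegularChannelConfig(ADC3, ADC_Channel_1, 1, ADC_SampleTime_15Cycles);
}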
2012-01-20 04:13 PM
What is the highest signal frequency to convert?
Cheers, Hal
2012-01-20 04:43 PM
Aarg, I posted a reply and the system lost it. Here goes again:
Thanks for the reference. By interleaving 3 ADCs, it's claimed you get triple the sample rate. For 15 cycles (3 sample, 12 convert) you offset the second ADC by 5 clocks, and the third by 10 clocks. One sample every 5 clocks effectively gives you 7.2 MSample/sec. Which is adequate, until I try designing the input source.

From the datasheet, the worst-case RADC is 6000 ohms. Using 12-bit resolution, a 36 MHz clock, and CADC = 4 pF, the input impedance needs to be less than: NEGATIVE 4210 ohms...

Put another way: sample a 1.5 MHz signal. It goes through 360 degrees every 24 clocks. During the (12-clock) ADC conversion, the input goes through 180 degrees, swinging from zero to max. What happens during the next sample? The error is 4095 * exp(-t/(RC)). At 3 clocks, the sample circuit hasn't had time to settle; its error margin is about 127 LSB, or 7 bits (as per my first entry). I was hoping to sample a signal at 1.8 MHz (1/4 of 7.2 MHz), but this doesn't seem practical.

--- EDIT ---

In the DSP Library, under triple ADC conversion, I found an Excel spreadsheet. I added a page for error calculations - attached. According to that, I can stay within 0.5 LSB of error IF my input signal varies by no more than 15 LSB between samples, i.e. 4-bit resolution in a 12-bit ADC.
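The spreadsheet math boils down to this (a sketch, same assumed worst-case RADC and CADC as before):

/* Largest step the input may make between samples so the unsettled
 * remainder stays under 0.5 LSB, at the 3-cycle sampling time. */
#include <math.h>
#include <stdio.h>

int main(void)
{
    const double rc  = 6000.0 * 4e-12; /* RADC * CADC = 24 ns */
    double t         = 3.0 / 36e6;     /* 3-cycle sample window */
    double unsettled = exp(-t / rc);   /* fraction left unsettled, ~0.031 */
    double max_step  = 0.5 / unsettled;
    printf("max step per sample: %.1f LSB\n", max_step); /* ~16 LSB */
    return 0;
}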
2012-01-22 11:12 AM
This looks like a big challenge for you. Two clarifications:
1. The 6000 ohm max impedance is at 1.8 V reference; the max capacitance is TBD. If you are operating at 3.3 V the max resistance will probably be less. Since this seems to be critical for you, ask your ST rep if they have statistical data on impedance and capacitance, so you can see if it is worthwhile to toss components outside your critical range.

2. After the sample period, the value is held by an internal capacitor during conversion, so you don't need to worry about the signal changing during conversion. The next ADC in the sequence then starts its sample period after the programmed delay from the previous ADC's sample start.

I am wondering if the program is going to choke on processing this high-speed data stream. If this is IF data, consider a redesigned front end with a lower IF. Cheers, Hal
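PS: if you do keep the fast stream, the capture side need not involve the CPU at all. An untested sketch using Standard Peripheral Library names (the triple-interleaved ADC_CommonInit with ADC_TripleMode_Interl and ADC_DMAAccessMode_2 is assumed done elsewhere):

#include "stm32f4xx.h"

#define BUF_LEN 512
static uint32_t adc_buf[BUF_LEN];  /* DMA mode 2 packs two half-word
                                      results per 32-bit word (RM0090) */

void triple_adc_dma_sketch(void)
{
    DMA_InitTypeDef dma;

    RCC_AHB1PeriphClockCmd(RCC_AHB1Periph_DMA2, ENABLE);

    /* DMA2 Stream0 / Channel 0 serves the multi-ADC common data register */
    DMA_StructInit(&dma);
    dma.DMA_Channel            = DMA_Channel_0;
    dma.DMA_PeripheralBaseAddr = (uint32_t)&ADC->CDR;
    dma.DMA_Memory0BaseAddr    = (uint32_t)adc_buf;
    dma.DMA_DIR                = DMA_DIR_PeripheralToMemory;
    dma.DMA_BufferSize         = BUF_LEN;
    dma.DMA_MemoryInc          = DMA_MemoryInc_Enable;
    dma.DMA_PeripheralDataSize = DMA_PeripheralDataSize_Word;
    dma.DMA_MemoryDataSize     = DMA_MemoryDataSize_Word;
    dma.DMA_Mode               = DMA_Mode_Circular;
    dma.DMA_Priority           = DMA_Priority_High;
    DMA_Init(DMA2_Stream0, &dma);
    DMA_Cmd(DMA2_Stream0, ENABLE);

    /* keep DMA requests coming after the last transfer in circular mode */
    ADC_MultiModeDMARequestAfterLastTransferCmd(ENABLE);
}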
2012-01-22 02:26 PM
Thanks Hal. I can only hope the ''probably less'' is true.
My original plan was to sample a 1.8 MHz signal at 7.2 MS/sec. To reduce workload, this was to be downsampled (decimated) by 8, producing a (theoretical) 900 kS/sec, 15-bit resolution signal to go via high-speed USB to a PC. The first flaw in my plan appeared when I found out the Discovery board doesn't support High Speed (480 Mbps), only Full Speed (12 Mbps).

The second hurdle is the ADC. As you suggested, I'm now trying for a lower IF. The problem is, I keep coming up against that *blessed* RADC. With only the datasheet to go by, I HAVE to assume 6000 ohms - which limits the input signals to under... six KILOHERTZ!

I suppose the first circuit I could design would feed a ten-kilohertz sine wave to the ADC and measure the distortion. Then I could move on to connecting a faster ADC to a GPIO port - or write the STM32F4 off as inappropriate to my needs.
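For the record, the decimation step itself is trivial - something like this sketch. Summing 8 samples of 12 bits is what widens the word to 15 bits (8 * 4095 = 32760), though the real SNR gain from averaging is only about 1.5 bits, and only if there's enough noise to dither:

#include <stdint.h>
#include <stddef.h>

/* Boxcar decimate-by-8: sum blocks of 8 ADC samples.
 * The sum still fits a uint16_t as a 15-bit value (0..32760). */
static void decimate8(const uint16_t *in, uint16_t *out, size_t n_in)
{
    for (size_t i = 0; i + 8 <= n_in; i += 8) {
        uint32_t acc = 0;
        for (int k = 0; k < 8; k++)
            acc += in[i + k];
        out[i / 8] = (uint16_t)acc;  /* 15 significant bits */
    }
}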
2012-01-23 12:05 AM
If accuracy is important for you, I think you have to choose the 15-cycle sampling time.
This gives you an error of 4095 * exp(-t/(RC)) -> 4095 * exp((-15/36 MHz) / (6000 * 4e-12)) = 0.0001. So in this case RADC isn't an issue!

See page 225 of RM0090 for timing. They state that the delay between conversions is (sampling cycles + 2) * 1/fclock. Ideally you could sample at 1/Tdelay = fclock / (sampling cycles + 2) = 36 MHz / 17 = 2.1 MHz. (My first post wasn't correct, though.)
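Plugging the other sampling-time options into that delay formula gives (a quick sketch, assuming the same 36 MHz ADC clock):

#include <stdio.h>

/* Triple-interleaved aggregate rate per sampling-time option:
 * one result every (sampling cycles + 2) ADC clocks, per RM0090. */
int main(void)
{
    const double f_adc = 36e6;
    const int smp[] = { 3, 15, 28, 56, 84, 112, 144, 480 };
    for (int i = 0; i < 8; i++)
        printf("%3d cycles -> %.2f MS/s\n", smp[i], f_adc / (smp[i] + 2) / 1e6);
    return 0;
}

Note the 3-cycle row gives 36/5 = 7.20 MS/s, which is exactly the figure from the first post; 15 cycles gives the 2.1 MS/s above.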
Baird's second clarification is really important. You would have to try a lower IF in order to do correct sampling (-> Nyquist-Shannon). You could also try non-baseband sampling: http://en.wikipedia.org/wiki/Nyquist%E2%80%93Shannon_sampling_theorem#Sampling_of_non-baseband_signals
Best, Raffael
2012-01-23 08:36 AM
''You would have to try a lower IF in order to do correct sampling. -> Nyquist-Shannon.''
Or be comfortable with the bandpass-filtered input being above the Nyquist frequency and being folded (i.e. negative frequency, rotating backward).
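In numbers (a sketch; the 2.1 MS/s figure above is assumed as the sample rate):

#include <math.h>
#include <stdio.h>

/* Where a bandpass signal at f_in lands after sampling at fs.
 * Folds f_in into [0, fs/2]; even Nyquist zones (like fs/2..fs)
 * come out spectrally reversed, i.e. rotating backward. */
static double alias(double f_in, double fs)
{
    double f = fmod(f_in, fs);
    return (f > fs / 2.0) ? fs - f : f;
}

int main(void)
{
    double fs = 2.1e6;
    printf("1.8 MHz input aliases to %.0f kHz\n", alias(1.8e6, fs) / 1e3);
    return 0;
}

So the original 1.8 MHz IF sampled at 2.1 MS/s would land at 300 kHz, frequency-reversed - usable, as long as the bandpass filter keeps everything else out of that zone.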