2014-10-09 03:20 AM
2014-10-09 04:09 AM
void DMA2_Stream0_IRQHandler(void)
{
    /* Test on DMA Stream Transfer Complete interrupt */
    if (DMA_GetITStatus(DMA2_Stream0, DMA_IT_TCIF0))
    {
        /* Clear DMA Stream Transfer Complete interrupt pending bit */
        DMA_ClearITPendingBit(DMA2_Stream0, DMA_IT_TCIF0);
        DACvalues[0] = ADCConvertedValues[0] / 2;
        DAC_SoftwareTriggerCmd(DAC_Channel_2, ENABLE);
    }
}
Instead of enabling the software trigger, you just need to write your DAC value into the DHR register.
To cite from the manual:
''Data stored in the DAC_DHRx register are automatically transferred to the DAC_DORx register after one APB1 clock cycle, if no hardware trigger is selected (TENx bit in DAC_CR register is reset).''
2014-10-09 04:32 AM
So you mean,

if (DMA_GetITStatus(DMA2_Stream0, DMA_IT_TCIF0))
{
    /* Clear DMA Stream Transfer Complete interrupt pending bit */
    DMA_ClearITPendingBit(DMA2_Stream0, DMA_IT_TCIF0);
    DACvalues[0] = ADCConvertedValues[0] / 2;
    DAC_SetChannel2Data(DAC_Align_12b_R, DACvalues[0]);
}

But what happens with my 2nd DMA stream? Is it useless now?
Edit: By the way, it's not working; DACC2DOR doesn't get updated.
2014-10-09 05:44 AM
And set the SW trigger afterwards.

#define SWTRIGR_CH1_Set ((uint32_t)0x00000001)
#define SWTRIGR_CH2_Set ((uint32_t)0x00000002)
#define DHR12R1_Offset  ((uint32_t)0x00000008) /* DAC_DHR12R1 offset in the DAC register block */

uint32_t *pDAC_DataOut = (uint32_t *)(DAC_BASE + DHR12R1_Offset);
*pDAC_DataOut = uhDAC1Buffer;
DAC->SWTRIGR |= SWTRIGR_CH1_Set; /* set trigger; is automatically reset by hardware */
It works for me that way.
2014-10-09 05:48 AM
Thanks, the bit setting was the problem.
When I step through my code with the debugger, all registers have the right values, but I don't see anything at my DAC pin...

2014-10-09 06:37 AM
But I don't see anything at my DAC Pin...
In unbuffered mode, the DAC has an output impedance of 1 MOhm. You might easily pull the output down with your measuring instrument. In buffered mode, you get 15kOhm, but have to live with up to 200mV offset on both rails. For real applications, you will need an external buffer amplifier. This internal DAC is technically a compromise, but still better than having none.
2014-10-09 06:58 AM
Thank you for that information (it may be useful in the future), but that was not the problem. Somehow the DAC_Cmd call was commented out...
Unfortunately, the software triggering didn't solve my timing problem. As you can see in the attached file (green = input signal, yellow = DAC signal), the rising edges of the yellow signal don't appear at the same moments... The question is: how can I achieve this? Is it possible to put a timer interrupt into the DMA interrupt? Would this solve my problem?

Thanks in advance!

2014-10-09 07:28 AM
Unfortunately, the software triggering didn't solve my timing problem. As you can see in the attached file (green = input signal, yellow = DAC signal), the rising edges of the yellow signal don't appear at the same moments...
The DAC has a settling time of about 4 microseconds (also depending on step size - look it up in the datasheet). I'm not sure what you want to achieve with re-sampling the DAC output (if I got that right). But in general, DACs are not the best solution to generate binary signals.
2014-10-10 03:20 AM
It's not about the settling time; it's the unpredictable time variance until the next rising edge of my DAC output signal occurs. Sometimes the latency between the rising edge of my input signal and my DAC output signal is 3 us, sometimes it's less, sometimes more... and I want to control this behavior.
Maybe I should explain my project. I want to sample signals generated with a function generator, where fmax is 50 kHz, process the signal (FIR, IIR filters, ...) and put it back out with the DAC. Usually the signal will be a sine wave, but at first I wanted to try it with a square wave (as my ''worst case''). When my input signal is below 10 kHz, the time variance of my output signal is no problem; it's still there, but the period of my signal is much bigger than the error. Is it possible to put a timer interrupt in my DMA interrupt routine?

2014-10-10 05:14 AM
It's not about the settling time; it's the unpredictable time variance until the next rising edge of my DAC output signal occurs. Sometimes the latency between the rising edge of my input signal and my DAC output signal is 3 us, sometimes it's less, sometimes more... and I want to control this behavior.
At first glance, I would assume a problem with interrupt priorities here, but I can see only one interrupt in the provided source. Did you try to measure the interrupt frequency and runtime jitter without the DAC? I would toggle a GPIO pin to visualize this.

Some comments on your ADC code: First, for just one sample of one channel, you don't need DMA; you can use the ADC EOC interrupt instead. Second, a sampling time of 3 cycles is rather short. This requires good signal conditioning circuitry (fast, low impedance); otherwise, you get signal distortions.

''Maybe I should explain my project. I want to sample signals generated with a function generator, where fmax is 50 kHz, process the signal (FIR, IIR filters, ...) and put it back out with the DAC. Usually the signal will be a sine wave, but at first I wanted to try it with a square wave (as my worst case).''

IMHO this requires the DAC to be triggered via a timer. The filter calculation would introduce an incalculable jitter; you only need to guarantee the cycle time. In theory, this should be possible with one timer, since e.g. TIM2 is listed as a possible trigger for both units. I have never tried this, however.