
Starting to do some DSP with STM32F7Disco

Question asked by Carlos Lopez on Feb 1, 2018

Hi everyone:

 

I am trying to build an audio effects processor on the STM32F7 Discovery board. I am relatively new to this, as I have never used a microcontroller before, and I am having a bit of trouble doing what I'd like. So far I have managed to build a simple GUI that represents the buffer, lets you adjust the volume, and lets you select the different effects that I am trying to implement. Everything is good up to here.

The problem comes when I try to implement a delay line, as I am not really sure how I should manage this. I am working from the DISCO_AUDIO_DEMO program, which plays audio live as it comes into the microphones, streaming it to the line output. It works by creating two audio buffers (one for input, one for output) of the block size you define (which comes set to 512). If I set the buffer to a bigger number like 44100 or 22050, the sound already comes out delayed without any kind of algorithm, but the feedback is so strong that the sound gets very bad after a few seconds.
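For reference, this is my understanding of how the demo streams audio, as a minimal sketch. The callback names are my own placeholders for the BSP DMA half-transfer/transfer-complete callbacks, not the demo's exact code:

```c
#include <string.h>
#include <stdint.h>
#include <stddef.h>

#define AUDIO_BLOCK_SIZE 512   /* the demo's default block size */

static int16_t audio_in[AUDIO_BLOCK_SIZE];   /* filled by the input DMA   */
static int16_t audio_out[AUDIO_BLOCK_SIZE];  /* drained by the output DMA */

/* Processing for one half of the ping-pong buffer.  The demo just
 * copies input to output; this is where an effect would be inserted. */
static void process_half(const int16_t *in, int16_t *out, size_t n)
{
    memcpy(out, in, n * sizeof *in);
}

/* The DMA fires one callback when the first half of the buffer is ready
 * and another when the second half is, so each half can be processed
 * while the other is still streaming.  These names are placeholders. */
void on_half_transfer(void)
{
    process_half(audio_in, audio_out, AUDIO_BLOCK_SIZE / 2);
}

void on_transfer_complete(void)
{
    process_half(audio_in  + AUDIO_BLOCK_SIZE / 2,
                 audio_out + AUDIO_BLOCK_SIZE / 2,
                 AUDIO_BLOCK_SIZE / 2);
}
```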

 

So I was trying to write an algorithm for a delay line, so that the sound the microphones pick up and stream to the line output gets delayed a little. Reading this: Time-Varying Delay Effects | Physical Audio Signal Processing, I have found an interesting algorithm, but I don't really understand how it works, how I should implement it, or how its variables map onto the ones I have:

 

I understand that the read pointer has to be M samples behind the write pointer, and that A is the buffer I'm working with. What I don't understand is the while loop it does (why does the pointer have to be smaller than A, and why does it then add the whole length of the buffer?), and in the delay-line function I don't really get what x and y represent. I am guessing they are the input and output, but it does not really make sense to me. Also, shouldn't it be in a for loop so it happens for every sample in the array?
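To make my question concrete, this is how I currently picture the per-sample function being driven, with the for loop over each audio block living in the caller (the names here are made up, just a sketch):

```c
#include <stdint.h>
#include <stddef.h>

#define BLOCK_SIZE 512   /* same default block size as the demo */

/* Per-sample effect: takes one input sample x and returns one output
 * sample y.  Identity for now; the book's delayline() has the same
 * one-sample-in / one-sample-out shape. */
static double process_sample(double x)
{
    return x;
}

/* The for loop lives here, in the per-block caller, so the per-sample
 * function really does run once for every sample in the array. */
static void process_block(const int16_t *in, int16_t *out, size_t n)
{
    for (size_t i = 0; i < n; i++)
        out[i] = (int16_t)process_sample((double)in[i]);
}
```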

 

I am sorry if these questions are too basic, but I have never worked with anything like this and I am starting to get desperate, as I can't get it to do what I'm trying to. I would appreciate it very much if someone could help me. Here is the code I am talking about.

 

Thanks in advance

 

"Let A denote an array of length N. Then we can implement an M-sample variable delay line in the C programming language as shown in Fig. 5.1. We require, of course, M <= N.

    static double A[N];
    static double *rptr = A; /* read pointer  */
    static double *wptr = A; /* write pointer */

    void setdelay(int M)
    {
        rptr = wptr - M;
        while (rptr < A) { rptr += N; }   /* wrap back into A[0..N-1] */
    }

    double delayline(double x)
    {
        double y;
        *wptr++ = x;                      /* write the current input sample */
        y = *rptr++;                      /* read the sample M samples ago  */
        if ((wptr - A) >= N) { wptr -= N; }
        if ((rptr - A) >= N) { rptr -= N; }
        return y;
    }"
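In case it helps anyone answering: what I am ultimately after is something like a feedback echo built on this delay line, where a gain below 1 keeps the repeats from building up the way they do when I just enlarge the buffers. This is my own guess at it, not the demo's code; the delay line is repeated here (with the missing semicolons filled in so it compiles) to keep the sketch self-contained:

```c
#include <stddef.h>

#define N 48000   /* delay-line buffer, about one second at 48 kHz */

static double A[N];
static double *rptr = A;  /* read pointer  */
static double *wptr = A;  /* write pointer */

static void setdelay(int M)          /* requires M <= N */
{
    rptr = wptr - M;
    while (rptr < A) { rptr += N; }  /* wrap back into A[0..N-1] */
}

static double delayline(double x)
{
    double y;
    *wptr++ = x;                     /* store the current sample      */
    y = *rptr++;                     /* fetch the sample M calls ago  */
    if ((wptr - A) >= N) { wptr -= N; }
    if ((rptr - A) >= N) { rptr -= N; }
    return y;
}

/* Feedback echo: y[n] = x[n] + g * y[n-M].  With |g| < 1 each repeat
 * is quieter than the last, so the output decays instead of blowing up
 * the way an unattenuated feedback path does. */
static double echo(double x, double g)
{
    double delayed = *rptr;          /* peek at y[n-M] before writing */
    double y = x + g * delayed;
    (void)delayline(y);              /* store y[n]; advances pointers */
    return y;
}
```

Feeding a single impulse through echo() after setdelay(4) with g = 0.5 should give 1.0, then 0.5 four samples later, then 0.25 four samples after that, and so on.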
