2022-03-02 07:03 PM
I'm sampling 8-bit audio at 8 kHz, with a mic output biased at around 1.6 V. The issue is that we need to end up with a signed 8-bit representation of the signal (in the range of -128 to 127), but the raw data in the ADC's data register is an unsigned 8-bit value (ranging from 0 to 255). The only way I know to deal with this is to subtract the DC offset (which I'd determine through a moving-average calibration phase) from each sample right before it's written to memory. I can't subtract the offset after the sample is written to memory because the data type of the memory buffer needs to remain SIGNED, not unsigned, and the sign information won't be preserved (values above 127 would wrap to negative) if I just load the values from the ADC data register straight into a signed buffer. However, this is not ideal either, because I'm trying to use DMA to move my samples to memory and avoid 8 kHz interrupts (where I would otherwise take care of the offset).
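For reference, here's a rough sketch of the per-sample ISR approach I'm describing. Names like adc_read_sample(), dc_offset, and the buffer sizes are placeholders for whatever the real HAL and application would use:

```c
#include <stdint.h>

#define BUF_LEN 256u

extern uint8_t adc_read_sample(void);   /* hypothetical HAL call that reads the ADC data register */

volatile int8_t   audio_buf[BUF_LEN];   /* signed buffer the rest of the code expects */
volatile uint16_t write_idx = 0;
volatile uint8_t  dc_offset = 128;      /* updated by the moving-average calibration phase */

/* Hypothetical 8 kHz sample ISR: read the unsigned ADC value, remove the
 * DC bias, and store the result as a signed 8-bit sample. */
void ADC_SampleISR(void)
{
    uint8_t raw = adc_read_sample();
    audio_buf[write_idx] = (int8_t)((int16_t)raw - (int16_t)dc_offset);
    write_idx = (write_idx + 1u) % BUF_LEN;
}
```

This works, but it's exactly the 8 kHz interrupt load I'm trying to avoid by using DMA.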
So is there a way to use hardware peripherals on this particular board, or to design a circuit, that would allow me to sample negative voltages ranging from -1.5 V to +1.5 V in order to eliminate this offset?
Or is there another software method whereby I can cast or convert my buffer from unsigned (so as to retain the full range from the ADC data register) to signed, and then subtract the DC offset right before I'm going to use it? In other words, is there a way to write C code that compiles to a set of instructions that performs both the type conversion and the subtraction using only one memory load per sample?
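Something along these lines is what I'm imagining, assuming the DMA buffer stays uint8_t and dc_offset comes from my calibration phase (sample_at and process_block are just illustrative names). On most targets the widening and the subtraction happen in registers, so the only memory access per sample should be the single byte load from the buffer:

```c
#include <stdint.h>
#include <stddef.h>

/* Convert one raw DMA sample to a signed, offset-corrected value at the
 * point of use. Assumes the signal stays within +/-127 of dc_offset;
 * clamping is omitted for brevity. */
static inline int8_t sample_at(const volatile uint8_t *dma_buf,
                               size_t i,
                               uint8_t dc_offset)
{
    return (int8_t)((int16_t)dma_buf[i] - (int16_t)dc_offset);
}

/* Example use: process a freshly filled half of the DMA buffer. */
static void process_block(const volatile uint8_t *dma_buf,
                          size_t len,
                          uint8_t dc_offset,
                          int8_t *out)
{
    for (size_t i = 0; i < len; i++) {
        out[i] = sample_at(dma_buf, i, dc_offset);
    }
}
```

Is this kind of convert-on-read approach reasonable, or is there a better way?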
Thank you for any help in advance!