2012-09-21 06:07 AM
When I program a GPIO pin, I have to define a ''speed'': 2, 10 or 50 MHz. What does this mean in terms of hardware? Is it a filter?
If I program the pin for 2 MHz, I can't output a 5 MHz signal. What hardware in the port circuit cuts (filters) the output signal frequency? Thank you.
2012-09-21 07:04 AM
It relates to the slew-rate control on the pin: it defines how sharp the rising/falling edges are, and how hard/rapidly the pin is driven. It is not a filter on the signal frequency.
Set it too slow and your output waveform will be less square; how much depends on the capacitive load being driven. Drive it too hard and you tend to get undershoot/overshoot, which also adds to the EMI generated. Generally you'd want to set it appropriately for the signal being output.