
Implementing SWD with SPI on STM32F3

Question asked by brown.geoffrey.001 on Nov 29, 2016
Latest reply on Dec 11, 2016 by brown.geoffrey.001
I have a working bit-banged implementation of SWD, and now I want to use the SPI peripheral. The protocol is a bit odd in requiring some short input sequences. Essentially, I'm using the SPI peripheral for everything that is a nice multiple of 8 bits (disabling the MOSI pin for input), but I need to bit-bang a couple of input bits. I've successfully taken over the clock, but I can't seem to read data from the MOSI or MISO pins once they are reconfigured as inputs. The key bit of code follows. The clock banging works fine, but no joy on the input. Using a logic analyzer I can see that the correct bits are coming from the SWD slave, but I'm reading all high on the input:
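For context on why the transfers come out as nice multiples of 8 bits: each SWD transaction starts with an 8-bit request header (start, APnDP, RnW, two address bits, parity, stop, park), which is exactly the kind of byte the SPI peripheral can shift out. The sketch below shows how that header can be packed, LSB transmitted first; `swd_request` is a hypothetical helper name, not something from the code in this post.

```c
#include <stdint.h>

/* Build the 8-bit SWD request header (transmitted LSB first).
   ap:   1 = AP access, 0 = DP access
   read: 1 = read, 0 = write
   addr: register address; A[3:2] are taken from bits 3:2.
   The parity bit covers APnDP, RnW, A[2] and A[3]. */
static uint8_t swd_request(int ap, int read, uint8_t addr)
{
    uint8_t a2 = (addr >> 2) & 1u;
    uint8_t a3 = (addr >> 3) & 1u;
    uint8_t parity = (uint8_t)((ap ^ read ^ a2 ^ a3) & 1u);

    return (uint8_t)(1u                 /* bit 0: start (always 1) */
        | ((uint8_t)(ap & 1)   << 1)    /* bit 1: APnDP            */
        | ((uint8_t)(read & 1) << 2)    /* bit 2: RnW              */
        | (a2                  << 3)    /* bit 3: A[2]             */
        | (a3                  << 4)    /* bit 4: A[3]             */
        | (parity              << 5)    /* bit 5: parity           */
                                        /* bit 6: stop (always 0)  */
        | (1u                  << 7));  /* bit 7: park (always 1)  */
}
```

For example, a DP read of address 0x0 (the IDCODE read) produces the well-known request byte 0xA5. The turnaround and ACK bits that follow the header are the odd-length input sequences the question is bit-banging.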

static inline uint32_t SWDIO_IN(){
  uint8_t b;
  LL_GPIO_ResetOutputPin(tgt_SWDCLK_GPIO_Port, tgt_SWDCLK_Pin);
  b = LL_GPIO_IsInputPinSet(tgt_SWDIO_GPIO_Port, tgt_SWDIO_Pin);
  delay();
  LL_GPIO_SetOutputPin(tgt_SWDCLK_GPIO_Port, tgt_SWDCLK_Pin);
  return b;
}

static void _SetSWDIOasOutput(){
  LL_GPIO_SetPinMode(tgt_SWDIO_GPIO_Port, tgt_SWDIO_Pin, LL_GPIO_MODE_ALTERNATE);
  LL_GPIO_SetPinMode(tgt_SWDCLK_GPIO_Port, tgt_SWDCLK_Pin, LL_GPIO_MODE_ALTERNATE);
}

static void _SetSWDIOasInput(){
  LL_GPIO_SetPinMode(tgt_SWDIO_GPIO_Port, tgt_SWDIO_Pin, LL_GPIO_MODE_INPUT);
  LL_GPIO_SetPinMode(tgt_SWDCLK_GPIO_Port, tgt_SWDCLK_Pin, LL_GPIO_MODE_OUTPUT);
}
