2021-11-12 08:44 PM
I am writing from an ESP32 I2C master to an STM8S103F MCU. My scope shows the ESP is putting a clean read-address byte on the SCL/SDA pins (PB4/PB5), but the STM8 never runs my interrupt routine. I have a GPIO pin that should toggle when the interrupt fires, and I've tested this debug pin elsewhere in my code. The STM8 is also not responding with an ACK.
Here is my init code. I would be very grateful if anyone can give a hint as to what stupid thing I'm doing (I've never met a smart bug yet, always stupid).
void initI2c(void) {
// set I2C (IRQ19) software interrupt priority to level 2 (next to highest; level 3 is highest)
ITC->ISPR5 = (ITC->ISPR5 & ~0xc0) | 0x00; // I1:I0 = 0b00 => level 2
I2C->CR1 &= ~I2C_CR1_PE; // start with peripheral off (PE = 0)
// I2C_CR1_NOSTRETCH | // 0x80 Clock Stretching Disable (Stretching enabled)
// I2C_CR1_ENGC | // 0x40 General Call Enable (slave responds to addr 0x00 -- not used)
// I2C_CR1_PE | // 0x01 Peripheral Enable (set at end)
I2C->CR1 = 0;
// I2C_CR2_SWRST | // 0x80 Software Reset
// I2C_CR2_POS | // 0x08 Acknowledge Position (0 => ACK bit controls current byte)
// I2C_CR2_ACK | // 0x04 Acknowledge Enable (set at end)
// I2C_CR2_STOP | // 0x02 Stop Generation (1 => send stop -- not used)
// I2C_CR2_START | // 0x01 Start Generation (1 => send start -- not used)
I2C->CR2 = 0;
I2C->FREQR = 16; // I2C frequency register, 16MHz {>= 4MHz for 400k}
// I2C_OARL_ADD, addr = 0x20
// I2C_OARL_ADD0 Interface Address bit0 (not used in 7-bit addr)
I2C->OARL = 0x40; // I2C own address register LSB, this matches the address on my scope
// I2C_OARH_ADDMODE | // 0x80 Addressing Mode (7-bit addr)
// I2C_OARH_ADD | // 0x06 Interface Address bits [9..8] (for 10-bit addr)
I2C->OARH = I2C_OARH_ADDCONF; // 0x40 Address Mode Configuration (datasheet says this bit must always be written as 1)
I2C->ITR = // I2C interrupt register -- all type ints enabled
I2C_ITR_ITBUFEN | // 0x04 Buffer Interrupt Enable
I2C_ITR_ITEVTEN | // 0x02 Event Interrupt Enable
I2C_ITR_ITERREN; // 0x01 Error Interrupt Enable
// I2C->CCRL // I2C clock control register low (master only)
// I2C->CCRH // I2C clock control register high (master only)
// I2C->TRISER // I2C maximum rise time register (master only)
I2C->CR1 |= I2C_CR1_PE; // 0x01 Peripheral Enable (enable pins)
I2C->CR2 |= I2C_CR2_ACK; // send ACK after every byte (must come after PE: ACK is cleared by hardware while PE = 0)
}
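For anyone following along, here is how the OARL value relates to the 7-bit address. This is just arithmetic on the register layout (OARL holds the address in bits [7:1], per the reference manual), sketched as host-side helpers:

```c
#include <stdint.h>

/* OARL holds the 7-bit own address in bits [7:1]; ADD0 (bit 0) is only
 * used in 10-bit mode. The byte the master puts on the wire is the same
 * shifted address with the R/W flag in bit 0. */
uint8_t oarl_for(uint8_t addr7) {
    return (uint8_t)(addr7 << 1);
}
uint8_t wire_byte(uint8_t addr7, int read) {
    return (uint8_t)((addr7 << 1) | (read ? 1 : 0));
}
```

So own address 0x20 lands in OARL as 0x40, a write from the master shows up on the scope as 0x40, and a read shows up as 0x41.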
EDIT: And yes, I have the interrupt vector set up correctly. My TIM4 is interrupting correctly and has a similar setup to my I2C interrupt.
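Once the address match does fire, the slave event handler has to clear each status flag with the sequence the reference manual prescribes (ADDR: read SR1 then SR3; RXNE: read DR; STOPF: read SR1 then write CR2). Below is a sketch of that decoding logic with the register block mocked so it can run off-target; the flag values come from RM0016, and the struct layout here is simplified (it is not the real 0x5210 register map):

```c
#include <stdint.h>

/* Simplified mock of the STM8S I2C registers so the flag handling can
 * be exercised off-target; on the real chip these are memory-mapped. */
typedef struct {
    volatile uint8_t CR2;
    volatile uint8_t SR1;
    volatile uint8_t SR3;
    volatile uint8_t DR;
} i2c_t;

/* SR1 flag bits (values from RM0016) */
#define SR1_ADDR  0x02  /* address matched (slave) */
#define SR1_RXNE  0x40  /* receive buffer not empty */
#define SR1_STOPF 0x10  /* stop detected (slave) */

static uint8_t last_byte;

/* Body of an I2C event ISR; each flag has a mandated clearing sequence. */
void i2c_event(i2c_t *i2c) {
    uint8_t sr1 = i2c->SR1;      /* reading SR1 starts each clearing sequence */
    if (sr1 & SR1_ADDR) {
        (void)i2c->SR3;          /* SR1 read followed by SR3 read clears ADDR */
    }
    if (sr1 & SR1_RXNE) {
        last_byte = i2c->DR;     /* reading DR clears RXNE */
    }
    if (sr1 & SR1_STOPF) {
        i2c->CR2 = i2c->CR2;     /* SR1 read followed by CR2 write clears STOPF */
    }
}
```

On the real part this function body would sit in the IRQ 19 handler (with the SPL, something like `INTERRUPT_HANDLER(I2C_IRQHandler, 19)`), and that is where the debug-pin toggle would go.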
2021-11-13 12:32 PM
Never mind. It started working and I swear I didn't change anything (:-).