I have an application that uses both USB (device) and I2C (master). Running either the USB or the I2C alone works fine, but when both run together, roughly every 300th USB transfer results in a stall on EP1 OUT. This is strange because the process is strictly serial: the host sends a USB message (EP1 OUT) to the STM32; on reception, the STM32 writes one byte over I2C; and finally the STM32 sends a USB message (EP2 IN) back to the host to indicate success/failure.
I used the VirtualCommPort example as the starting point for my USB code and I2CRoutines.c for my I2C code. As far as I can tell, the I2C write never touches the interrupt configuration. So how does the I2C code affect the USB?