SCCB emulation by I2C communication problem

mchalapetr
Associate II
Posted on May 13, 2015 at 01:17

Hi,

I have an application where I configure an OV7670 camera over the SCCB bus (an OmniVision protocol) emulated with I2C...

I used this successfully for a couple of months, but yesterday I disassembled the PCB I made for this application and made some changes that aren't related to the SCCB. After I put the PCB back on the kit, the configuration that had worked for months stopped working...

The function I use is listed below. It gets stuck at while(!I2C_CheckEvent(I2C1, I2C_EVENT_MASTER_TRANSMITTER_MODE_SELECTED))...

The strangest thing is that if I unplug the camera and connect all the pins through a 20 cm cable, the configuration works again. I found that out while testing whether the module works at all. Simply adding 20 cm of cable (no pins swapped) makes it work again...

I think it has something to do with interference coming from the DCMI, but shouldn't the cables make it even worse?

I have no rational explanation... Thanks very much for any help...

uint8_t SCCB_write_reg(uint8_t reg_addr, uint8_t* data){
    uint32_t timeout = SCCB_TIMEOUT;

    // Send start bit
    I2C_GenerateSTART(I2C1, ENABLE);

    // Wait until the start condition has been generated
    while(!I2C_GetFlagStatus(I2C1, I2C_FLAG_SB)){
        if ((timeout--) == 0)
            return 1;
    }
    while(!I2C_CheckEvent(I2C1, I2C_EVENT_MASTER_MODE_SELECT)){
        if ((timeout--) == 0)
            return 1;
    }

    // Send slave address (OV7670_WRITE_ADDR)
    I2C_Send7bitAddress(I2C1, OV7670_WRITE_ADDR, I2C_Direction_Transmitter);
    while(!I2C_CheckEvent(I2C1, I2C_EVENT_MASTER_TRANSMITTER_MODE_SELECTED)){
        if ((timeout--) == 0)
            return 1;
    }

    // Send register address
    I2C_SendData(I2C1, reg_addr);
    while(!I2C_CheckEvent(I2C1, I2C_EVENT_MASTER_BYTE_TRANSMITTED)){
        if ((timeout--) == 0)
            return 1;
    }

    // Send new register value
    I2C_SendData(I2C1, *data);
    while(!I2C_CheckEvent(I2C1, I2C_EVENT_MASTER_BYTE_TRANSMITTED)){
        if ((timeout--) == 0)
            return 1;
    }

    // Send stop bit
    I2C_GenerateSTOP(I2C1, ENABLE);

    // Write done
    return 0;
}
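
For reference, a minimal sketch of calling this helper, assuming OV7670_WRITE_ADDR and SCCB_TIMEOUT are defined elsewhere in the project: it writes the COM7 register (0x12) with the software-reset bit (0x80) and checks the return code for a timeout.

// Usage sketch: issue the OV7670 software reset via COM7 (0x12) = 0x80
uint8_t com7_reset = 0x80;
if (SCCB_write_reg(0x12, &com7_reset) != 0) {
    // non-zero means one of the wait loops above timed out
}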

#stm32 #i2c #dcmi #sccb
1 REPLY
mchalapetr
Associate II
Posted on May 14, 2015 at 12:50

Hi,

I don't know how it is possible, but I found an error in the I2C bus speed set at initialization. Basically one extra zero (1000k instead of 100k). I don't get how it got there, or how it worked for a few months if it was there the whole time...

Anyway, the function in the post above together with this init should work:

// I2C config
I2C_InitTypeDef I2C_InitStructure;

I2C_DeInit(I2C1);
I2C_InitStructure.I2C_Mode = I2C_Mode_I2C;
I2C_InitStructure.I2C_DutyCycle = I2C_DutyCycle_2;
I2C_InitStructure.I2C_OwnAddress1 = 0x00;
I2C_InitStructure.I2C_Ack = I2C_Ack_Enable;
I2C_InitStructure.I2C_AcknowledgedAddress = I2C_AcknowledgedAddress_7bit;
I2C_InitStructure.I2C_ClockSpeed = 100000; // 100 kHz - the bug was an extra zero (1000000)
I2C_Init(I2C1, &I2C_InitStructure);
I2C_Cmd(I2C1, ENABLE);
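
For completeness, a rough sketch of the clock and GPIO setup that normally accompanies this init in the Standard Peripheral Library, assuming an STM32F4 with I2C1 on PB8 (SCL) and PB9 (SDA); the pin mapping is an assumption, so adjust the port, pins and RCC calls to the actual wiring:

// Sketch only: enable clocks and configure PB8/PB9 as open-drain AF pins for I2C1
// (assumed pinout - change to match your board)
GPIO_InitTypeDef GPIO_InitStructure;

RCC_APB1PeriphClockCmd(RCC_APB1Periph_I2C1, ENABLE);
RCC_AHB1PeriphClockCmd(RCC_AHB1Periph_GPIOB, ENABLE);

GPIO_InitStructure.GPIO_Pin   = GPIO_Pin_8 | GPIO_Pin_9;
GPIO_InitStructure.GPIO_Mode  = GPIO_Mode_AF;
GPIO_InitStructure.GPIO_Speed = GPIO_Speed_50MHz;
GPIO_InitStructure.GPIO_OType = GPIO_OType_OD;     // open-drain, as required by I2C/SCCB
GPIO_InitStructure.GPIO_PuPd  = GPIO_PuPd_NOPULL;  // relies on the module's external pull-ups
GPIO_Init(GPIOB, &GPIO_InitStructure);

GPIO_PinAFConfig(GPIOB, GPIO_PinSource8, GPIO_AF_I2C1);
GPIO_PinAFConfig(GPIOB, GPIO_PinSource9, GPIO_AF_I2C1);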