Is 700 kbit/s a normal data rate for USB in device mode?

VGava
Associate II

Hi,

I am working with an STM32L476RCT6. In STM32CubeMX, under "Connectivity", I use the "USB_OTG_FS" functionality and I have chosen the "Device_only" mode. Under "Middleware", I use "USB_DEVICE" in "Communication Device Class (Virtual Port Com)" mode. All parameters are left at their default settings.

The microcontroller is programmed to send a number of frames when I open the port in a serial terminal on a PC. I measure a data rate of about 700 kbit/s. The speed in the "USB_OTG_FS" section is set to "Full Speed 12MBit/s".

Is this a normal data rate for such a configuration? I knew the practical data rate would be lower than 12 Mbit/s because of the protocol overhead, but I did not expect it to be about 20 times lower. Are there any tips to increase it?

What effective data rate could I achieve using USB in HID or mass storage mode?

Thanks.

--

Here is some more info:

- I am using Windows 10

- The USB port on my computer is a USB 3.0 port.

- I am using Tera Term, but I have had the same result with another serial terminal.

- Here is a snippet of my code:

uint32_t counter = 0;   /* 31250 frames x 64 bytes = 2 MBytes */
uint8_t result;

if(CDC_HostPortOpen())
{
    CDC_Transmit_FS((uint8_t *)"Begin \r\n", 8);
    while(1)
    {
        counter++;
        /* 64-byte frame: 62 dashes followed by "\r\n" */
        result = CDC_Transmit_FS((uint8_t *)"--------------------------------------------------------------\r\n", 64);
        while(result != USBD_OK)
        {
            /* Retry while the previous transfer is still in flight */
            result = CDC_Transmit_FS((uint8_t *)"--------------------------------------------------------------\r\n", 64);
        }
        if(counter == 31250)
        {
            CDC_Transmit_FS((uint8_t *)"End \r\n", 6);
            break;
        }
    }
}
 
/* CubeMX-generated transmit function: refuses a new transfer while the
   previous one is still in flight, otherwise queues Buf for sending. */
uint8_t CDC_Transmit_FS(uint8_t* Buf, uint16_t Len)
{
    uint8_t result = USBD_OK;
    /* USER CODE BEGIN 7 */
    USBD_CDC_HandleTypeDef *hcdc = (USBD_CDC_HandleTypeDef*)hUsbDeviceFS.pClassData;
    if (hcdc->TxState != 0)
    {
        /* Previous transfer not finished yet */
        return USBD_BUSY;
    }
    USBD_CDC_SetTxBuffer(&hUsbDeviceFS, Buf, Len);
    result = USBD_CDC_TransmitPacket(&hUsbDeviceFS);
    /* USER CODE END 7 */
    return result;
}
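
Since CDC_Transmit_FS() returns USBD_BUSY instead of waiting, the caller has to retry until the previous transfer completes, which is what the spin loop above does. A small illustrative wrapper (the name CDC_Transmit_Blocking is mine, not from the generated code) makes that pattern explicit:

/* Illustrative blocking wrapper: retry until the CDC class accepts the buffer. */
static uint8_t CDC_Transmit_Blocking(uint8_t *Buf, uint16_t Len)
{
    uint8_t result;
    do {
        result = CDC_Transmit_FS(Buf, Len);
    } while (result == USBD_BUSY);
    return result;   /* USBD_OK once the transfer was accepted */
}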


700 KByte/s at 12 Mbit/s for MSC would be typical (the theoretical ceiling is 1.5 MByte/s, assuming 8 bits in a byte). CDC might be significantly slower; you'd have to look at the packet sizes, and at the sustained ability of the host and device to interact.

TDK
Guru

Sounds fairly typical. Send more data at once to increase efficiency.

I can get something around 9 Mbit/s with very optimized code and a terminal program which polls the port at a very high rate.
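
For example, "more data at once" on the device side could look something like the sketch below. It assumes the same generated CDC_Transmit_FS() shown in the question; the 4096-byte chunk size (a multiple of the 64-byte full-speed bulk packet size) and the SendTestData() helper name are just illustrative:

#include <stdint.h>
#include <string.h>
#include "usbd_cdc_if.h"   /* CDC_Transmit_FS(), USBD_BUSY */

#define TX_CHUNK 4096u     /* multiple of the 64-byte FS bulk packet size */

static uint8_t txBuffer[TX_CHUNK];

/* Send totalBytes of test pattern in TX_CHUNK-sized transfers. */
static void SendTestData(uint32_t totalBytes)
{
    memset(txBuffer, '-', TX_CHUNK);   /* constant pattern, filled once */

    for (uint32_t sent = 0; sent < totalBytes; sent += TX_CHUNK)
    {
        /* Spin while the previous transfer is still in flight; the buffer
           never changes, so handing it over again while busy is safe here. */
        while (CDC_Transmit_FS(txBuffer, TX_CHUNK) == USBD_BUSY)
        {
        }
    }
}

One large transfer lets the class driver keep the endpoint fed with back-to-back 64-byte packets instead of paying the per-call overhead for every line.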

VGava
Associate II

Thank you for your answers.

I have edited my question to make it clearer that I measure 700 kbit/s and not 700 kByte/s. I also forgot to mention that my OS is Windows, not Linux. :)

I have tried different buffer sizes (64, 100, 500 and 1000 bytes) but I have not observed any noticeable improvement. Much smaller or much larger buffers (10 bytes or 8000 bytes) gave worse results.

Do you mean that you wrote your own terminal program, or that you are using one that can be found on the web? If so, could you recommend one?

I wrote my own terminal program for this application. I don't recall offhand the transfer rate using TeraTerm or similar. I always liked Termite the most for its simplicity.
You should use a buffer that is a multiple of the max data packet size (64 bytes). I'm not sure why you don't see improved behavior from, say, a buffer size of 4096.
VGava
Associate II

I have tested the firmware with a 4096-byte buffer. The data rate is slower: about 500 kbit/s.

But I have made another interesting test. Until now, to measure the data rate, I was sending 2 MBytes of data. On the PC, I was using Tera Term and logging what I received to a text file, with a timestamp for each new frame (this logging feature is part of Tera Term). I could then deduce the data rate from the difference between the last and first timestamps.

I have then tried the 4096-byte buffers without logging what I received, timing the transfer with a stopwatch instead. Now the data rate is about 3 Mbit/s.

In conclusion, the data rate was limited by what the PC was busy doing (in this case, writing the frames to a text file). So writing a more efficient program to log the frames should indeed increase the data rate; the 700 kbit/s I measured at first was a worst case caused by the logging itself.
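
For reference, the PC side can be taken out of the equation almost entirely by reading the port and discarding the data. A rough Win32 sketch (the port name "COM3" and the 2 MByte target are assumptions to adapt to your setup):

#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* "\\\\.\\COM3" is a placeholder; use your actual virtual COM port.
       A USB CDC port ignores baud-rate settings, so no DCB setup here. */
    HANDLE hPort = CreateFileA("\\\\.\\COM3", GENERIC_READ, 0, NULL,
                               OPEN_EXISTING, 0, NULL);
    if (hPort == INVALID_HANDLE_VALUE)
    {
        fprintf(stderr, "Cannot open port\n");
        return 1;
    }

    /* ReadIntervalTimeout = MAXDWORD: ReadFile returns immediately with
       whatever is buffered, so we poll the port as fast as possible. */
    COMMTIMEOUTS timeouts = { MAXDWORD, 0, 0, 0, 0 };
    SetCommTimeouts(hPort, &timeouts);

    unsigned char buffer[4096];
    unsigned long long total = 0;
    ULONGLONG start = 0;
    DWORD bytesRead = 0;

    while (total < 2000000ULL)   /* stop after 2 MBytes */
    {
        if (ReadFile(hPort, buffer, sizeof(buffer), &bytesRead, NULL) && bytesRead > 0)
        {
            if (start == 0)
                start = GetTickCount64();   /* start timing at the first byte */
            total += bytesRead;             /* discard the data itself */
        }
    }

    ULONGLONG elapsedMs = GetTickCount64() - start;
    if (elapsedMs > 0)
        printf("%llu bytes in %llu ms: %.1f kbit/s\n",
               total, elapsedMs, (total * 8.0) / (double)elapsedMs);

    CloseHandle(hPort);
    return 0;
}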

Thanks for your support.