2021-11-26 04:02 PM
Hi,
I came across something that concerned me:
Disadvantages of using a flash drive for backup
Source: https://www.datahoards.com/how-reliable-are-flash-drives-for-
So I would like to know about the reliability of the USB Host library's data-writing routines, in conjunction with the reliability of the USB disk (flash drive) itself:
I am using STM32CubeIDE 1.7.0.
Thank you.
2021-11-26 04:35 PM
Understand that the ST library code is not heavily tested/validated, so you'd need to evaluate it further and perform your own testing, across a variety of drives and use cases. Some USB sticks skip the crystal, so their timing might be outside specifications. With thousands of customers using them you're bound to get some that don't work or are otherwise of poor quality.
In my experience (magnetic/optical storage background) these flash drives tend to be pretty robust: they manage the wear, and also handle error correction and block sparing. Basically NAND memory with an MCU to manage it. The biggest cause of data loss is likely to be file-system-level corruption due to bad software, caching, or power interruption. Make sure the file system content is cleanly flushed prior to drive removal. Things like FatFs finalize the file system data at f_close or f_sync calls.
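With stock FatFs the flush pattern looks roughly like this (an embedded sketch only, not runnable standalone; the function name and abbreviated error handling are illustrative):

```c
#include <string.h>
#include "ff.h"   /* stock FatFs header */

FATFS fs;
FIL fil;

void log_line(const char *line)
{
    /* Append a line, then push cached data and the FAT/directory entry
     * out to the medium, so a power cut or surprise removal after this
     * point loses at most the lines written since the last f_sync. */
    UINT written;
    if (f_write(&fil, line, strlen(line), &written) == FR_OK)
        f_sync(&fil);
}
```

f_sync costs extra medium writes, so calling it per record trades throughput for a bounded data-loss window; f_close performs the same finalization when the file is done.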
For transient logging I don't bother with verification. If the data is critical, you might want to verify at the erase-block size (128KB, or whatever the device reports); doing sector-level read/write will be very slow. For linear writes, perhaps keep a running hash that you can check via a subsequent read-back.
For SDCard / eMMC with a FatFs file system implementation I tend to fill the media with pseudo-random test patterns that don't align to sector/cluster boundaries and don't repeat in simplistic fashion. I've done this on STM32 out to 400GB or so. The FatFs middleware shipped by ST is OLD (2017); don't use it for critical applications, it has known flaws.
2021-11-26 05:19 PM
Thanks for the comments, I was hoping that the library already had some protection against data loss.
I'm generating a report of around 45MiB, so I don't think much juggling is needed to verify the data, since reading is much faster than writing.
I found the f_gets function for reading lines; I'll try to do the verification the same way the data was written, line by line. It's a little time-consuming, but it should work, and it means I don't need a very large buffer or to rotate data through it for substring comparison.
2021-11-26 06:26 PM
If you are enterprise, buy quality stuff from reputable manufacturers.
Not any random cheap junk from ebay.
Verify the characteristics declared by the makers, or hire specialists.
All of this is not directly related to the quality of the software.
One can make the whole system more reliable even with crappy hardware through more complicated software and redundancy, though performance may suffer. Anything new or surprising here?
2021-11-26 06:58 PM
I just confirmed in practice the failure I was worried about.
The f_write routine did not return an error across about 10,000 rows, but the f_open function did return an error when I tried to read the file back.
I had to format the USB disk again.
This failure occurred because a firmware rewrite (via ST-Link) temporarily cut power to the USB disk during a data-access operation.
Anyone trying to use the f_gets function needs to use the f_eof and f_error functions for loop control; otherwise f_gets will read everything it finds in front of it without stopping, even after the file reaches its end.