How complicated and large does the database need to be?
You might look to what was popular for MS-DOS-type implementations: perhaps keep the core data and the indexes in separate files, and manage them in a way that limits wholesale rewriting of large chunks of data on the media. Something that journals, and is recoverable/rebuildable.
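To make that concrete, here is a minimal sketch assuming FatFS; the 32-byte record layout, the index-entry format, and the LOG.DAT/LOG.IDX file names are my own placeholders, not any particular library's. The data file is append-only, and the index file is just a rebuildable sidecar, so losing it is recoverable by rescanning the data file.

```c
/* Hedged sketch: fixed-size records appended to a data file, with a
 * separate index file, via FatFS. Record/index layouts and the file
 * names LOG.DAT / LOG.IDX are assumptions for illustration only. */
#include "ff.h"
#include <stdint.h>

typedef struct {            /* example 32-byte record (assumed layout) */
    uint32_t timestamp;
    uint32_t key;
    uint8_t  payload[24];
} record_t;

typedef struct {            /* index entry: key -> byte offset in LOG.DAT */
    uint32_t key;
    uint32_t offset;
} index_t;

FRESULT append_record(const record_t *rec)
{
    FIL dat, idx;
    UINT bw;
    FRESULT fr;

    fr = f_open(&dat, "LOG.DAT", FA_WRITE | FA_OPEN_APPEND);
    if (fr != FR_OK) return fr;

    index_t ie = { rec->key, (uint32_t)f_size(&dat) };

    fr = f_write(&dat, rec, sizeof *rec, &bw);
    if (fr == FR_OK) fr = f_sync(&dat);   /* flush so a reset loses at most one record */
    f_close(&dat);
    if (fr != FR_OK) return fr;

    /* Index can always be rebuilt by scanning LOG.DAT, so it is not critical. */
    fr = f_open(&idx, "LOG.IDX", FA_WRITE | FA_OPEN_APPEND);
    if (fr != FR_OK) return fr;
    fr = f_write(&idx, &ie, sizeof ie, &bw);
    f_close(&idx);
    return fr;
}
```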
f_read and f_write are most efficient when used at sector/cluster sizes and boundaries, i.e. multiples of 512-byte sectors, with buffering/caching methods built around those.
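As a rough illustration of that (the 512-byte buffer size and function name are assumptions), records get packed into a RAM buffer and only full sectors are handed to f_write():

```c
/* Hedged sketch of sector-aligned buffering: accumulate bytes in a
 * 512-byte RAM buffer and only call f_write() with whole sectors. */
#include "ff.h"
#include <string.h>
#include <stdint.h>

#define SECTOR_SIZE 512u

static uint8_t  sector_buf[SECTOR_SIZE];
static uint32_t buf_used;

FRESULT buffered_write(FIL *fp, const void *data, uint32_t len)
{
    const uint8_t *p = data;
    UINT bw;
    FRESULT fr = FR_OK;

    while (len) {
        uint32_t n = SECTOR_SIZE - buf_used;
        if (n > len) n = len;
        memcpy(&sector_buf[buf_used], p, n);
        buf_used += n; p += n; len -= n;

        if (buf_used == SECTOR_SIZE) {       /* full sector: write it in one go */
            fr = f_write(fp, sector_buf, SECTOR_SIZE, &bw);
            buf_used = 0;
            if (fr != FR_OK || bw != SECTOR_SIZE) return fr ? fr : FR_DISK_ERR;
        }
    }
    return fr;   /* any partial sector stays buffered until the next call or a final flush */
}
```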
Perhaps broaden the search to libraries tailored to Cortex-M MCUs and FatFS.
Also look for ones that have complementary PC-based apps, or that can output data in .CSV form for easy importation.
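Even without such a tool, a .CSV dump is cheap to roll yourself. A minimal sketch follows; the record_t fields and the EXPORT.CSV name are assumptions, and it uses snprintf plus f_write rather than f_printf so it does not depend on FatFS's string-function option being enabled.

```c
/* Hedged sketch: dump in-RAM records to a .CSV file for PC-side import. */
#include "ff.h"
#include <stdio.h>
#include <stdint.h>

typedef struct { uint32_t timestamp; uint32_t key; int32_t value; } record_t;

FRESULT export_csv(const record_t *recs, uint32_t count)
{
    FIL fil;
    UINT bw;
    char line[64];
    FRESULT fr = f_open(&fil, "EXPORT.CSV", FA_WRITE | FA_CREATE_ALWAYS);
    if (fr != FR_OK) return fr;

    for (uint32_t i = 0; i < count && fr == FR_OK; i++) {
        /* one comma-separated line per record, CRLF-terminated for PC tools */
        int n = snprintf(line, sizeof line, "%lu,%lu,%ld\r\n",
                         (unsigned long)recs[i].timestamp,
                         (unsigned long)recs[i].key,
                         (long)recs[i].value);
        fr = f_write(&fil, line, (UINT)n, &bw);
    }
    f_close(&fil);
    return fr;
}
```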