I'm working on a program for the LPC11U68 that scans a folder on an SD card, and copies all of the short filenames it finds into a list of structs defined like so:
```c
#define MAX_FILES 100

struct FileEntry {
    char fileName[22];
    char someField[64];
    char someOtherField;
};

struct FileEntry fileList[MAX_FILES];
```
This worked fine until today, when I tried to increase the size of the array. For some reason, I can cause all kinds of inconsistent memory problems simply by creating and interacting with this array. If I increase MAX_FILES to anywhere between 140 and 155, the program deadlocks and won't even run. But if I increase it to 200 or more, the program runs fine until it accesses items around index 150. This is very strange, since the array uses less than half of the available RAM even at 200 items! I was able to reproduce the issue with a test program:
[Repository '/users/neilt6/code/MemoryBug_TestApp/latest/' not found]
To reproduce the bug, I created a folder on an SD card with 1600 files in it. The test program scans the folder without using the list first, in order to prove that this isn't a FatFs-related stack overflow:
Then it scans the folder again using the list, and inconsistent memory corruption ensues! This particular run crashed gracefully at index 155:
This run, however, deadlocked after reading index 168:
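In case the repository snippet above doesn't render, this is essentially what the copy step in the test program boils down to, rewritten as a self-contained helper (the helper name and return convention are simplified for this post, and I've included the explicit MAX_FILES guard, since the test folder holds 1600 files but the list only holds 200):

```c
#include <string.h>

#define MAX_FILES 200

struct FileEntry {
    char fileName[22];
    char someField[64];
    char someOtherField;
};

struct FileEntry fileList[MAX_FILES];

/* Bounds-checked copy of one 8.3 short filename into the list.
   Returns 0 on success, -1 if the index is out of range. */
int addFileName(int index, const char *shortName)
{
    if (index < 0 || index >= MAX_FILES)
        return -1;
    /* Truncate if needed and always null-terminate */
    strncpy(fileList[index].fileName, shortName,
            sizeof(fileList[index].fileName) - 1);
    fileList[index].fileName[sizeof(fileList[index].fileName) - 1] = '\0';
    return 0;
}
```

The scan loop stops as soon as this helper reports a full list, so (as far as I can tell) the crashes can't just be the directory overrunning the array.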
Am I doing something wrong here? How do I even begin to troubleshoot this?