I just did a quick test of my own in DOSBox by mounting a network drive (mounted in OS X, hosted on a Linux server via Samba) and running "dir" on a directory with 3600 zip files of DOS games. DOSBox's processor usage stayed pegged until the directory listing came back. Then, for kicks, I ran "ls -l" on the same directory from the OS X terminal, and it took even longer, though it used far less processor power. I can see where the argument for caching might come into play, but truth be told I'm not sure it's such a great idea. If this data needs to be cached, it's the job of your OS to do so. Beyond that, caching is only meaningful if you have some way of knowing when something has changed. Based on what we're talking about here, it seems there's no way to do that without incurring the very same performance penalty the caching is supposed to get rid of in the first place, while adding some memory bloat in the process.
I think caching the file names is a great idea (which DOSBox already does, hence the "rescan" command). Caching file sizes, dates, and times is far less useful, however.
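To make the trade-off concrete, here's a minimal sketch of a filename cache with explicit invalidation, in the spirit of DOSBox's "rescan" command. This is purely illustrative: the class and method names are my own, not DOSBox's actual code, and it deliberately caches only names, not sizes or dates.

```python
import os


class DirCache:
    """Hypothetical filename cache: listings are served from memory
    until the user explicitly invalidates them (like "rescan")."""

    def __init__(self):
        self._names = {}  # directory path -> cached, sorted list of names

    def listdir(self, path):
        # Only hit the (potentially slow, networked) filesystem on a
        # cache miss; repeated "dir" calls are then free.
        if path not in self._names:
            self._names[path] = sorted(os.listdir(path))
        return self._names[path]

    def rescan(self):
        # Manual invalidation: drop everything so the next listdir()
        # re-reads from the real filesystem.
        self._names.clear()
```

The catch is exactly the one described above: between calls to rescan(), the cache happily returns stale results for anything that changed behind DOSBox's back, because detecting the change would cost the same filesystem round-trip the cache exists to avoid.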