One of our applications stores and retrieves PDFs from a single directory. This directory exists as a share on a FlexVol, and contains more than 6 million documents. The issue I'm having is with the memory allocated to that directory's index (which I believe is the volume's maxdirsize setting). As new records are added, I have to keep increasing that limit so new files can be written to the directory. As of now, I'm somewhere around 1GB.
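For context, this is roughly how I've been raising the limit (clustered ONTAP syntax; the vserver and volume names here are placeholders, and the setting requires advanced privilege):

```shell
# Check the current directory size limit (value is in KB)
set -privilege advanced
volume show -vserver svm1 -volume pdf_vol -fields maxdir-size

# Raise the limit, e.g. to 1 GB (1048576 KB)
volume modify -vserver svm1 -volume pdf_vol -maxdir-size 1048576
```

I understand a larger maxdirsize means more memory consumed when that directory is loaded, which is part of why I'm asking whether this layout is sustainable.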
My question is: given the number of records the directory contains, is there a better way to store the documents? That is, should I use Qtrees or LUNs?
I am a novice when it comes to data storage as well as NetApp, so any guidance and recommendations are much appreciated. My apologies if I've not been clear in describing the problem; I can clarify as needed.
Hi @paul_stejskal, thanks for replying. Yeah, that would be the preferred method, as we do experience latency issues. I was able to fine-tune the QoS policy group to give priority to that volume. However, the files are saved to the directory by an automated process in the application, and splitting them into separate folders would require a code change in the application. Unfortunately, our application developer (vendor) is not very responsive, and we won't be able to get that work done anytime soon. I was hoping there would be a way I could tackle the problem from the NetApp side.
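For anyone following along, the QoS piece was along these lines (clustered ONTAP; the policy-group and volume names are made up, and note that QoS attaches at the volume/file/LUN level rather than to the directory itself):

```shell
# Create a policy group with a throughput ceiling (names are placeholders)
qos policy-group create -policy-group pg_pdf_share -vserver svm1 -max-throughput 5000iops

# Attach the policy group to the volume that holds the PDF directory
volume modify -vserver svm1 -volume pdf_vol -qos-policy-group pg_pdf_share
```

This helped with latency, but it doesn't address the underlying directory index growth, which is why I'm still looking for a storage-side option.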