
My web application needs to store and retrieve data files created by the application's users (these files are log files that originate from a monitoring device the user operates). A typical file is less than 10 KB. Each file has a creation-date attribute. Usually a user will upload, and later ask to retrieve, several files created on adjacent days at one time.

My question is: should I design my file-handling code to concatenate several user files from adjacent dates and store them together as one file, to optimize server performance? In other words, should I be worried about reducing the number of file fetches? Lastly, is there a limit, on Linux, to the number of files that can be placed inside a folder?

Thanks, Avi

1 Answer


For fast retrieval you should think carefully about your folder structure. On any OS (Windows, Linux, etc.), looking up a file means scanning the directory's file list, so a single flat folder with many files becomes slow.

So you can use a date-based folder structure like 2012 -> 12 -> 20, where 2012 is the year, 12 the month, and 20 the day.
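
As a minimal sketch of that layout in Python (the helper name, base directory, and per-user subfolder are my illustrative assumptions, not part of the answer):

    import os
    from datetime import date

    BASE_DIR = "/var/app/userfiles"  # hypothetical storage root

    def storage_path(user_id: str, created: date, filename: str) -> str:
        """Build a year/month/day path for a user's file, e.g.
        /var/app/userfiles/avi/2012/12/20/device.log"""
        folder = os.path.join(
            BASE_DIR,
            user_id,
            f"{created.year:04d}",
            f"{created.month:02d}",
            f"{created.day:02d}",
        )
        os.makedirs(folder, exist_ok=True)  # create the dated folders on demand
        return os.path.join(folder, filename)

    # Usage: store a log file uploaded for 2012-12-20
    path = storage_path("avi", date(2012, 12, 20), "device.log")

This keeps each directory small, and fetching "several files from adjacent days" only touches a handful of sibling folders.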

This will make retrieval faster, and if you can also index the files it would be even better.
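
As a rough sketch of what such indexing could look like (using SQLite is my assumption; the answer doesn't name a specific tool), you can keep a small metadata table mapping user and creation date to the file path, so a date-range retrieval is one indexed query instead of a directory walk:

    import sqlite3

    # Hypothetical metadata index: (user, creation date) -> file path
    conn = sqlite3.connect("file_index.db")
    conn.execute("""
        CREATE TABLE IF NOT EXISTS files (
            user_id TEXT NOT NULL,
            created TEXT NOT NULL,   -- ISO date, e.g. '2012-12-20'
            path    TEXT NOT NULL
        )
    """)
    conn.execute(
        "CREATE INDEX IF NOT EXISTS idx_user_date ON files (user_id, created)"
    )

    def add_file(user_id, created, path):
        conn.execute("INSERT INTO files VALUES (?, ?, ?)",
                     (user_id, created, path))
        conn.commit()

    def files_between(user_id, start, end):
        """Return paths of a user's files created in [start, end] (ISO dates)."""
        rows = conn.execute(
            "SELECT path FROM files WHERE user_id = ? AND created BETWEEN ? AND ?",
            (user_id, start, end),
        )
        return [r[0] for r in rows]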

Linux doesn't impose a hard limit on the number of files in a folder; in practice you are bounded by the filesystem's inode count, though very large directories do become slow to scan.

  • Thanks Nipun! That is very helpful. Can you explain the indexing approach? – Avi Dec 22 '12 at 15:55
  • Awesome. I am more of a Windows guy, so if you want any help with indexing on Windows I can help you; for Linux you can try these few links: http://superuser.com/questions/120717/what-are-the-popular-file-indexing-engines-on-linux http://tech.lds.org/forum/viewtopic.php?t=96-Indexing-with-Linux http://stackoverflow.com/questions/698943/file-search-algorithms-using-indexing-in-linux – Nipun Ambastha Dec 23 '12 at 09:52