I've got an /archive/ directory that stores a lot of files (~6k) ... It's growing fast too... Should I be worried? This is on one of my smaller sites, on shared hosting... Just wondering, is there a limit? Should I be worried about having too many files in the same folder? Is it worth it to separate them into subfolders? Thanks
I can tell you that, for example, my FTP client (FlashFXP) doesn't list more than 2000 files per folder. It depends on the file system running on your server. For performance reasons (searching, listing, etc.) it would probably be better to separate the files on a month/year basis. Here you can read about the limits of the Linux ext2 filesystem and compare it with other filesystems: http://en.wikipedia.org/wiki/Ext2 Excerpt: The limit of sublevel-directories is about 32768. If the number of files in a directory exceeds 10000 to 15000 files, the user will normally be warned that operations can last for a long time. The actual limit on the number of files in a directory is very theoretical, because before reaching the limit of 1.3 × 10^20 files it will become hard to find new well-defined file names.
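If you want to go the month/year route, here's a minimal Python sketch of what the reorganization could look like. The function name and the idea of using each file's modification time to pick the subfolder are my assumptions, not anything from your setup; adapt the path logic to however your archive actually names things.

```python
import os
import shutil
import time

def archive_by_month(root):
    """Move each file directly under `root` into a YYYY/MM subfolder
    chosen from the file's modification time."""
    for name in os.listdir(root):
        src = os.path.join(root, name)
        if not os.path.isfile(src):
            continue  # skip subfolders we've already created
        t = time.localtime(os.path.getmtime(src))
        dest_dir = os.path.join(root, f"{t.tm_year:04d}", f"{t.tm_mon:02d}")
        os.makedirs(dest_dir, exist_ok=True)  # create YYYY/MM if missing
        shutil.move(src, os.path.join(dest_dir, name))
```

Run it once on a copy of the folder first; after that your script just needs to write new files straight into the current YYYY/MM directory.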
You shouldn't be worried; you're very far from the limit. But why not sort it anyway? It's very hard to handle a folder with so many files in it. Sort it now, because in the future it will only get harder.
Day-to-day operations become complex with more than 3K files in typical FTP clients. Organizing them into sub-folders could be a solution.
This is all good advice. You definitely want to split them up at some stage. Depending on the content, I also use the alphabetical approach. If you had a bunch of user files, for example, you could use the first few letters of the name to keep them archived. So, for a nickname like mine, you'd have /t/o/n/myfile.jpg. If it's date-based content or articles, then the y/m/d approach works well, as hogan_h said. Hope this helps
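The letter-based scheme above can be sketched in a few lines of Python. The function name and the choice of three levels deep are just assumptions for illustration; the point is only that the nested path is derived from the first characters of the filename itself.

```python
import os

def shard_path(root, name, depth=3):
    """Build a nested path from the first `depth` alphanumeric characters
    of `name`, e.g. 'tonton.jpg' -> root/t/o/n/tonton.jpg."""
    letters = [c.lower() for c in name if c.isalnum()][:depth]
    return os.path.join(root, *letters, name)
```

One nice property of this scheme is that you never need a lookup table: given only the filename, any script can recompute where the file lives.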