I'm starting to build a website that will host 500,000+ images. Besides requiring a lot of disk space, do you think the site's performance will suffer if I use a basic folder scheme, for instance keeping all of the images under /images?
Will it be too slow when a user requests /images/img13452.jpg?
If performance degrades in proportion to the number of images in the same folder, which scheme/architecture would you recommend I use?
Thanks ahead of time, Juan
It depends on the file system, and on a number of other things. One common approach, though, is to hash the filenames and create subdirectories from the hash prefix; this limits the number of files per directory and therefore improves performance (again, depending on the FS).
ab/ab.png
lm/lm.png
cd/cd.png
xo/xo.png
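A minimal sketch of the hashing idea described above, assuming Python and MD5 (any stable hash works); the function name `image_path` and the two-hex-characters-per-level layout are illustrative choices, not something from the original answer:

```python
import hashlib
import os


def image_path(filename: str, root: str = "images", depth: int = 2) -> str:
    """Map a filename to a nested subdirectory derived from its hash.

    Each level consumes two hex characters of the digest, so with
    depth=2 no directory holds more than 256 subdirectories, keeping
    per-directory file counts manageable even with 500,000+ images.
    """
    digest = hashlib.md5(filename.encode("utf-8")).hexdigest()
    parts = [digest[i * 2:(i * 2) + 2] for i in range(depth)]
    return os.path.join(root, *parts, filename)
```

Because the path is derived purely from the filename, the application can recompute it on every request without any lookup table.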
You may also want to search SO for more on this subject.
That will depend on the OS and filesystem; usually you won't see a performance hit if you reference a file by its full path. Where you may run into problems is with maintenance scripts and the like that have to read a huge directory listing. I've always found it better in the long run to organize large numbers of files into some kind of logical directory structure.