I want to check my blog posts for occurrences of particular foreign words, and then link those words to sound files so they can be played.

I have an XML file with 2500 words that I have sound files for, and I'm wondering what's the best way to store and traverse this list? The list isn't likely to change, and the function will be run on each blog post when it is viewed in full (not when excerpts are shown on archive pages etc.).

The XML file is 350KB, which I was loading into PHP with simplexml_load_file. I thought this was a little large, so I converted it into a PHP file containing an array of the words indexed by string, which brings the size down to around 60KB.
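A minimal sketch of that conversion step, assuming an XML layout like `<words><word text="..." file="..."/></words>` (the element and attribute names are placeholders; adapt them to the real file):

```php
<?php
// Sketch: convert a word-list XML file into an includable PHP source
// file containing an array keyed by the word itself. The XML structure
// shown here is an assumption -- adjust to your actual schema.

$xml = simplexml_load_string(<<<XML
<words>
  <word text="bonjour" file="bonjour.mp3"/>
  <word text="merci"   file="merci.mp3"/>
</words>
XML);

$map = [];
foreach ($xml->word as $w) {
    $map[(string) $w['text']] = (string) $w['file'];
}

// var_export() emits valid PHP source, so the array can be written
// once and then loaded on every request with a cheap include.
file_put_contents(
    'words.php',
    '<?php return ' . var_export($map, true) . ';'
);

$loaded = include 'words.php';
echo $loaded['merci'], "\n"; // merci.mp3
```

Running the export once (or whenever the list changes) means no XML parsing happens on page views at all; `include` of a plain array is also cached by the opcode cache.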

Should I be worrying so much about the file size, or more about how long it will take to search through the data? Is there a better way of doing this, or would it be best done in a database? Any help would be appreciated!

If you find that parsing and matching the XML file against the blog post happens in reasonable time, then there's no need to optimize. Optimize when you notice a significant negative impact.

The simplest approach would probably be to simply cache the processed pages. Whenever your blog post or the word list changes, invalidate the cache, so the page gets processed anew the next time it's requested.
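A sketch of that cache-and-invalidate idea: key the cache entry on a hash of the post content plus the word list's modification time, so editing either one naturally produces a fresh key. `process_post()` here is a stand-in for whatever linking routine is actually used:

```php
<?php
// Sketch of page-level caching with implicit invalidation. The cache
// key changes whenever the post text or the word-list file changes,
// so stale entries are simply never looked up again.

function process_post(string $html): string {
    // Placeholder for the real word-linking logic.
    return str_replace('bonjour', '<a href="bonjour.mp3">bonjour</a>', $html);
}

function cached_process(string $html, string $wordListFile, string $cacheDir): string {
    $key  = md5($html . '|' . filemtime($wordListFile));
    $path = $cacheDir . '/' . $key . '.html';

    if (is_file($path)) {
        return file_get_contents($path);   // cache hit: no processing
    }
    $out = process_post($html);            // cache miss: do the work once
    file_put_contents($path, $out);
    return $out;
}
```

With this in place the word-matching cost is paid once per post edit rather than once per page view, which makes the array-vs-XML question mostly moot for throughput.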

Converting your file into a PHP array is just fine (you can't do better than that performance-wise unless you go as far as writing your own extension). Not only is the input file smaller, but you have also taken care of a fairly CPU-heavy (relative to your other operations) XML-parsing step.

An objection might be raised that an array forces you to read all the data in at once, but at 60KB that's not a problem.

For searching the data, PHP arrays are associative, so they offer very good performance in this kind of scenario.
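To illustrate why the associative array works well here: split the post into word tokens and test each one with `isset()`, a constant-time hash lookup. The two entries below are placeholders for the real 2500-word list:

```php
<?php
// Sketch of the lookup itself. Each word token in the post is checked
// against the associative array; isset() on a string key is an O(1)
// hash-table probe, so the cost scales with post length, not list size.

$words = ['bonjour' => 'bonjour.mp3', 'merci' => 'merci.mp3'];

$post   = 'Say bonjour and merci to everyone';
$linked = preg_replace_callback('/\pL+/u', function ($m) use ($words) {
    $w = mb_strtolower($m[0]);
    return isset($words[$w])
        ? '<a href="' . $words[$w] . '">' . $m[0] . '</a>'
        : $m[0];
}, $post);

echo $linked, "\n";
// Say <a href="bonjour.mp3">bonjour</a> and <a href="merci.mp3">merci</a> to everyone
```

Note the direction of the scan: iterating over the post's words and probing the array is far cheaper than iterating over 2500 list entries and searching each one in the post.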

Overall I'd say your approach is the right one.

Looking words up in an indexed array stored in a PHP file is less time-consuming than searching through the XML.

Would it be an option to cache the data using memcached?

By far the most extensible solution would be to use a database. It can handle huge amounts of data without significant performance drops, and if you had more data in the future it would be trivial to add. In this case you could use SQLite, which requires fairly little in terms of installation and configuration but is quite fast and powerful.
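A minimal sketch of that route using SQLite through PDO (the pdo_sqlite driver ships with most PHP builds). An in-memory database is used here for brevity; a real setup would point the DSN at a file, e.g. `sqlite:/path/to/words.db`:

```php
<?php
// Sketch: store the word list in SQLite and look words up by key.
// The table and column names are illustrative, not from the original post.

$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$db->exec('CREATE TABLE words (word TEXT PRIMARY KEY, sound_file TEXT)');

$insert = $db->prepare('INSERT INTO words (word, sound_file) VALUES (?, ?)');
foreach (['bonjour' => 'bonjour.mp3', 'merci' => 'merci.mp3'] as $w => $f) {
    $insert->execute([$w, $f]);
}

// The PRIMARY KEY gives an index on word, so individual lookups stay
// fast even if the list grows well beyond 2500 entries.
$find = $db->prepare('SELECT sound_file FROM words WHERE word = ?');
$find->execute(['merci']);
echo $find->fetchColumn(), "\n"; // merci.mp3
```

For a fixed 2500-entry list this is arguably overkill compared to the include-an-array approach, but it keeps the door open if the data grows or needs richer queries.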

Your solution using a PHP array (presumably loaded via include) is a fine one, and I wouldn't worry too much about changing it. You're absolutely right, however, to get rid of the XML file. Parsing that on every request would be both overly work-intensive and slow.