I have a website that lists links to blog posts in real time. The problem is that the pages are slow to load, since they read through data from the various source sites on every request.

I wrote a PHP script that generates a static HTML version of every page. It runs once an hour. The problem is that the PHP script times out before it finishes all of the pages. I know I could increase the execution time allowed for PHP scripts, but that doesn't seem like the right way to handle the problem.

Is there a different way to do this? I just don't know what to start looking for - Perl? Java? Python? How do these scripts run on a server? What should I look for from my hosting company?

Python with urllib2 will most likely do a good job. Also, do I understand this right: you have a site that aggregates data from other sites, and it's all generated as static HTML? It sounds like you're kind of using HTML as a database, so maybe get a proper one.
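
A minimal sketch of what I mean, assuming Python 2 (where urllib2 lives; in Python 3 it's urllib.request). The URLs and output filenames are made up - substitute your own:

    import urllib2

    # Hypothetical source pages and output files -- replace with yours.
    PAGES = [
        ('http://example.com/feed1', 'blog1.html'),
        ('http://example.com/feed2', 'blog2.html'),
    ]

    for url, outfile in PAGES:
        try:
            html = urllib2.urlopen(url, timeout=30).read()
        except urllib2.URLError:
            continue  # skip sources that are down; retry next run
        # Real code would parse/transform here; this just saves the raw page.
        with open(outfile, 'w') as f:
            f.write(html)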

Since your original problem is one of network latency ("pages are slow to load"), I see no reason to think that PHP is the bottleneck here. I doubt changing languages will affect your script's run time.

Another solution would be to use a database, and not bite off so much work at once. Create a table listing the sites you pull from, and store when each was last pulled. Then have the cron job pull just the one or two that haven't been pulled recently. Have it run frequently; you'll still always have fresh data, but the script will have a much easier time of it because it isn't trying to do so much at once. This idea will scale well.
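
Roughly like this (a sketch only - sqlite3 used for illustration, and the table and column names are placeholders):

    import sqlite3
    import time
    import urllib2

    db = sqlite3.connect('sites.db')
    db.execute("""CREATE TABLE IF NOT EXISTS sites
                  (url TEXT PRIMARY KEY, outfile TEXT,
                   last_pulled REAL DEFAULT 0)""")

    # Each cron run handles only the two sites that have gone longest unpulled.
    rows = db.execute(
        "SELECT url, outfile FROM sites ORDER BY last_pulled ASC LIMIT 2"
    ).fetchall()

    for url, outfile in rows:
        try:
            html = urllib2.urlopen(url, timeout=30).read()
        except urllib2.URLError:
            continue  # leave last_pulled alone so this site is retried soon
        with open(outfile, 'w') as f:
            f.write(html)
        db.execute("UPDATE sites SET last_pulled = ? WHERE url = ?",
                   (time.time(), url))

    db.commit()
    db.close()

Run that every few minutes from cron and each invocation finishes quickly, so nothing ever hits a timeout.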