I am an unskilled but ambitious coder, which has led me into difficulties with this function (which we use within WordPress):

<?php

add_filter( 'the_content', 'scan_links_tables', 9 );

function scan_links_tables($content) {
    global $post;

    if ( is_singular('post') ) {

        // Wrap "Word1" in a link when it appears inside a <td>...</td> cell
        // and is not already part of a link or inside a tag.
        $search[] = '/\b(Word1)\b(?![^<>]*<\/a>|[^<]*>)((?:(?!<td>).)*)<\/td>/s';
        $replace[] = '<a href="http://website1.com/">$1</a>$2</td>';

        // Repeat the same search and replace for ~700 different word/link combos

        $content = preg_replace($search, $replace, $content);

    }

    return $content;
}
?>
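
To make the intent concrete, here is a minimal standalone example of what one search/replace pair does to a sample table row (Word1, website1.com, and the sample HTML are just the placeholders from the snippet above):

<?php

$sample  = '<table><tr><td>Word1 is listed here</td></tr></table>';

$pattern = '/\b(Word1)\b(?![^<>]*<\/a>|[^<]*>)((?:(?!<td>).)*)<\/td>/s';
$replace = '<a href="http://website1.com/">$1</a>$2</td>';

// Prints: <table><tr><td><a href="http://website1.com/">Word1</a> is listed here</td></tr></table>
echo preg_replace($pattern, $replace, $sample);
?>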

So essentially I believe the problem may be the number of substitutions I am doing. I receive a 500 Internal Server Error the moment I make this live on my website. This is the error in my logs:

Premature end of script headers: php-fastcgi.fcgi

Any tips on either how to fix the timeout problem, or a better way to structure the preg_replace? I have looked into HTML parsing a bit, but this works perfectly in testing with fewer terms and I'm not sure a parser would be any more efficient. The regex works fine; I am mostly worried about speed/efficiency for the server. Thanks in advance for any input! :)
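
For what it's worth, one direction I have been toying with (rather than ~700 separate patterns) is collapsing everything into a single preg_replace_callback pass, roughly like the sketch below, which would replace the function body above. The $links array, Word2, and website2.com are placeholder stand-ins, and this assumes PHP 5.3+ for the closures:

<?php

function scan_links_tables($content) {

    if ( ! is_singular('post') ) {
        return $content;
    }

    // Placeholder word => URL map standing in for the ~700 combos.
    $links = array(
        'Word1' => 'http://website1.com/',
        'Word2' => 'http://website2.com/',
    );

    // Build one alternation of all the words, quoted for use inside the pattern.
    $alternation = implode('|', array_map(function ($word) {
        return preg_quote($word, '/');
    }, array_keys($links)));

    // Same guards as before (not already linked, not inside a tag), plus a
    // lookahead that keeps the "only inside a <td>...</td> cell" restriction.
    $pattern = '/\b(' . $alternation . ')\b(?![^<>]*<\/a>|[^<]*>)(?=(?:(?!<td>).)*<\/td>)/s';

    return preg_replace_callback($pattern, function ($m) use ($links) {
        return '<a href="' . $links[$m[1]] . '">' . $m[1] . '</a>';
    }, $content);
}
?>

No idea yet whether one big alternation is actually faster than 700 passes over the content, but it at least keeps the word list in one place.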

EDIT: Blah, it turned out to be a problem with my regex.

Changed it from:

/\b(Word1)\b(?![^<>]*<\/a>|[^<]*>)((?:(?!<td>).)*)<\/td>/s

to:

/\b(Word1)\b(?![^<>]*<\/a>|[^<]*>)((?:(?!<td>).)*?)<\/td>/s

The issue was that when a page did not have a table (and therefore no td), the expression broke. I needed to make that part of the expression ungreedy (lazy) so it works in both table and non-table cases.
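
For anyone hitting something similar: one quick way to tell whether preg_replace is actually erroring out (rather than the server timing out) is to check its return value, since it returns NULL when PCRE gives up, e.g. after hitting the backtrack limit. A minimal check inside the filter:

<?php

$result = preg_replace($search, $replace, $content);

if ($result === null) {
    // PCRE bailed out; preg_last_error() reports why
    // (e.g. PREG_BACKTRACK_LIMIT_ERROR for runaway backtracking).
    error_log('scan_links_tables: preg_replace failed, error code ' . preg_last_error());
} else {
    $content = $result;
}
?>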