I have a chat program that pushes JSON data from Apache/PHP to Node.js over a TCP socket:

// Node.js (JavaScript)
var net = require('net');

var phpListener = net.createServer(function(stream)
{
    stream.setEncoding("utf8");
    stream.on("data", function(txt)
    {
        var json = JSON.parse(txt);

        // do stuff with json
    });
});
phpListener.listen(8887, 'localhost');

// Apache (PHP)
$sock = stream_socket_client("tcp://localhost:8887");
$written = fwrite($sock, $json_string);
fclose($sock);

The trouble is that when the JSON string is large enough (somewhere over 8k), the message gets split into multiple chunks and the JSON parser fails. PHP returns a $written value equal to the length of the entire string, but the data event handler fires two or more times.

Should I be attaching the handler to a different event, or is there a way to cache the text across event fires in a way that won't succumb to race conditions under heavy load? Or is there some other solution I haven't considered?

Thanks!

You should try using a buffer to cache the data, as Node.js tends to split data in order to improve performance.

http://nodejs.org/api.html#buffers-2

You can buffer your whole request and then call the function with the data stored in it.
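
As a minimal sketch of that idea (the newline terminator and the buffer variable are additions of mine, not something from the original code): have PHP end each message with "\n", and have Node.js accumulate chunks until it sees that terminator, then parse.

// Node.js (JavaScript) - buffer chunks until a complete, newline-terminated message arrives
var net = require('net');

var phpListener = net.createServer(function(stream)
{
    stream.setEncoding("utf8");

    var buffer = "";    // per-connection cache of text received so far

    stream.on("data", function(txt)
    {
        buffer += txt;

        // extract every complete message currently in the cache
        var boundary = buffer.indexOf("\n");
        while (boundary !== -1)
        {
            var message = buffer.substring(0, boundary);
            buffer = buffer.substring(boundary + 1);

            var json = JSON.parse(message);
            // do stuff with json

            boundary = buffer.indexOf("\n");
        }
    });
});
phpListener.listen(8887, 'localhost');

Because buffer lives in the connection's closure, each socket gets its own cache, so concurrent connections don't race with each other. The PHP side just has to send the terminator as well, e.g. fwrite($sock, $json_string . "\n"); (this assumes the JSON is not pretty-printed, so it contains no raw newlines of its own).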

TCP sockets don't handle buffering for you. How could it? It doesn't know what application-layer protocol you're using and therefore has no idea what a "message" is. It's up to you to design and implement another protocol on top of it and handle any necessary buffering.
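
As a concrete illustration of such a protocol (purely a sketch added here, not part of the answer): prefix each message with a fixed-width length header, so the reader knows exactly how much data belongs to one message. This assumes ASCII-only JSON so that PHP's byte count matches Node's character count.

// Apache (PHP) - hypothetical framing: a 10-digit length header in front of each message
$sock = stream_socket_client("tcp://localhost:8887");
fwrite($sock, sprintf("%010d", strlen($json_string)) . $json_string);
fclose($sock);

// Node.js (JavaScript) - reader waits until a whole message has arrived before parsing
var net = require('net');
net.createServer(function(stream)
{
    var buffer = "";
    stream.setEncoding("utf8");
    stream.on("data", function(chunk)
    {
        buffer += chunk;
        while (buffer.length >= 10)
        {
            var length = parseInt(buffer.substring(0, 10), 10);
            if (buffer.length < 10 + length) break;   // message incomplete, wait for more data

            var json = JSON.parse(buffer.substring(10, 10 + length));
            buffer = buffer.substring(10 + length);
            // do stuff with json
        }
    });
}).listen(8887, 'localhost');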

But Node.js does have a built-in application-layer protocol on top of TCP that handles the buffering for you automatically: the http module. If you use the http module instead of the net module for this, you won't have to worry about packet fragmentation and buffering.
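
To make that concrete, here is a rough sketch of what the http version could look like (the URL, port, and the use of cURL on the PHP side are my own choices for illustration): PHP POSTs the JSON to a Node http server, and Node parses the body only once the "end" event signals that the whole request has arrived.

// Node.js (JavaScript) - http server; "end" fires once the request body is complete
var http = require('http');

http.createServer(function(req, res)
{
    var body = "";
    req.setEncoding("utf8");
    req.on("data", function(chunk) { body += chunk; });
    req.on("end", function()
    {
        var json = JSON.parse(body);
        // do stuff with json
        res.writeHead(200);
        res.end("ok");
    });
}).listen(8887, 'localhost');

// Apache (PHP) - POST the JSON instead of writing to a raw socket
$ch = curl_init("http://localhost:8887/");
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $json_string);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
curl_close($ch);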