We have a script which pulls some XML from a remote server. If the script runs on any server other than production, it works.

Upload it to production, however, and it fails. It uses cURL for the request, but it doesn't matter how we make it - fopen, file_get_contents, raw sockets - it simply times out. The same thing happens if I use a Python script to request the URL.
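For what it's worth, here is the kind of minimal Python sketch I'd use on the live box to confirm it really is a timeout rather than some other failure (the URL and timeout value are placeholders, not your real ones):

```python
import socket
import urllib.error
import urllib.request


def fetch_xml(url, timeout=30):
    """Fetch a URL, separating timeouts from other failures.

    Raises TimeoutError on a connect timeout so the caller can tell
    "the server never answered" apart from an HTTP-level error.
    Note: a timeout mid-read may raise socket.timeout directly.
    """
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.read().decode("utf-8")
    except urllib.error.URLError as exc:
        # urllib wraps connect timeouts in URLError(reason=socket.timeout)
        if isinstance(exc.reason, socket.timeout):
            raise TimeoutError(f"timed out after {timeout}s: {url}") from exc
        raise
```

Running this from dev and from production with the identical URL at least tells you whether the failure mode differs (timeout vs. refused vs. HTTP error).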

The exact same script, given a different URL to query, works - every time. Obviously it doesn't return the XML we're looking for, but it DOES return SOMETHING - it can connect to the remote server.

If the URL is requested from the command line using, for instance, curl or wget, again, data is returned. It isn't the data we're looking for (in fact, it returns an empty root element) but something DOES come back.

Oddly enough, if we strip query string parameters out of the URL (the full URL has 7 query string parameters and runs to around 450 characters in total) the script returns the same empty XML response. Certain combinations of query string parameters once again cause the script to break.
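Since certain parameter combinations seem to be the trigger, it may be worth generating the reduced URLs systematically instead of by hand. A small sketch (the example URL below is made up):

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit


def variants_without_one_param(url):
    """Yield (dropped_key, variant_url) pairs, each missing one QS param.

    Feeding each variant to the failing script helps bisect which of
    the 7 parameters (or which pair) triggers the bad response.
    """
    parts = urlsplit(url)
    params = parse_qsl(parts.query, keep_blank_values=True)
    for i, (key, _) in enumerate(params):
        reduced = params[:i] + params[i + 1:]
        yield key, urlunsplit(parts._replace(query=urlencode(reduced)))
```

If dropping one specific parameter always restores a response, that parameter's value (length, encoding, characters) is the place to look.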

This, understandably, has me completely baffled - it seems to work in every circumstance EXCEPT the one it needs to work in. We can get a response on our dev servers, we can get a response on the command line, we can get a response if we drop certain QS parameters - we just can't get the response we want with the correct URL on the LIVE server.

Does anybody have any suggestions at all? I'm at my wits' end!

Run Wireshark and see how far the request gets. It could be a firewall problem, a DNS resolution problem, among other things.
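Before reaching for a full packet capture, a bare TCP probe can narrow it down. If this times out from the live server but succeeds from dev, the problem is at the network level (firewall, routing), not in the script (host and port here are placeholders):

```python
import socket


def probe(host, port=80, timeout=10):
    """Return True if a plain TCP connection to host:port completes.

    Distinguishes "can't even open a socket" from "connects but the
    application-level response is wrong or slow".
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

If the probe succeeds but the full request still times out, the capture becomes more interesting: you'd be looking for whether the request is sent and the response never arrives, or arrives and is discarded.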

Also, try bumping your curl timeout up to something much higher, like 300s, and see how it goes.