I have an issue with a PHP script that calls a Bash script... in the PHP script an XML file is submitted, then the PHP script calls a Bash script that cuts the file into portions (for instance, if an XML file of 30,000 lines is submitted, the Bash script cuts it into portions of 10,000 lines, so there will be 3 files of 10,000 lines each).
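(Just to illustrate the cutting I mean: it is basically the same chunking that split -l would do, except that my script also adds an extra first and last line to each portion. This is not my real script, just an example of the idea:)

# example only, NOT my real script: chunk a file into pieces of 10,000 lines each
# produces xaa, xab, xac, ... in the current directory
split -l 10000 datasource.xml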

The file is submitted, the Bash script cuts the lines, but when the Bash script returns to the PHP script, the PHP script dies, and I don't know why... I tested the script on another server and it works fine... I don't think it is a memory problem, maybe it is a processor problem, I don't know, I don't know what to do, what can I do??? (I'm using the function shell_exec in PHP to call the Bash script)
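In case it helps, this is a simplified sketch of how the call is made (my full code is further down). The set_time_limit(0), escapeshellarg() and exec() parts are NOT what I have now (my real code uses shell_exec, as shown below); they are only in the sketch so the exit code can be seen and the PHP time limit ruled out:

<?php
// sketch only, not my real script: call the Bash script and capture output + exit code
set_time_limit(0); // just to rule out any PHP time limit while testing

$nfle = "testfiles/example.xml";   // example values; the real ones are generated in my script below
$ndir = "testfiles/example";
$norg = "example";

$cmd = "./crm_cutfile_v2.sh " . escapeshellarg($nfle) . " "
     . escapeshellarg($ndir) . " " . escapeshellarg($norg) . " 2>&1";

exec($cmd, $output, $status);   // unlike shell_exec(), exec() also gives back the exit code

echo "exit code: $status\n";
echo implode("\n", $output) . "\n";
?>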

The error only happens when the XML file has more than 8,000 lines; when the file has fewer than 8,000 everything is OK (this is relative, it depends on the amount of data, of strings, of characters that each line contains).

What else can you suggest??? (sorry for my bad English, I need to practice a lot xD) I leave the code here

PHP script (at the end, after the ?>, there is HTML & JavaScript code, but it doesn't show up here, only the JavaScript code... basically the HTML is just for uploading the file)



" . date('c') . ": $str
"; $file = fopen("uploadxmltest.debug.txt","a"); fwrite($file,date('c') . ": $str\n"); fclose($file); } try{ if(is_uploaded_file($_FILES['tfile']['tmp_name'])){ debug("step 1: the file was uploaded"); $norg=date('y-m-d')."_".md5(microtime()); $nfle="testfiles/$norg.xml"; $ndir="testfiles/$norg"; $ndir2="testfiles/$norg"; if(move_uploaded_file($_FILES['tfile']['tmp_name'],"$nfle")){ debug("step 2: the file was moved to the directory"); debug("memory_get_usage(): " . memory_get_usage()); debug("memory_get_usage(true): " . memory_get_usage(true)); debug("memory_get_peak_usage(): " . memory_get_peak_usage()); debug("memory_get_peak_usage(true): " . memory_get_peak_usage(true)); $shll=shell_exec("./crm_cutfile_v2.sh \"$nfle\" \"$ndir\" \"$norg\" "); debug("result: $shll"); debug("memory_get_usage(): " . memory_get_usage()); debug("memory_get_usage(true): " . memory_get_usage(true)); debug("memory_get_peak_usage(): " . memory_get_peak_usage()); debug("memory_get_peak_usage(true): " . memory_get_peak_usage(true)); debug("step 3: the file was cutted.
END"); } else{ debug("ERROR: I didnt move the file"); exit(); } } else{ debug("ERROR: I didnt upload the file"); //exit(); } } catch(Exception $e){ debug("Exception: " . $e->getMessage()); exit(); } ?> Test function uploadFile(){ alert("start"); if(document.test.tfile.value==""){ alert("First you have to upload a file"); } else{ document.test.submit(); } }

Bash script with AWK



#!/bin/bash

#For single messages (one message per contact)
function cutfile(){
 lines=$( cat "$1" | awk 'END {print NR}' )
 fline="$4";

 if [ -d "$2" ]; then
  exsts=1
 else
  mkdir "$2"
 fi

 cp "$1" "$2/datasource.xml"
 cd "$2"

 i=1
 contfile=1
 while [ $i -le $lines ]
 do 
  currentline=$( cat "datasource.xml" | awk -v fl=$i 'NR==fl {print $0}' )

  #creates first file
  if [ $i -eq 1 ]; then
   echo "$fline" >>"$3_1.txt"
  else
   #creates the rest of files when there are more than 10,000 contacts
   rsd=$(( ( $i - 2 ) % 10000 ))
   if [ $rsd -eq 0 ]; then
    echo "" >>"$3_$contfile.txt"
    contfile=$(( $contfile + 1 ))
    echo "$fline" >>"$3_$contfile.txt"
   fi
  fi

  echo "$currentline" >>"$3_$contfile.txt"
  i=$(( $i + 1 )) 
 done

 echo "" >>"$3_$contfile.txt"
 return 1
}


#For multiple messages (one message for all contacts)
function cutfile_multi(){
 return 1
}

cutfile "$1" "$2" "$3" "$4"
echo 1


thanks!!!!! =D