Training, Open Source computer languages
For 2023 (and 2024 ...) - we are now fully retired from IT training.
We have made many, many friends over 25 years of teaching about Python, Tcl, Perl, PHP, Lua, Java, C and C++ - and MySQL, Linux and Solaris/SunOS too. Our training notes are now very much out of date, but due to upward compatibility most of our examples remain operational and even relevant, and you are welcome to make use of them "as seen" and at your own risk.

Lisa and I (Graham) now live in what was our training centre in Melksham - happy to meet with former delegates here - but do check ahead before coming round. We are far from inactive - rather, enjoying the times that we are retired but still healthy enough in mind and body to be active!

I am also active in many other areas and still look after a lot of web sites - you can find an index ((here))
Data truncation copying from file to mySQL table

Posted by Helenb (Helenb), 25 January 2005
I have a program which reads lines from a text file, formats the data and dumps it into a mySQL table (code below). It works fine on a small text file, but terminates abruptly when dealing with a longer file. My file has 1500 lines, and is opened and read correctly - the char count and line count at the beginning are OK. But when it runs, it never gets to the end; it stops a variable number of lines in - normally around 1100.
If I don't write data to a table, the file is read correctly and the program terminates cleanly.
I've tried putting in a print statement to show progress, and the number of lines read does not seem to be fewer when the debug message is longer. Yet it feels like a timing thing - it takes several seconds to run.
All running on one laptop. I've even tried disconnecting the laptop from the network and running stand-alone, and still no difference.

I can chop my data entry file into segments to make progress, but I'd love to know what's wrong.

PS Presuming this goes to Graham, I'm still working on getting on the PHP course next week, and feeling optimistic.


function createExportedTable($fname,$create,$tablename){
  if ($info = file($fname)) {
     print "file size of $fname is ".filesize($fname)." chars in ".count($info)." lines<br>";
# debug code to sort out CRLF at end of line - left in for info
#      print "Raw line  ".$info[0]."<br>";
#      print "raw length ".strlen($info[0])."<br>";
#      print "trimmed length ".strlen(trim($info[0]))."<br>";
     $flist = explode(",",trim($info[0]));
#      print "trimmed hex last field ".bin2hex($flist[7])."<br>";
#      print count($flist) . " fields created<br>";
#      print "Table created - fields " .join(",",$flist)."<br><br>";
/******************************** */

/* decode line 1 to get field names, and then create table if necessary */
     $flist = array_map('hbtrim',explode(",",trim($info[0])));
     print ("field names: ".join(", ",$flist)."<br>");

     if ($create) {
           print "creating $tablename <br>";
           mysql_query("drop table if exists ". $tablename);
           $query = ("create table ".$tablename." (".join(" text, ",$flist).
                 " text, rid int not null primary key auto_increment)");
           print "Create SQL: ".$query."<br>";
           mysql_query($query);  /* run the create */
     }

/* now process the rest of the file to get the data and import it */
     for ($ln=1; $ln<count($info); $ln++) {
#            print "Data record ...$ln <br>";
           $vlist = PervasiveDataTrim(explode(",",trim($info[$ln])));
           $query = ("Insert into ".$tablename." (".join(", ",$flist).") values (".
                 join(", ",$vlist).")");
#            print "SQL: ".$query."<br>";
           $rset = mysql_query($query);
           if (! $rset) print "OH SHXXXXXXXXXXXXdear<br>";
           if ($ln % 100 == 0) print ($ln / 100)."<br>";
     }
     print "$ln records created<br>";
  } else {
     print "input table not found";
  }
}

function hbtrim ($item1){
#      print "hbtrim  - ".$item1."  -  ".trim($item1,"\"")." length ".strlen($item1).
#      " hex ".bin2hex($item1)."<br>";
     return (trim($item1,"\""));
}

function PervasiveDataTrim ($inarray){
     /* Pervasive exports all data with an initial zero field before the data. This function
        deletes the initial, and every alternate field in the array (i.e. 0,1,2,3,... -> 1,3...)
        so the data can be imported to another file */
     $outarray = array();
     if (count($inarray) % 2 == 0) {
           for ($i = 1; $i<count($inarray); $i=$i+2){
                 $outarray[] = $inarray[$i];
           }
     }
     return ($outarray);
}

<title>Hi there</title>
This is HTML<br>
<?php
error_reporting(E_PARSE);
#error_reporting(E_ERROR | E_WARNING | E_PARSE);
#   print "This is php<br>";
print ("Create tables var value:  ".$_REQUEST['anew']."<br>");
switch ($_REQUEST['tabletype']){

       case "ddict":
             print "Create ddict<br>";
             createExportedTable($_REQUEST['file'],$_REQUEST['anew'], "datadict");
             break;

       case "codes":
             print "Create codes table<br>";
             createExportedTable($_REQUEST['file'],$_REQUEST['anew'], "codes");
             break;

       case "dmodel":
             print "Create data model table<br>";
             createExportedTable($_REQUEST['file'],$_REQUEST['anew'], "datamodel");
             break;

       default:
             print "nothing to do<br>";
}
?>

<form>
Name of file: <input name=file><br>
Table Patient notes dump <input type=checkbox name=tabletype value=pnotes><br>
Table data dictionary dump <input type=checkbox name=tabletype value=ddict><br>
Table codes dump <input type=checkbox name=tabletype value=codes><br>
Table datamodel dump <input type=checkbox name=tabletype value=dmodel><br>
Check if I must refresh tables first <input type=checkbox name=anew value=1><br>
and <input type=submit><br>
</form>



This is a shortened version - just duplicate to have > 1100 lines.

"0","@CORRESPON","0","ZDA01","0","00","0","00","0","","0","0","0","Correspondence for Doctor     ","0","                    ","0","  ","0","                    ","0","  ","0","                    ","0","  ","0",""
"0","@DOCTOR   ","0","ZDA01","0","00","0","00","0","","0","0","0","Correspondence for Doctor     ","0","                    ","0","  ","0","                    ","0","  ","0","                    ","0","  ","0",""
"0","@CORRESPON","0","ZDA02","0","00","0","00","0","","0","0","0","Correspondence for Consultant ","0","                    ","0","  ","0","                    ","0","  ","0","                    ","0","  ","0",""
"0","@CONSULTAN","0","ZDA02","0","00","0","00","0","","0","0","0","Correspondence for Consultant ","0","                    ","0","  ","0","                    ","0","  ","0","                    ","0","  ","0",""
"0","@PATIENT  ","0","ZDA03","0","00","0","00","0","","0","0","0","Patient correspondence to     ","0","                    ","0","  ","0","                    ","0","  ","0","                    ","0","  ","0",""
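To make the alternating layout concrete, here is a small sketch of the odd-position stripping that PervasiveDataTrim performs - using a made-up three-value row rather than the real export lines above:

```php
<?php
// Hypothetical miniature of a Pervasive export row: each real value
// is preceded by a dummy "0" field, exactly as in the samples above.
$line = '"0","a","0","b","0","c"';

$fields = explode(",", trim($line));
$values = array();
if (count($fields) % 2 == 0) {               // only well-formed even-length rows
    for ($i = 1; $i < count($fields); $i += 2) {
        $values[] = trim($fields[$i], "\""); // keep odd positions, drop the quotes
    }
}
print join("|", $values) . "\n";             // prints a|b|c
```

The real function keeps the surrounding quotes on each value, which conveniently double as string delimiters in the generated insert statement; the version here strips them just to show the field selection clearly.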

Posted by admin (Graham Ellis), 25 January 2005
Try a call to set_time_limit(180); early in your PHP code.


Fact a: PHP processes are normally server based and run quite fast.
Fact b: It takes just a program bug or two to set up an infinite loop, which can be hard to kill at a distance from a browser.

Because of these two, PHP scripts are configured to time out after only a relatively short time when they're run, and that could be what's happening in your case.

My example call will add 180 seconds of cpu time to your "allowance" for the current script;  as the default is usually 10 or 30 seconds, this should be more than enough time.

Edit to add - I understand that this did fix the problem.
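For anyone hitting the same limit, here is a minimal sketch of the fix; set_time_limit and the max_execution_time setting are standard PHP, and the 180-second figure is simply the value suggested above:

```php
<?php
// The default max_execution_time is often 30 seconds on a web server;
// a long import loop can exceed that and be killed part-way through.
print "limit before: " . ini_get('max_execution_time') . "<br>";

set_time_limit(180);   // restart the clock with a 180 second allowance

print "limit after: " . ini_get('max_execution_time') . "<br>";

// In a very long loop you can also renew the allowance periodically,
// e.g. (hypothetical loop variable $ln as in the import code above):
// if ($ln % 500 == 0) set_time_limit(180);
```

Each call to set_time_limit restarts the timeout counter from that point, so a periodic call inside the loop keeps a well-behaved import alive however long the file is, while a genuinely stuck loop will still be killed.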

This page is a thread posted to the opentalk forum at and archived here for reference. To jump to the archive index please follow this link.


© WELL HOUSE CONSULTANTS LTD., 2024: Well House Manor • 48 Spa Road • Melksham, Wiltshire • United Kingdom • SN12 7NY
PH: 01144 1225 708225 • FAX: 01144 1225 793803 • EMAIL: • WEB: • SKYPE: wellho