For 2023 (and 2024 ...) - we are now fully retired from IT training.
We have made many, many friends over 25 years of teaching about Python, Tcl, Perl, PHP, Lua, Java, C and C++ - and MySQL, Linux and Solaris/SunOS too. Our training notes are now very much out of date, but due to upward compatibility most of our examples remain operational and even relevant, and you are welcome to make use of them "as seen" and at your own risk.

Lisa and I (Graham) now live in what was our training centre in Melksham - happy to meet with former delegates here, but do check ahead before coming round. We are far from inactive - rather, we are enjoying this time, retired but still healthy enough in mind and body to be active!

I am also active in many other areas and still look after a lot of web sites - you can find an index ((here))
Connecting to MySQL over an unreliable network

Posted by aparks (aparks), 25 May 2006
I have a Perl script that connects to a server running a network packet sniffer and grabs the data, one line at a time. I pull out the source IP and MAC address, and the destination IP and MAC address.
Here's the (very cut down) code; naturally, in the real code there's lots of error checking.

Code:
open (RA, "/usr/local/ra -S 172.16.1.1 |");   # read from the client program via a pipe
while (<RA>) {
     chomp;
     my ($srcip,$destip,$srcmac,$destmac) = split(/,/,$_);
}


/usr/local/ra is the client program. It queries the monitoring server at 172.16.1.1 and gets back a line of data similar to:

172.16.1.2,192.168.34.34,01:23:45:67:89:0A,98:76:54:32:10:AB

So I'd like to get this data into a MySQL database. There are a couple of other relevant issues here too:
- Connectivity between the MySQL server and the host on which this script will run is not that robust (the bandwidth is saturated with students using P2P software, which is why we need this in the first place!).
- I want to leave this running for days on end, so I need the most robust solution in terms of connectivity to the MySQL server. If the script can't see the server, I don't want it to die on me; it should just carry on until it can see the server again. It doesn't matter if it misses records, so long as it reconnects and starts writing to the database again once there is decent network connectivity.

Here are the questions!

1. Where best to put in the $dbh = DBI->connect and $dbh->disconnect() statements? Should they be outside the "while (<RA>)" statement above or should I connect each time I want to write a record to the database and then disconnect again immediately afterwards? The latter sounds more robust for our environment but perhaps not very efficient?

2. Might it be better to store up a few of these records in the Perl script (I can write them all to a hash or something) and then populate the SQL database from the hash every time I get (say) 1000 records?

Any thoughts on the above would be greatly appreciated!

Graham - hope you and Lisa are both well. Received the latest "Of Course" a couple of days ago, a great read as always. Even my wife looked through it - she doesn't work in IT but is a keen photographer and was most impressed with the Wiltshire montage!

All the best,

Adrian

Posted by admin (Graham Ellis), 25 May 2006
Hi, Adrian - I'm in Saudi Arabia at the moment ... access to the web is patchy here, so I'm cutting and pasting your question and I'll come back with an answer tomorrow.  Funnily enough, we had a similar issue when our live server was feeling the strain a couple of months back, so I may have some useful comments!

Posted by aparks (aparks), 25 May 2006
Thanks Graham - much appreciated!

Posted by admin (Graham Ellis), 26 May 2006
Adrian, we had a similar situation on our web site a few weeks back, where a denial of service by some twits with more bandwidth than sense was bringing up to 1000 MySQL connection requests at a time to the server. Similar, but not the same - and in PHP too, where we had to take immediate action to get back to our web site visitor rather than being able to simply delay an update.   Our solution:

1. We used mysql_pconnect - in other words, we picked up an already-open connection to the database from a previous page where possible.  In your situation, the equivalent advice is to connect once when your program starts, but then to re-run the connect should an insert fail - "Oops - something has gone wrong, let's have another go".

2. We put our connect into a loop to try 10 times over, with a one second delay between each - there's a Perl sketch of this just after this list. In your situation I would recommend against that; rather, I would suggest you buffer up the records you want to log if an insert fails and a single attempt to reconnect also fails.

3. If we still failed after 10 attempts, our web site continued to function, but with a "Stop Press" box on each page - "Our server is very busy at present and some database information may not be available".

The "Stop Press" also pops up with a less dramatic message / a warning to us if there are problems before they reach the crisis level; the mechanism is in any case a bit of a self-defence trick - we have so many good friends out there who visit ouur site more often that I had realised, and they were all emailing me to tell me I had problems.  Fabulous, except I was concentrating on both answering them AND fixing the issue.

Quote:
1. Where best to put in the $dbh = DBI->connect and $dbh->disconnect() statements? Should they be outside the "while (<RA>)" statement above or should I connect each time I want to write a record to the database and then disconnect again immediately afterwards? The latter sounds more robust for our environment but perhaps not very efficient?


Connect ONCE.  If you find that an insert fails, try to connect again. If that fails, question (2) provides your answer ...
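In DBI terms, that connect-once-and-reconnect-on-failure pattern might look something like the sketch below; the table name (packetlog) and the connection details are made up for illustration:

Code:
use strict;
use warnings;
use DBI;

my $dsn = "DBI:mysql:database=netlog;host=dbhost";   # placeholder details
my $dbh = DBI->connect($dsn, "user", "password", { PrintError => 0 });

# Write one record; if the insert fails, reconnect once and retry once.
sub insert_record {
    my @fields = @_;    # srcip, destip, srcmac, destmac
    my $sql = "INSERT INTO packetlog (srcip,destip,srcmac,destmac)
               VALUES (?,?,?,?)";
    return 1 if $dbh && $dbh->do($sql, undef, @fields);
    $dbh = DBI->connect($dsn, "user", "password", { PrintError => 0 });
    return ($dbh && $dbh->do($sql, undef, @fields)) ? 1 : 0;
}

If insert_record returns 0, that's your cue to start buffering - which brings us to question (2).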

Quote:
2. Might it be better to store up a few of these records in the Perl script (I can write them all to a hash or something) and then populate the SQL database from the hash every time I get (say) 1000 records?


Yes - if you don't need the database to be instantly updated as every logged event happens, I would store records in a LIST, then attempt to populate the database from the list whenever the number of items in it reaches a multiple of (say) 100. Here's some pseudocode:

Code:
push @logrecs,$newrecord;
unless (@logrecs % 100) {
       for (my $try = 0; $try < 2; $try++) {
               # Run insert code (including your split)
               if ($succeeded) {
                       undef @logrecs;    # buffer written - start afresh
                       last;
               }
               last if ($try == 1);       # second failure - keep buffering
               # reopen (reconnect) code goes here
       }
}


I note you say that you don't mind losing a few records, but with this scheme you shouldn't.  You might want to consider putting in some sort of automated email alarm once the record log exceeds (say) 100,000 records - you'll know an appropriate threshold - and you might want to consider buffering via a local text file if you have serious issues with lots of buffer-fulls.  Finally, if you don't want to lose any records that are stored in memory when you terminate your monitoring daemon - e.g. when you reboot - you should terminate on a signal and flush the records out as part of the termination handler.  Probably a subject for a further post if you need to go there.
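In case it helps as a starting point, the signal idea comes out roughly like this - flush_buffer() here is a made-up routine that you would fill in to empty @logrecs to the database, or to a fallback text file if the database is unreachable:

Code:
use strict;
use warnings;

our @logrecs;                      # the buffer from the pseudocode above

sub flush_buffer {
    # placeholder - write @logrecs to the database if you can reach it,
    # or append the records to a local text file if you can't
    @logrecs = ();
}

sub shutdown_handler {
    flush_buffer() if @logrecs;    # don't lose what's still in memory
    exit 0;
}
$SIG{TERM} = \&shutdown_handler;   # sent at system shutdown / reboot
$SIG{INT}  = \&shutdown_handler;   # Ctrl-C while testing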

Quote:
Graham - hope you and Lisa are both well. Received the latest "Of Course" a couple of days ago, a great read as always. Even my wife looked through it - she doesn't work in IT but is a keen photographer and was most impressed with the Wiltshire montage!


Many thanks, Adrian - it's all so much more worthwhile when we hear comments like this.  Lisa and I are both well, if rushed off our feet. About half of the newsletters have been mailed out so far, and the other half go out at the end of the Bank Holiday weekend - we prefer them not to land right in the holiday. And we're emailing a proportion of our database ahead of time too, just to check that they still want to be included.  All of which makes for a frenetic time!

Lisa's rather good with the pictures, artwork and layout, isn't she?   Mind you, Wiltshire is a lovely county, which does help with pages like that!  In fact ... "watch this space", as you'll see us talking more and more about Well House Manor - our new hotel / delegate accommodation venture - and the associated "Weekend in Wiltshire" web site, which we'll be using as our splash site for delegate web connections from there, and for promoting ourselves as a tourist facility at weekends when we're not running courses.


Posted by aparks (aparks), 26 May 2006
Plenty of food for thought there - thanks for the detailed answer, Graham - it's very much appreciated.



This page is a thread posted to the opentalk forum at www.opentalk.org.uk and archived here for reference.

