Time out setting
Posted by TedH (TedH), 15 March 2009
Hi Graham, just need to check something. I found a little code to list links. It also checks whether each link exists, and it has a timeout: if there is no response from a server after a set time, it moves on and won't show that particular link. The links are in a small text file (not many, about 10 or so; some are added occasionally).
The time out code is this:
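(The snippet itself didn't survive the archiving; judging from the rest of the thread, it was presumably something along these lines, with $ua being an LWP::UserAgent object:)

```perl
use strict;
use warnings;
use LWP::UserAgent;

# Reconstructed fragment - the original snippet was lost in archiving.
my $ua = LWP::UserAgent->new;
$ua->timeout(10);    # the "10" in question
```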
My question is - what is the 10? 10 seconds? 10 milliseconds? Or what?
This works fine, and fairly quickly, when I tested it from my Stateside server against UK-based links.
Hope you can shed some light on it for me, many thanks - Ted
Posted by admin (Graham Ellis), 15 March 2009
Hi, Ted ... I'm afraid I'm going to answer a question with a question: "What sort of object is $ua?"
Timeout settings vary in their units ... our Tomcat deployment course always springs to mind when I mention this, as there are two settings in the main configuration files - one in milliseconds and one in minutes!
Posted by TedH (TedH), 16 March 2009
Hi Graham, does this help? It's a script I ran across and tried out. I assume the larger the time setting, the longer before the "time out" for a URL.
There's the bit that says $ua head, so I guess that's some kind of reference in the LWP::Simple module. It's not as long as a minute, because the links get listed pretty quickly. I'm presuming it's milliseconds, but just wanted to be sure.
Cheers - Ted
Posted by KevinAD (KevinAD), 17 March 2009
Ted,
The code you have posted should do nothing except return an error or errors. Are you sure that is the exact code you are trying to use?
The timeout() method's interval is measured in seconds unless you tell your script to do something different.
But timeout() is not a method of LWP::Simple, which is not even an OO module. Maybe you are using LWP::UserAgent?
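A minimal sketch of what Kevin means by using LWP::UserAgent directly - timeout() takes seconds, and the URL here is just a placeholder:

```perl
use strict;
use warnings;
use LWP::UserAgent;

my $ua = LWP::UserAgent->new;
$ua->timeout(10);    # seconds - give up on a link after 10s

# head() fetches only the headers, which is enough to tell whether
# a link is alive without downloading the whole page.
my $response = $ua->head('http://www.example.com/');
if ($response->is_success) {
    print "Link is alive\n";
}
else {
    print "Skipping dead link: ", $response->status_line, "\n";
}
```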
Posted by TedH (TedH), 17 March 2009
Hi Kevin,
I get no errors on my local testbed server. Tried it online on my live server and it ran a treat. Both Apache 2.0/Perl 5.8.
Here's the whole thing
Seconds should be fine then; I may lessen it some. It handles URLs that are not there okay; if a server's bogged down, though, it would slow stuff a little. Seeing as there won't be very many links, it should suffice.
Posted by KevinAD (KevinAD), 17 March 2009
OK, I see what is going on; quoting from the LWP::Simple documentation:
The module will also export the LWP::UserAgent object as $ua if you ask for it explicitly.
That was a new one to me. What that does is this:
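(The code Kevin posted at this point was lost in the archiving; based on the LWP::Simple documentation he quotes, it would have been along these lines:)

```perl
use strict;
use warnings;

# Asking for $ua explicitly exports LWP::Simple's internal
# LWP::UserAgent object, alongside the usual head()/get() functions.
use LWP::Simple qw($ua head);

$ua->timeout(10);    # seconds - affects subsequent head()/get() calls
```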
So it is using LWP::UserAgent and creates a $ua object sort of automatically. So, ignoring the syntax errors in the original code you posted, the last code you posted should run fine, although it could maybe be pared down by using LWP::UserAgent directly and not loading other LWP stuff you're not using.
Posted by TedH (TedH), 17 March 2009
Thanks Kevin,
I'll give that a try and see how it goes.
Posted by TedH (TedH), 17 March 2009
Ended up leaving it as it was.
Tried a live test with 15 calls and it's quite slow. The object is to have a links list such that if a URL is not there, it won't show. Too many links lists have sites that are no longer valid, and I wanted to avoid that by testing the links first (without massive amounts of code).
The quest continues....
Now, having done the test and seen its inherent slowness, I can see this needs to be approached from a different direction. Probably one where I have a list of links in a file that comes up as normal (quickly), then do a regular check in my admin script (or a cron job) to see whether or not they are all still valid, and write code to delete the out-of-date links.
Otherwise I end up with complaints that links aren't working. I hadn't realized the process it takes to check things like this (inexperience). Learn something new every day!
Posted by admin (Graham Ellis), 18 March 2009
on 03/17/09 at 17:14:07, TedH wrote:
I've been there, Ted ... (but I do need to update my procedures and be there again).
Since links don't tend to die and come back again quickly, there's no need to check all the URLs every time, or indeed while the user is on the page - a script on the server that runs (say) once every 24 hours and builds up a local file or database of what exists and what has gone away should be sufficient in frequency, I would have thought.
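A rough sketch of the sort of overnight job Graham is describing - the file names and the ten-second timeout are just assumptions for illustration:

```perl
#!/usr/bin/perl
# Sketch of a daily cron job: read the master list of links, test
# each one, and write only the live ones to a file the page can use.
use strict;
use warnings;
use LWP::UserAgent;

my $ua = LWP::UserAgent->new;
$ua->timeout(10);    # seconds per link before giving up

open my $in,  '<', 'links.txt'      or die "links.txt: $!";
open my $out, '>', 'links_live.txt' or die "links_live.txt: $!";

while (my $url = <$in>) {
    chomp $url;
    next unless $url =~ /\S/;    # skip blank lines
    # head() checks the link without fetching the whole page
    print $out "$url\n" if $ua->head($url)->is_success;
}

close $in;
close $out;
```

A crontab entry such as `0 3 * * * /path/to/checklinks.pl` would then run it nightly, so the visible page never waits on a slow server.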
I need one of those too - perhaps you should come down one evening / weekend and we could work on it (do I feel another Geekmas coming on)?
Posted by TedH (TedH), 18 March 2009
While I may not be able to get down there, I'll email you a copy when I've done it.
Been quite busy here lately, which is good, all things considered.
Get back to you later - Ted
Posted by KevinAD (KevinAD), 19 March 2009
on 03/17/09 at 17:14:07, TedH wrote:
It should speed up if you shorten the timeout() interval, but it might also be less accurate. Personally I would do something like Graham suggests, or implement a "report a bad link" feature so your users can report bad links and you can investigate them on an as-needed basis. Links not working one day might work the next, just because of the unpredictable nature of the internet. Overall, this is not something you want to do in real time; do it as a cron job, periodically check the links manually, or use a reporting system.
Posted by TedH (TedH), 19 March 2009
Yeah, I think it is best to do it as a cron job or as an admin function. I prefer the admin function as that would give time to double check in case a link comes back (i.e. - server down at time of check).
Right now I've got to the point of putting the result into an array and figuring out how to get it written to a file. A standard array didn't work (it only wrote the first line), so I'm going to have a look at your hashes suggestion (from another posting).
Oh what fun we have
Tried to get too fancy. It's just basic Perl. I simply used a form for the results and saved them. Kills a number of birds with one stone.
Posted by TedH (TedH), 21 March 2009
Hey Kevin, been tearing my hair out for hours trying stuff before I figured out that LWP::Simple throws in an extra, blinkin' carriage return. There's no smiley for Aaaaaaargh!!
It happened after the valid URLs got read into the textarea and I saved them - the save grabbed it. Solved with:
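(The actual fix was lost in the archiving; something along these lines would strip the stray carriage return - the variable name is just illustrative:)

```perl
use strict;
use warnings;

# $line is one URL as it came back from the textarea, ending in a
# CR/LF pair rather than a plain newline.
my $line = "http://www.example.com/\r\n";
$line =~ s/\r//g;    # remove the extra carriage return
chomp $line;         # and the trailing newline
print "$line\n";     # clean URL, safe to write back to the file
```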
I think I'll sit down and watch NCIS now - I need a break.
Posted by KevinAD (KevinAD), 21 March 2009
I've been bald for years, so welcome to the club.
This page is a thread posted to the opentalk forum at www.opentalk.org.uk and archived here for reference.
PH: 01144 1225 708225 • FAX: 01144 1225 793803 • EMAIL: firstname.lastname@example.org • WEB: http://www.wellho.net • SKYPE: wellho