Simultaneous telnet sessions with Expect

Posted by krapiva (krapiva), 19 January 2005
Please, please, please help... I've been at it for 2 weeks now. OK, I am writing an Expect script that telnets to 2 different servers. When I log in, one server prompts for a password and the other one doesn't, and my script can't handle it. What do I do?

Here is what I've got:

foreach site [array names sites] {
    send_verbose "Spawning telnet for $sites($site)\n"
    spawn telnet $s($sites($site),host)
    set timeout -1
    expect {
        timeout { send_verbose "timed out\n"; exit 1 }
        "connection refused" { send_verbose "connection refused\n"; exit 1 }
        "Unknown host" { send_verbose "Unknown host\n"; exit 1 }
        "o route to host" { send_verbose "host not found\n"; exit 1 }
        "o address associated with name" { send_verbose "host not found\n"; exit 1 }
        "nable to connect" { send_verbose "unable to connect\n"; exit 1 }
        "login:"
    }
    set host$sites($site) $spawn_id
    lappend ulist $spawn_id
}


send -i $ulist "$s($sites($site),user)\r"
expect {
           -i $host1 -re "Password:" {
                 send_verbose "process $spawn_id just did $expect_out(buffer)"
                 send "$s($sites($site),password)\r"}

           eof { send_verbose "end of file by $spawn_id: $expect_out(buffer)\n";}
     }

expect -i $ulist "$s($sites($site),prompt)"
send -i $ulist "cd $s($sites($site),dir)\r"

I can see that host1 logs in, but then I get "connection closed by foreign host".

TIA, J.

Posted by admin (Graham Ellis), 20 January 2005
Nothing there that shouts "error" at me ... but then I don't know (for example) how $s is set up.

Suggestion - try "reducing" the problem and/or playing with the program and seeing how the error changes.  You're opening the first host followed by the second ... what happens if you reverse the order - which one of the two fails?   If you then reduce to just one host (alternating between them as you test), does one fail and the other work?  The clues will help you narrow down where the problem lies.

Also try writing a piece of "hard code" that explicitly opens one connection then the other, using as few variables and as many constants as possible, and as few loops - a specific case.  Does this fail or work?   Does it make the analysis easier?  At what point does it fail - i.e. at which point does that "connection closed" come up, and is that message always at the same point, or does it move if you put in a few delaying sleeps or extra exchanges with the first host before you go on to the second host?

Have you tried printing out your spawn_id and seeing what's happening there?
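
For reference, a minimal sketch of that kind of diagnostic (variable names assumed to match the script posted above; this is an illustration, not the poster's actual code):

```tcl
# Inside the spawning loop, right after each spawn, report the id created
send_verbose "spawned $sites($site) as spawn_id $spawn_id\n"

# Inside an expect action, report which process actually produced the match
expect {
    -i $ulist "login:" {
        send_verbose "match came from $expect_out(spawn_id), buffer: $expect_out(buffer)\n"
    }
}
```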

Posted by krapiva (krapiva), 20 January 2005
Thanks for the speedy reply... Here is some additional data: if I execute the processes one at a time, it works. It looks like what happens is the following:

The first process logs in; the second process logs in, enters the username and password, and tries to cd to the directory - that's when it gets the "connection closed" message. I suspect it's the first process that throws it. When I try to print out $expect_out(buffer), it has the $spawn_id of the second process and my username - a line that happens right before having to enter the password.

I also have to run ftp sessions for both processes, but this part actually works without problems. Somehow, the 'password' part causes the problem.

thank you again,
-J.

Posted by admin (Graham Ellis), 20 January 2005
Does sound odd.  I look forward to hearing how you get on with the other ideas / trying out patterns I suggested a couple of hours ago; that sort of thing usually leads towards tying the problem down really tightly and then either solving or working around it.

Posted by krapiva (krapiva), 21 January 2005
Looks like it just can't be done... Basically, if I do:

foreach site [array names sites] {
    send_verbose "Spawning ftp for $sites($site)\n"
    if [catch "spawn -ignore SIGQUIT ftp $s($sites($site),host)" reason] {
        send_verbose "failed to spawn ftp for process\n"
        exit 1
    } else {
        send_verbose "connected to $s($sites($site),host)\n"
    }
    lappend ulist $spawn_id
    set host($spawn_id) $sites($site)
}
foreach h [array names host] {
    send_verbose "host of $h is $host($h)\n"
}
expect {
    -i $ulist "Name" {
        send_verbose "proc is $expect_out(spawn_id) user is $s($host($expect_out(spawn_id)),user)\n"
        send -i $expect_out(spawn_id) "$s($host($expect_out(spawn_id)),user)\r"
        exp_continue
    }
    "Password" {
        send_verbose "proc is $expect_out(spawn_id) password is $s($host($expect_out(spawn_id)),password)\n"
        send -i $expect_out(spawn_id) "$s($host($expect_out(spawn_id)),password)\r"
        exp_continue
    }
}


it works fine. What happens at the 'expect {' is that one of the processes is picked, then the code is executed for that process. Once I get to 'exp_continue', another process is picked and the code for it is executed. However, the next thing to expect is the ftp prompt... if I do:

"ftp>" {
stuff to do at ftp prompt
exp_continue
}

it gets caught up in a never-ending loop, since at the end it gets to the ftp prompt again and everything is repeated... and if I don't put 'exp_continue' then only one process gets executed...
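
One way around that loop - a sketch only, not tested against the script above - is to stop watching each process once its ftp work is done, so that a later exp_continue can no longer re-match its prompt. This assumes Expect's "indirect" spawn-id feature: passing the variable name (ulist, without the $) to -i makes expect re-read the list when it changes, including across exp_continue:

```tcl
expect {
    -i ulist "ftp>" {
        # ... per-process ftp commands would go here ...
        # remove this process from the watched list once its work is done
        set idx [lsearch -exact $ulist $expect_out(spawn_id)]
        set ulist [lreplace $ulist $idx $idx]
        # keep looping only while some process is still being watched
        if {[llength $ulist] > 0} { exp_continue }
    }
}
```

With a direct list (-i $ulist) the list is fixed when expect starts, which is exactly why the finished process keeps re-matching; the indirect form is what lets the set of watched processes shrink mid-expect.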


*Unless of course I am missing something*.  This is my first experience with Expect, so I might've misunderstood something in the book. But for now I don't see a way of doing what I am trying to do - a simultaneous ftp to 2 different servers with different logins and stuff. So I'll try to re-write the code without the -i flag and will try to use fork instead. I'll just fork before spawning each of the hosts... any good examples of doing this?

Thanks again for all the help,
-J.




Posted by admin (Graham Ellis), 22 January 2005
What you're looking to do is possible, but it's not easy programming.  Having a look through your latest example, I'm concerned that you're overcomplicating it by having the logins and not just the transfers run in parallel; with FTP, the time taken (and therefore the gain from doing things in parallel) is at transfer time.  And I think you'll get all sorts of issues if you try to fork and have the processes communicate that way - swapping one problem for a potentially bigger one.

Suggestion - write a loop to completely log in to each of the two hosts IN TURN (i.e. have it run serially and not in parallel), storing each of the spawn_ids from this into your list.  Then do the transfers in parallel.  I *think* your issue might be that one of the processes needs to still be in the login part of your code while the other needs to be in the FTP transfer section; you've carefully allowed several processes to be concurrent within each, but you've not allowed for one process to be in the login section while the other has already advanced to the transfer section.  That's a design flaw, but a very easy mistake to make, since parallel control of multiple processes is something that none of us does very often.
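
That two-phase suggestion might be sketched like this (the s array and site names are assumed to be set up as in the earlier posts, and the "file" key is hypothetical - a stand-in for whatever each transfer actually fetches):

```tcl
# Phase 1: log in to each host serially, collecting spawn ids
set ulist {}
foreach site [array names sites] {
    spawn ftp $s($sites($site),host)
    expect "Name"
    send "$s($sites($site),user)\r"
    expect "Password"
    send "$s($sites($site),password)\r"
    expect "ftp>"
    lappend ulist $spawn_id
    set host($spawn_id) $sites($site)
}

# Phase 2: start every transfer, then harvest the completions in parallel
foreach id $ulist {
    send -i $id "get $s($host($id),file)\r"
}
set remaining [llength $ulist]
expect {
    -i $ulist "ftp>" {
        # a prompt coming back means that process's transfer has finished
        if {[incr remaining -1] > 0} { exp_continue }
    }
}
```

Because every process has fully logged in before phase 2 starts, the parallel expect only ever has to recognise one state (the returning ftp prompt), which sidesteps the login-versus-transfer mixing described above.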

Have a look at this example which, although very simple (using ping), shows you how I did all the initial setup first, got all the spawned processes started and running along sweetly, and only then went into the mode of harvesting the results.  You should be able to alter my example and add in the extra login send / expect type stuff within the same loop that spawns the new processes.



This page is a thread posted to the opentalk forum at www.opentalk.org.uk and archived here for reference.


© WELL HOUSE CONSULTANTS LTD., 2014: Well House Manor • 48 Spa Road • Melksham, Wiltshire • United Kingdom • SN12 7NY