job control in expect with ssh

Posted by that_guy (that_guy), 5 August 2003

Hi. I'm writing a program that logs the output of programs launched on several remote machines, to which I connect via ssh. Each machine potentially needs to run several different programs, all of which need to be logged and started in a specified order. I'm wondering where the job control should sit. Should I spawn a new ssh + command_I_want_to_run for each command and call expect -i on that? Or could I spawn one ssh process to each machine and then background jobs within that single shell? Right now I've only tested running programs serially (waiting for each one to finish before continuing), using one ssh process per host. The logging for this works so far, but being able to scale up to multiple programs per host is crucial.

Posted by admin (Graham Ellis), 7 August 2003

I would tend to spawn multiple ssh's per machine. It's not elegant, but otherwise you're left trying to catch all the "job completed" messages for the things you've backgrounded, which I think would be very tough.

Posted by that_guy (that_guy), 7 August 2003

Thanks. I think I'm going to have to follow that route because, as you point out, the alternative is quite complicated. My only concern is that spawning new ssh connections is relatively slow; hopefully this won't become an issue.

Posted by admin (Graham Ellis), 7 August 2003

A thought: as a bit of a compromise, you could keep ssh sessions open when a command finishes (in a pool), and then run anything else that's needed on a remote machine over an "old" ssh that has already finished. More programming, but it could solve the time issue if it did become a problem.

This page is a thread posted to the opentalk forum at www.opentalk.org.uk and archived here for reference.
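To make the one-ssh-per-command approach discussed above concrete, here is a minimal Expect sketch. The host names, commands, and log file names are hypothetical, and it assumes key-based ssh login (no password prompt to negotiate):

```tcl
#!/usr/bin/expect --
# One spawned ssh per remote command: each command runs to completion,
# in order, and its output is appended to a per-host log file.

# Hypothetical host -> ordered list of commands to run there
set jobs {
    {host1.example.com {uptime "ls /tmp"}}
    {host2.example.com {uptime}}
}

set timeout -1   ;# wait as long as the remote command takes

foreach job $jobs {
    foreach {host commands} $job {}   ;# destructure {host cmdlist}
    foreach cmd $commands {
        # ssh runs the single command and exits when it finishes
        spawn ssh $host $cmd
        set sid $spawn_id

        log_file -a "$host.log"   ;# capture everything this session prints
        expect -i $sid eof        ;# eof => the remote command has finished
        log_file                  ;# stop logging
        wait                      ;# reap the ssh process
    }
}
```

To run several commands in parallel instead, you would spawn them all first, collect the spawn ids, and loop on `expect -i $any_spawn_id eof`, checking `expect_out(spawn_id)` to see which session ended.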
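The session-pool compromise mentioned above could look roughly like the sketch below: keep one long-lived ssh per host and detect command completion by watching for a distinctive shell prompt. The `READY__` prompt-marker convention, host names, and helper names are assumptions for illustration, not anything from the thread; it also assumes key-based login and an sh-style remote shell where setting PS1 works:

```tcl
#!/usr/bin/expect --
# Hypothetical session pool: one interactive ssh per host, reused for
# successive commands instead of paying the connection cost each time.

set timeout -1
array set pool {}

proc session_for {host} {
    global pool
    if {![info exists pool($host)]} {
        spawn ssh $host
        set pool($host) $spawn_id
        # Install a unique prompt so command completion is easy to spot
        exp_send -i $pool($host) "PS1=READY__\r"
        expect -i $pool($host) "READY__"   ;# echo of the typed command
        expect -i $pool($host) "READY__"   ;# the new prompt itself
    }
    return $pool($host)
}

proc run {host cmd} {
    set sid [session_for $host]
    log_file -a "$host.log"
    exp_send -i $sid "$cmd\r"
    expect -i $sid "READY__"   ;# prompt reappears => command finished
    log_file
}

run host1.example.com uptime
run host1.example.com "ls /tmp"   ;# reuses the already-open session
```

The trade-off is exactly the one Graham notes: more bookkeeping (and a fragile dependence on prompt matching) in exchange for avoiding repeated ssh connection setup.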
PH: 01144 1225 708225 • FAX: 01144 1225 793803 • EMAIL: info@wellho.net • WEB: http://www.wellho.net • SKYPE: wellho