FTP GET for a large file from mainframe.
Posted by gauri (gauri), 29 June 2006
I'm trying to FTP GET a file from a mainframe to Linux.
I have a Perl daemon on Linux which issues the following commands:
my $olddir = getcwd();
chdir(makedir()) or $ftp=0;
and I get the following messages from the FTP server:
220-FTPD1 IBM FTP CS V1R7 at IMSJES2.us.rxcorp.com, 21:48:41 on 2006-06-29.
220 Connection will close if idle for more than 15 minutes.
331 Send password please.
230 xxxxxxx is logged on. Working directory is "xxxxxxx.".
200 Representation type is Image
227 Entering Passive Mode (162,44,231,228,7,34)
200 Port request OK.
125 Sending data set G1ZIP01.TST.ET770056.FILE001.SPLIT001
250 Transfer completed successfully.
221 Quit command received. Goodbye.
The above script downloads a file of 200 bytes correctly.
However, a file of 50 MB or more is downloaded as a zero-byte file, and a successful transfer message is still received.
When I manually FTP a 300 MB file, it downloads correctly in about 40 seconds.
Why does the script download a zero byte file?
What do I need to do to have the complete file transferred correctly - whatever the size!!
Posted by admin (Graham Ellis), 30 June 2006
I tested out the following code:
(password changed before publication, of course!) and it correctly downloaded my 89 Mbyte compressed SQL backup file.
I can't see anything wrong in the code you have posted, but I do note that it's incomplete - perhaps a few lines that you felt to be the important ones, cut and pasted from a larger application - and there's also a certain amount of pseudocode in there. Can I suggest you start by getting an example as simple as mine working as a test case, as I did, then work up from there; I suspect that your problem, and the clues to it, may not lie in the code you posted.
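A minimal stand-alone test case along the lines suggested might look like this sketch using Net::FTP. The hostname, credentials, local filename, and error handling here are placeholders, not taken from the original script; the data set name and the use of binary (image) mode and passive mode are read off the session log above:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Net::FTP;

# Placeholder host and credentials - substitute your own.
my $host = 'mainframe.example.com';

my $ftp = Net::FTP->new($host, Passive => 1, Timeout => 120)
    or die "Cannot connect to $host: $@";

$ftp->login('userid', 'password')
    or die "Login failed: ", $ftp->message;

# Image (binary) mode, matching "200 Representation type is Image" in the log.
$ftp->binary()
    or die "Cannot set binary mode: ", $ftp->message;

# Quote the data set name so the mainframe treats it as fully qualified.
$ftp->get("'G1ZIP01.TST.ET770056.FILE001.SPLIT001'", 'file001.bin')
    or die "Get failed: ", $ftp->message;

$ftp->quit;

# Verify on disk that the whole file actually arrived.
my $size = -s 'file001.bin';
print "Downloaded $size bytes\n";
```

Net::FTP's `get` returns undef on failure, so checking its return value (rather than trusting the server's final status line) should show whether the zero-byte case is really a silent error on the client side.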
PH: 01225 708225 • FAX: 01225 793803 • EMAIL: firstname.lastname@example.org • WEB: http://www.wellho.net • SKYPE: wellho