Unable to calculate file size greater than 1 GB
Posted by gauri (gauri), 18 October 2006
My Perl script is very large and performs a lot of complex tasks. One of the tasks within the script is to calculate the size of a file, as follows:
$size = -s $filename;
$size = -1 unless defined($size);
When the file size is larger than 1 GB (for example 16376862124 bytes, about 16 GB), the script returns $size = -1.
For a file smaller than 1 GB (for example 775752944 bytes), the correct file size is returned.
When I run another sample script performing just the above simple task of calculating the file size, the correct size of 16376862124 bytes is returned for the larger file after a long time (about 40 minutes). The problem arises when the above task is within the larger script.
Can somebody suggest a solution to this problem?
I think perhaps it is a timing issue: in the larger script, after a fixed time of 20 minutes or so, the Perl interpreter moves on to the next task/command in the script.
Is there some way I can increase the time limit for executing a command?
What is the default time limit set by the interpreter to execute any command?
Please suggest a solution to my problem.
Posted by admin (Graham Ellis), 19 October 2006
The initial problem is likely to be the integer range of the system and not any timeout function. 32-bit integers are limited to 4 GB, or to 2 GB if they're signed, which is no longer plentiful for modern computing. As I recall, Perl 6 will switch to a bigint type where it needs to, but I think for the moment -s is going to give you a problem.
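One way to see why -s fails on a particular build (a sketch added here, not from the thread) is to check Perl's own build configuration via the core Config module: ivsize is the size of Perl's native integer in bytes, and uselargefiles reports whether the build asked for 64-bit file operations.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Config;

# ivsize is the native integer size in bytes; 8 means -s can return
# sizes well past 4 GB, while 4 means sizes above 2-4 GB may overflow
# unless the build also has large-file support.
printf "Native integer size: %d bytes\n", $Config{ivsize};
printf "Large-file support:  %s\n",
    (defined $Config{uselargefiles} && $Config{uselargefiles} eq 'define')
        ? 'yes' : 'no';
```

On a build reporting an ivsize of 4 and no large-file support, the workaround below (or a rebuild of perl) is the way forward.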
Your test script worked, you say, but took 40 minutes, so that's not a practical way forward. Personally, as a quick kludge I would run a back-ticked OS command to get the information, something like a dir or ls -l, which gets the correct figure from the directory entry rather than scanning the entire file.
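A minimal sketch of that backtick workaround, assuming a Unix-like ls and a filename with no spaces or shell metacharacters (the subroutine name is my own):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Ask the OS for the size so Perl only ever handles it as a string of
# digits, sidestepping any native 32-bit integer limit on -s.
sub size_via_ls {
    my ($filename) = @_;
    my $line = `ls -l $filename`;   # "-rw-r--r-- 1 user group 16376862124 ..."
    my @fields = split ' ', $line;
    return $fields[4];              # size is the 5th whitespace-separated field
}
```

The returned value is a decimal string; compared or printed, Perl treats it numerically, and on a 64-bit-capable perl it is exact even for multi-gigabyte files.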
Posted by Custard (Custard), 19 October 2006
Have a look at:
Maybe these will help with your large numbers.
Posted by banjo (banjo), 25 November 2006
I agree with Graham Ellis. If you just need the size of the file, backticking ls -l and then using awk, cut, or a regex would give you the exact same result in a less process-hungry way.
Please do correct me if I'm wrong.
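That suggestion can be sketched like this (my own subroutine name, same assumptions as above: Unix-like system, filename without shell metacharacters), letting awk isolate the size field before Perl ever sees it:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Pipe ls -l through awk so only the 5th field (the size) comes back.
sub size_via_awk {
    my ($filename) = @_;
    # \$5 stops Perl interpolating its own $5; awk receives the literal $5.
    my $size = `ls -l $filename | awk '{print \$5}'`;
    chomp $size;
    return $size;
}
```

Whether this is really lighter than splitting the ls output in Perl is debatable (it spawns an extra awk process), but it does keep the parsing out of the script.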
This page is a thread posted to the opentalk forum at www.opentalk.org.uk and archived here for reference. To jump to the archive index please follow this link.
PH: 01144 1225 708225 • FAX: 01144 1225 793803 • EMAIL: email@example.com • WEB: http://www.wellho.net • SKYPE: wellho