For 2023 (and 2024 ...) - we are now fully retired from IT training.
We have made many, many friends over 25 years of teaching about Python, Tcl, Perl, PHP, Lua, Java, C and C++ - and MySQL, Linux and Solaris/SunOS too. Our training notes are now very much out of date, but thanks to upward compatibility most of our examples remain operational and even relevant, and you are welcome to make use of them "as seen" and at your own risk.

Lisa and I (Graham) now live in what was our training centre in Melksham - happy to meet with former delegates here - but do check ahead before coming round. We are far from inactive - rather, enjoying the times that we are retired but still healthy enough in mind and body to be active!

I am also active in many other areas and still look after a lot of web sites - you can find an index ((here))
Unable to calculate file size greater than 1 GB

Posted by gauri (gauri), 18 October 2006
My perl script is very big and performs a lot of complex tasks. One of the tasks within the script is to calculate the size of a file as follows:

$size = -s $filename;
$size = -1 unless defined($size);

When the file is larger than 1 GB (for example 16376862124 bytes, about 16 GB), the script returns $size = -1.
For a file smaller than 1 GB (for example 775752944 bytes), the correct file size is returned.

When I run another sample script that performs just this one simple task of calculating the file size, the correct size of 16376862124 bytes is returned for the larger file, though only after a long time (about 40 minutes). The problem arises when the task is part of the larger script.

Can somebody suggest a solution to this problem?

I think perhaps it is a timing issue: in the larger script, after a fixed time of 20 minutes or so, the perl interpreter moves on to the next task/command in the script.

Is there some way I can increase the time limit for executing a command?
What is the default time limit set by the interpreter to execute a command?

Please suggest a solution to my problem.

-Thanks,
Regards,
Gauri


Posted by admin (Graham Ellis), 19 October 2006
The initial problem is likely to be the integer range of the system rather than any timeout.  32-bit integers are limited to 4 GB, or to 2 GB if they are signed, which is no longer plentiful for modern file sizes.  As I recall, Perl 6 will switch to a bigint type where it needs to, but for the moment I think -s is going to give you a problem.
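You can check how the perl on your system was built, and hence whether it can handle sizes above 2 GB natively, via the core Config module. This is a sketch rather than anything from the thread - the key fields are ivsize (the width of Perl's integers) and lseeksize (the width of file offsets):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Config;   # core module describing how this perl was built

# ivsize is the byte width of Perl's native integers; lseeksize is the
# byte width of file offsets.  8 bytes in both means files over 2 GB
# can be sized natively; 4 bytes means the -s problem described above.
print "Integer size:     $Config{ivsize} bytes\n";
print "File offset size: $Config{lseeksize} bytes\n";
print "Large file support looks OK\n" if $Config{lseeksize} >= 8;
```

On a build where lseeksize is only 4, -s on a 16 GB file cannot return the true size.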

Your test script worked, you say, but took 40 minutes, so that's not a practical way forward. Personally, as a quick kludge I would run a back-ticked OS command to get the information - something like dir or ls -l, which reads the correct figure from the disc header rather than scanning the entire file.
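A minimal sketch of that back-ticked approach, assuming a Unix-like system with ls on the PATH (the file name and contents here are just for illustration):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Create a small demo file so the example is self-contained.
my $filename = "demo.dat";
open my $fh, '>', $filename or die "cannot create $filename: $!";
print $fh "x" x 1024;
close $fh;

# Back-tick "ls -l" and pull out the size: the fifth whitespace-
# separated field of an ls -l line is the byte count, read from the
# directory listing rather than by scanning the file itself.
my $listing = `ls -l $filename`;
my ($size)  = (split ' ', $listing)[4];
print "$filename is $size bytes\n";   # prints "demo.dat is 1024 bytes"

unlink $filename;
```

Because the size comes back as a string and is only printed, the 32-bit integer limit never bites.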

Posted by Custard (Custard), 19 October 2006
Have a look at:

Math::BigFloat
Math::BigInt

Maybe these will help with your large numbers..

B
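A quick sketch of the Math::BigInt suggestion, using the 16 GB figure from the question - a value well beyond the 2147483647 limit of a signed 32-bit integer (Math::BigInt is a core module, so no installation is needed):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Math::BigInt;   # core module: arbitrary-precision integers

# The 16 GB size from the question, held exactly as a big integer.
my $size = Math::BigInt->new("16376862124");

# Overloaded arithmetic works as usual; division floors to an integer.
my $gb = $size / (1024 ** 3);
print "$size bytes is roughly $gb GB\n";   # prints "... roughly 15 GB"
```

This helps once you have the number as a string (for instance from a back-ticked ls -l); it cannot by itself make a 32-bit -s return a valid size.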

Posted by banjo (banjo), 25 November 2006
I agree with Graham Ellis: if you just need the size of the file, back-ticking ls -l and then using awk, cut, or a regex would give you exactly the same result in a less process-hungry way.

Please do correct me if I'm wrong.
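The awk variant of that idea can be sketched like this, again assuming a Unix-like system (ls and awk available); the shell pipeline hands Perl just the number, so there is nothing left to parse:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Self-contained demo file.
my $filename = "demo.dat";
open my $fh, '>', $filename or die "cannot create $filename: $!";
print $fh "y" x 2048;
close $fh;

# Let awk pick out field 5 (the byte count) of the ls -l line.
# Note the backslash on \$5 so Perl does not interpolate it.
chomp(my $size = `ls -l $filename | awk '{print \$5}'`);
print "$filename: $size bytes\n";   # prints "demo.dat: 2048 bytes"

unlink $filename;
```

A regex on the ls -l output from within Perl would do equally well; the pipeline version simply keeps the Perl side to a single line.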



This page is a thread posted to the opentalk forum at www.opentalk.org.uk and archived here for reference. To jump to the archive index please follow this link.


© WELL HOUSE CONSULTANTS LTD., 2024: Well House Manor • 48 Spa Road • Melksham, Wiltshire • United Kingdom • SN12 7NY
PH: 01144 1225 708225 • FAX: 01144 1225 793803 • EMAIL: info@wellho.net • WEB: http://www.wellho.net • SKYPE: wellho