I've been using lwp-rget, for instance:

  cyc$ lwp-rget --depth=2 --limit=100 "http://www.somesite.com/~whoever"

with no problems. Tonight I got this message:

  Can't use string ("URI::URL") as an ARRAY ref while "strict refs" in use at /usr/local/path_to_module/URI/WithBase.pm line 41.

Version info:

  This is perl, version 5.005_03 built for i386-freebsd

and:

  This is lwp-rget version 1.19 (libwww-perl-5.52)

Any advice would be really appreciated. CYC.
Hi! I have a simple program which opens each URL in a file and saves the result (actually the HTML code of the requested document) into a data file. Something like:

  my $webdoc = $browser->request(HTTP::Request->new(GET => $url));
  ....
  my $string = $webdoc->content;
  ....
  print FILE $string;
  ....

But when I try to fetch some URLs, an "out of memory" error occurs. For example, I tried to save the content of the following URLs:

  http://razor.fri.uni-lj.si:8080/misterP/bin/jpegpush2
  http://home.izum.si/cobiss/konference/konf_2000/video/13.wmv

and it doesn't work - there is an out of memory error. I think this is because the program tries to download a lot of data - a video stream or a big file - and holds it all in memory at once. My question is: how can I avoid this out of memory error? bye, Matej
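One way to avoid holding the whole body in memory (a sketch, not tested against those exact URLs; the output filename is made up) is to let LWP stream the response straight to disk: LWP::UserAgent's request() accepts a filename (or a code reference) as a second argument, in which case the body is written out in chunks instead of being accumulated in $webdoc->content.

```perl
use strict;
use warnings;
use LWP::UserAgent;
use HTTP::Request;

my $browser = LWP::UserAgent->new;
my $url     = 'http://home.izum.si/cobiss/konference/konf_2000/video/13.wmv';
my $file    = 'download.dat';    # hypothetical output filename

# With a filename as the second argument, LWP writes the response body
# to that file chunk by chunk, so memory use stays bounded no matter
# how large the download is.  $response->content stays empty.
my $response = $browser->request(HTTP::Request->new(GET => $url), $file);

die 'Download failed: ', $response->status_line
    unless $response->is_success;
```

The second argument can also be a callback, sub { my ($chunk, $response) = @_; ... }, if you want to process the data as it arrives rather than save it to a file.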