Why is my image download CGI script written in Perl not working?

Submitted by 徘徊边缘 on 2019-12-01 09:35:15

There are quite a few issues. The first one is the fact that you are using @fileholder = <DLFILE>; to slurp a binary file. On Windows, automatic conversion of line endings will wreak havoc on the contents of that file.
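The fix is to read the file through a `:raw` layer (or call `binmode` on the handle) so PerlIO performs no line-ending translation. A minimal self-contained sketch, using a throwaway file name `icon.png` that stands in for the real image:

```perl
#!/usr/bin/env perl
use strict;
use warnings;

# Demo path; substitute the real file location in an actual script.
my $path = 'icon.png';

# Create a small sample file containing CR/LF bytes so we can
# verify nothing gets translated on the way back in.
open my $out, '>:raw', $path or die "Cannot create '$path': $!";
print {$out} "\x89PNG\r\n\x1a\n";
close $out;

# ':raw' disables the CRLF layer that would otherwise corrupt
# binary data on Windows; local $/ slurps the file in one read.
open my $in, '<:raw', $path or die "Cannot open '$path': $!";
my $data = do { local $/; <$in> };
close $in;
unlink $path;
```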

Other issues are:

  1. You are not checking the return value of open. We don't even know if open succeeded.

  2. You never assign a value to $ID, meaning you're sending "filename=\n\n" in your response.

  3. You are slurping a binary file, making the memory footprint of your program proportional to the size of the binary file. Robust programs don't do that.

  4. You're loading CGI.pm, but you are neither using it nor have you read its docs.

  5. You're using a bareword (i.e. package global) filehandle.
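Points 1 and 5 go together: use a lexical filehandle and check the return value of open. A sketch (File::Temp is used only to make the example self-contained; in the real script the path comes from the request):

```perl
use strict;
use warnings;
use File::Temp qw( tempfile );

# Throwaway file so the example can actually run.
my ($tmp, $path) = tempfile( UNLINK => 1 );
print {$tmp} "dummy\n";
close $tmp;

# Lexical handle ($fh), and the return value of open is checked:
# on failure, $! says *why* instead of the script silently
# serving an empty response.
open my $fh, '<:raw', $path
    or die "Cannot open '$path': $!";
my $opened = defined fileno $fh;   # true while the handle is live
close $fh;
```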

The fundamental reason, however, is that open fails. Why does open fail? Simple:

C:\temp> cat uu.pl
#!/usr/bin/env perl

use strict; use warnings;

my $files_location = "C:\Users\user\Documents\hello\icon.png";
print "$files_location\n";

Let's try running that, shall we?

C:\temp> uu
Unrecognized escape \D passed through at C:\temp\uu.pl line 5.
Unrecognized escape \h passed through at C:\temp\uu.pl line 5.
Unrecognized escape \i passed through at C:\temp\uu.pl line 5.
C:SERSSERDOCUMENTSHELLOICON.PNG
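Inside double quotes every backslash begins an escape sequence, which is what produced those warnings and the mangled path. Single quotes, doubled backslashes, or forward slashes (which Windows accepts in nearly all APIs) all avoid the problem, a sketch:

```perl
use strict;
use warnings;

# Three equivalent ways to spell the same Windows path safely:
my $single  = 'C:\Users\user\Documents\hello\icon.png';       # no interpolation in single quotes
my $escaped = "C:\\Users\\user\\Documents\\hello\\icon.png";  # backslashes doubled
my $forward = 'C:/Users/user/Documents/hello/icon.png';       # forward slashes also work on Windows

print "$single\n";
```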

Here is a short script illustrating a better way:

use CGI qw(:standard);
use File::Copy qw( copy );
use File::Spec::Functions qw( catfile );

use constant IMG_DIR => catfile(qw(
    E:\ srv localhost images
));

serve_logo(IMG_DIR);

sub serve_logo {
    my ($dir) = @_;

    my %mapping = (
        'big' => 'logo-1600x1200px.png',
        'medium' => 'logo-800x600.png',
        'small' => 'logo-400x300.png',
        'thumb' => 'logo-200x150.jpg',
        'icon' => 'logo-32x32.gif',
    );

    my $cgi = CGI->new;

    my $file = $mapping{ $cgi->param('which') };
    defined ($file)
        or die "Invalid image name in CGI request\n";

    send_file($cgi, $dir, $file);

    return;
}

sub send_file {
    my ($cgi, $dir, $file) = @_;

    my $path = catfile($dir, $file);

    open my $fh, '<:raw', $path
        or die "Cannot open '$path': $!";

    print $cgi->header(
        -type => 'application/octet-stream',
        -attachment => $file,
    );

    binmode STDOUT, ':raw';

    copy $fh => \*STDOUT, 8_192;

    close $fh
        or die "Cannot close '$path': $!";

    return;
}

I also posted a detailed explanation on my blog.

It took me a while to figure out what was wrong, so for those who end up here (as I did) with random issues serving large files, here's my advice:

Avoid File::Copy: it is unreliable for this purpose. When serving data through CGI, syswrite can transiently return undef with $! set to 'Resource temporarily unavailable' (EAGAIN).

File::Copy gives up at that point (it returns 0 and leaves $! set), so only part of the file (or stream) is transferred.

There are several ways to work around this, such as retrying the syswrite or switching to blocking I/O; I'm not sure which one is best, though.
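A sketch of the retry approach, using a hypothetical `copy_with_retry` helper that keeps calling syswrite when $! is EAGAIN instead of giving up the way File::Copy does:

```perl
use strict;
use warnings;
use Errno qw( EAGAIN );

# Copy from $in to $out in fixed-size chunks, retrying writes that
# fail transiently with EAGAIN instead of aborting mid-transfer.
sub copy_with_retry {
    my ($in, $out, $chunk) = @_;
    $chunk ||= 8_192;

    while (1) {
        my $read = sysread $in, my $buf, $chunk;
        die "read failed: $!" unless defined $read;
        last if $read == 0;                    # EOF

        my $off = 0;
        while ($off < $read) {
            my $wrote = syswrite $out, $buf, $read - $off, $off;
            if (!defined $wrote) {
                # 'Resource temporarily unavailable': back off, retry
                if ($!{EAGAIN}) {
                    select undef, undef, undef, 0.01;
                    next;
                }
                die "write failed: $!";
            }
            $off += $wrote;                    # account for short writes
        }
    }
    return 1;
}
```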
