In Perl, how can I release memory to the operating system?

野性不改 2020-12-05 08:30

I am having some problems with memory in Perl. When I fill up a big hash, I cannot get the memory to be released back to the OS. When I do the same with a scalar and then undef it, the memory is released back to the OS.
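
A minimal sketch that shows the behavior (Linux-specific, since it reads VmRSS from /proc/self/status; the workload size is arbitrary):

  use strict ;
  use warnings ;

  # Linux-specific: report this process's resident set size in kB.
  sub rss_kb {
      open my $fh, '<', '/proc/self/status' or return 0 ;
      while (<$fh>) { return $1 if /^VmRSS:\s+(\d+)/ }
      return 0 ;
  }

  my %h ;
  $h{$_} = "x" x 100 for 1 .. 500_000 ;   # arbitrary workload
  printf "after fill:  %d kB\n", rss_kb() ;

  undef %h ;
  printf "after undef: %d kB\n", rss_kb() ;   # typically barely drops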

4 Answers
  • 2020-12-05 08:50

    Why do you want Perl to release the memory to the OS? You could just use a larger swap.

    If you really must, do your work in a forked process, then exit.
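
    A minimal sketch of that approach (the workload size is arbitrary): the child builds the big structure and exits, and the OS reclaims all of the child's memory at once:

  use strict ;
  use warnings ;

  # Do the memory-hungry work in a child process; when it
  # exits, the OS reclaims every byte it allocated.
  my $pid = fork() ;
  die "fork failed: $!" unless defined $pid ;

  if ($pid == 0) {
      my %big ;
      $big{$_} = "x" x 100 for 1 .. 1_000_000 ;   # arbitrary workload
      # ... use %big, writing results to a file or pipe ...
      exit 0 ;
  }

  waitpid($pid, 0) ;   # parent's footprint stays small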

  • 2020-12-05 08:59

    Try recompiling perl with the -Uusemymalloc option so it uses the system's malloc and free. You might see different results.
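
    You can check how an existing perl was built without recompiling; the Config module exposes the relevant build flag:

  use strict ;
  use warnings ;
  use Config ;

  # 'y' means perl's own allocator; 'n' means the system malloc.
  print "usemymalloc = $Config{usemymalloc}\n" ;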

  • 2020-12-05 09:03

    In general, you cannot expect perl to release memory to the OS.

    See the FAQ: How can I free an array or hash so my program shrinks?

    You usually can't. Memory allocated to lexicals (i.e. my() variables) cannot be reclaimed or reused even if they go out of scope. It is reserved in case the variables come back into scope. Memory allocated to global variables can be reused (within your program) by using undef() and/or delete().

    On most operating systems, memory allocated to a program can never be returned to the system. That's why long-running programs sometimes re-exec themselves. Some operating systems (notably, systems that use mmap(2) for allocating large chunks of memory) can reclaim memory that is no longer used, but on such systems, perl must be configured and compiled to use the OS's malloc, not perl's.

    It is always a good idea to read the FAQ list, also installed on your computer (try perldoc -q memory), before spending time on this.

    For example, How can I make my Perl program take less memory? is probably relevant to your issue.
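
    To illustrate the global-variable point, a small sketch (sizes arbitrary): delete() and undef() make the memory reusable within this perl process, even though the OS never sees it returned:

  use strict ;
  use warnings ;

  our %cache ;
  $cache{$_} = "x" x 1024 for 1 .. 100_000 ;   # grow the process

  delete $cache{1} ;   # one entry's memory becomes reusable by perl
  undef %cache ;       # the whole hash's memory becomes reusable by perl
  # The footprint reported by the OS stays roughly the same.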

  • 2020-12-05 09:09

    Generally, yeah, that's how memory management on UNIX works. If you are using Linux with a recent glibc and its malloc, freed memory can be returned to the OS. I am not sure Perl does this, though.

    If you want to work with large datasets, don't load the whole thing into memory; use something like BerkeleyDB:

    https://metacpan.org/pod/BerkeleyDB

    Example code, stolen verbatim:

      use strict ;
      use BerkeleyDB ;
    
      my $filename = "fruit" ;
      unlink $filename ;
      tie my %h, "BerkeleyDB::Hash",
                  -Filename => $filename,
                  -Flags    => DB_CREATE
          or die "Cannot open file $filename: $! $BerkeleyDB::Error\n" ;
    
      # Add a few key/value pairs to the file
      $h{apple}  = "red" ;
      $h{orange} = "orange" ;
      $h{banana} = "yellow" ;
      $h{tomato} = "red" ;
    
      # Check for existence of a key
      print "Banana Exists\n\n" if $h{banana} ;
    
      # Delete a key/value pair.
      delete $h{apple} ;
    
      # print the contents of the file
      while (my ($k, $v) = each %h)
        { print "$k -> $v\n" }
    
      untie %h ;
    

    (OK, not quite verbatim: the original's use vars declaration is legacy style, replaced with my above.)

    You can store gigabytes of data in a hash this way, and you will only use a tiny bit of memory. (Basically, whatever BDB's pager decides to keep in memory; this is controllable.)
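
    For example, the cache budget can be set when the database is opened; -Cachesize is one of BerkeleyDB.pm's common options (the 1 MB value here is just an illustration):

  use strict ;
  use BerkeleyDB ;

  # Cap BDB's in-memory cache at about 1 MB (the value is in bytes).
  tie my %h, "BerkeleyDB::Hash",
              -Filename  => "fruit",
              -Flags     => DB_CREATE,
              -Cachesize => 1024 * 1024
      or die "Cannot open file: $! $BerkeleyDB::Error\n" ;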
