xargs pass multiple arguments to perl subroutine?


Question


I know how to pipe multiple arguments with xargs:

echo a b | xargs -l bash -c 'echo "1:$0 2:$1"'

and I know how to pass the array of arguments to my perl module's subroutine from xargs:

echo a b | xargs --replace={} perl -I/home/me/module.pm -Mme -e 'me::someSub("{}")'

But I can't seem to get multiple individual arguments passed to perl using those dollar references (to satisfy the me::someSub signature):

echo a b | xargs -l perl -e 'print("$0 $1")'

Just prints:

-e

So how do I get the shell-style arguments $0 and $1 passed to my Perl module's subroutine?

I know I could delimit the input as a;b so that the xargs {} could be split by perl into individual arguments, and I know I could also just process all of STDIN with perl. Instead, my objective is to use perl -e so that I can explicitly call the subroutine I want (rather than having some pre-processing in the script that figures out which subroutine to call and which arguments to use based on STDIN), to avoid script maintenance costs.
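For reference, that delimiter workaround might look something like the sketch below; the -I path and module name just follow the earlier example and are assumptions about the actual layout:

echo 'a;b' | xargs --replace={} perl -I/home/me -Mme -e 'me::someSub(split(/;/, "{}"))'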


Answer 1:


I am not sure about the details of your design, so I take it that you need a Perl one-liner that can use the shell variables visible in the scope from which it is called.

A perl -e'...' executes the Perl program given inside the quotes. For any variables from the environment where this program runs -- a pipeline, or a shell script -- to be available to the program, their values need to be passed to it. Ways to do this with a one-liner are spelled out in this post, and here is a summary.

A Perl program receives arguments passed to it on the command line in the @ARGV array. So you can invoke it in a pipeline as

... | perl -e'($v1, $v2) = @ARGV; ...' "$0" "$1"

or as

... | xargs -l perl -e'($v1, $v2) = @ARGV; ...'

if xargs is indeed used to feed the Perl program its input. In the first example the variables are quoted to protect any interesting characters in them (spaces, *, etc.) from being interpreted by the shell that sets up and runs the perl program.
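Applied to the call from the question, that approach would look something like the sketch below; it assumes the module file is /home/me/me.pm (so -I/home/me and -Mme load it) and that me::someSub takes a plain list of arguments:

echo a b | xargs -l perl -I/home/me -Mme -e'me::someSub(@ARGV)'

Here xargs -l splits each input line into words and appends them to the perl command line, so they arrive in @ARGV.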

If the input contains multiple lines to process and the one-liner uses -n or -p for that, then unpack the arguments in a BEGIN block

... | perl -ne'BEGIN { ($v1, $v2) = splice(@ARGV,0,2) }; ...'  "$0" "$1" ...

which runs at compile time, so before the loop over input lines provided by -n/-p. The arguments other than filenames are removed from @ARGV, leaving only the filenames there for -n/-p in case the input comes from files.
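For example, this hypothetical invocation passes two values followed by a filename; after the splice only data.txt remains in @ARGV, so that is what -n reads (a sketch with made-up names):

perl -ne'BEGIN { ($v1, $v2) = splice(@ARGV,0,2) } print "$v1 $v2: $_"' a b data.txt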

There is also a rudimentary mechanism for command-line switches in a one-liner, via the -s switch. Please see the link above for details; I'd recommend @ARGV over this.
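For completeness, a minimal sketch of that -s mechanism, with arbitrary switch names -v1 and -v2 (the -- separates perl's own options from the program's switches):

perl -s -e'print "1:$v1 2:$v2\n"' -- -v1=a -v2=b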

Finally, your calling code could set up environment variables, which are then available to the Perl program in %ENV. However, that doesn't seem suitable for what you want.
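If you did go that route, it would look something like this (a sketch; V1 and V2 are arbitrary names):

V1=a V2=b perl -e'print "1:$ENV{V1} 2:$ENV{V2}\n"'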

Also see this post for another example.




Answer 2:


While bash's arguments are available as $@ and $0, $1, $2, etc., Perl's arguments are available via @ARGV. This means that the Perl equivalent of

echo a b | xargs -l bash -c 'echo "1:$0 2:$1"'

is

echo a b | xargs -l perl -e'CORE::say "1:$ARGV[0] 2:$ARGV[1]"'

That said, it doesn't make sense to use xargs this way because there's no way to predict how many times it will call perl, nor how many arguments it will pass to perl each time. You have an XY Problem, and you haven't provided enough information about the underlying goal for us to help with it. Maybe you're looking for

perl -e'CORE::say "1:$ARGV[0] 2:$ARGV[1]"' $( echo a b )
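If you do end up keeping xargs, you can at least make the batching predictable by pinning the number of arguments per invocation with -n, for example (a sketch):

echo a b | xargs -n 2 perl -e'CORE::say "1:$ARGV[0] 2:$ARGV[1]"'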


Source: https://stackoverflow.com/questions/56199968/xargs-pass-multiple-arguments-to-perl-subroutine
