Question
I'm trying to figure out how to pipe output into mysqlimport without any luck. I have a huge file (~250 GB) that I want to pipe to mysqlimport after processing it. I don't want to create an intermediate file/table. I'm imagining something like this:
cat genome.mpileup | nawk 'sub("^...","")' | mysqlimport -uuser -ppassword Database
But obviously this isn't working. Any suggestions on how to accomplish this?
Answer 1:
It doesn't look like mysqlimport can read from STDIN, but you can experiment with a named pipe. Note that mysqlimport derives the target table name from the file's base name, so the pipe should be named after the table you want to load. Something like this (untested):
mkfifo bigfile
mysqlimport -uuser -ppassword Database bigfile &
cat genome.mpileup | nawk 'sub("^...","")' > bigfile
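The ordering above matters: the reader must be started (in the background) before the writer, since each side blocks until the other opens the pipe. A minimal, self-contained sketch of the pattern, with `wc -l` standing in for mysqlimport and hypothetical file names:

```shell
#!/bin/sh
# Sketch of the named-pipe pattern; wc -l plays mysqlimport's role.
tmpdir=$(mktemp -d)
mkfifo "$tmpdir/bigfile"          # the pipe's name would double as the table name

# Start the consumer first; it blocks until a writer opens the pipe.
wc -l < "$tmpdir/bigfile" &

# Producer writes into the pipe; nothing is ever stored on disk.
printf 'row1\nrow2\nrow3\n' > "$tmpdir/bigfile"

wait                              # consumer finishes and prints the line count (3)
rm -r "$tmpdir"
```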
Or you can use bash's process substitution to supply a command's output where a file name is expected:

mysqlimport -uuser -ppassword Database <(cat genome.mpileup | nawk 'sub("^...","")')

One caveat: since mysqlimport takes the table name from the file's base name, and a process substitution appears as a path like /dev/fd/63, this form may try to import into the wrong table. The named pipe avoids that because you choose its name.
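Process substitution can be tried in isolation with any file-consuming command; bash replaces the `<(...)` with a readable /dev/fd path (the pipelines here are illustrative, not the poster's):

```shell
#!/bin/bash
# Process substitution: bash turns <(...) into a readable /dev/fd/NN path.
echo "path the command sees:" <(printf 'a\nb\n')

# Any command that reads a file can consume the substituted pipeline:
wc -l < <(printf 'a\nb\n')
```

This requires bash (or zsh/ksh); it will not work under plain /bin/sh.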
Source: https://stackoverflow.com/questions/15841956/mysqlimport-from-pipe