qsub

How fast can one submit consecutive and independent jobs with qsub?

Submitted by 那年仲夏 on 2020-01-01 03:36:05
Question: This question is related to "pbs job no output when busy", i.e. some of the jobs I submit produce no output when PBS/Torque is busy. I imagine it is busier when many jobs are being submitted one after another, and as it happens, of the jobs submitted in this fashion I often get some that produce no output. Here is some code. Suppose I have a Python script called "x_analyse.py" that takes as its input a file containing some data and analyses the data stored in that file:

    ./x…
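
One workaround often suggested for this class of problem is simply pacing the submissions rather than firing them off as fast as the loop can run. A minimal sketch, assuming the jobs are generated in a loop; the data-file glob and the one-second pause are assumptions, not from the question:

    # Throttle consecutive submissions so the PBS/Torque server is not flooded.
    for f in data_*.txt; do
        # \$PBS_O_WORKDIR is expanded inside the job, not by this shell
        echo "cd \$PBS_O_WORKDIR && ./x_analyse.py $f" | qsub
        sleep 1   # brief pause between submissions
    done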

How to automatically run a bash script when my qsub jobs are finished on a server?

Submitted by 十年热恋 on 2019-12-19 07:55:26
Question: I would like to run a script when all of the jobs that I have sent to a server are done. For example, I send:

    ssh server "for i in config*; do qsub ./run 1 $i; done"

and I get back a list of the jobs that were started. I would like to automatically start another script on the server to process the output from these jobs once all are completed. I would appreciate any advice that would help me avoid the following inelegant solution: if I save each of the 1000 job IDs from the above call in a…
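
A common answer to this pattern is to let the scheduler do the waiting via job dependencies instead of tracking job IDs by hand. A minimal sketch, assuming SGE (`postprocess.sh` is a hypothetical name); on PBS/Torque the rough equivalent is `qsub -W depend=afterok:<jobid list>`:

    # Give every worker job the same name, then hold the final job on that name.
    ssh server '
        for i in config*; do
            qsub -N runbatch ./run 1 "$i"
        done
        # -hold_jid accepts job names as well as IDs; this job starts only
        # after every "runbatch" job has finished
        qsub -hold_jid runbatch ./postprocess.sh
    '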

Parameters for a shell script that is started with qsub

Submitted by 孤街醉人 on 2019-12-18 10:13:39
Question: How can I parametrize a shell script that is executed on a grid (started with qsub)? I have a shell script in which I use getopts to read the parameters. When I start this otherwise working script with qsub (qsub script.sh -r firstparam -s secondparam …), I receive error messages:

    qsub: invalid option -- s
    qsub: illegal -r value

because qsub thinks the parameters are meant for itself. So far I have not found a solution. Thanks.

Answer 1: Using the qsub -v option is the proper way:

    qsub -v par_name=par_value[,par_name=par_value,...]
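
For completeness, a minimal sketch of the -v approach the answer describes; the variable names and values here are examples, not from the question:

    # Submit: the values travel as environment variables, not script arguments.
    qsub -v firstparam=value1,secondparam=value2 script.sh

    # Inside script.sh, read the variables directly instead of via getopts:
    echo "first parameter:  ${firstparam}"
    echo "second parameter: ${secondparam}"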

SGE Cluster - script fails after submission - works in terminal

Submitted by 谁说我不能喝 on 2019-12-12 03:16:35
Question: I have a script that I am trying to submit to an SGE cluster (on Red Hat Linux). The very first part of the script derives the current folder name from the full CWD path and stores it in a variable for use downstream:

    #!/usr/bin/bash
    #
    #$ -cwd
    #$ -A username
    #$ -M user@server
    #$ -j y
    #$ -m aes
    #$ -N test
    #$ -o test.log.txt

    echo 'This is a test.'
    result="${PWD##*/}"
    echo $result

In bash, this works as expected:

    -bash-4.1$ pwd
    /home/user/test
    -bash-4.1$ bash test.sh
    This is a test.
    test

When I…
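
A frequent cause of this symptom (an assumption here; the excerpt cuts off before the failure is shown) is that SGE executes the job under its configured default shell rather than the shebang line, and `${PWD##*/}` is bash parameter expansion that other shells reject. Forcing the shell with `-S` is a minimal fix to try:

    #!/usr/bin/bash
    #$ -S /bin/bash   # make SGE execute the job with bash, not its default shell
    #$ -cwd           # run from the submission directory so $PWD is meaningful
    result="${PWD##*/}"   # strip everything up to the last / in the path
    echo "$result"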

Run a qsub command from a parent folder in all directories below that contain a *.dat file

Submitted by 耗尽温柔 on 2019-12-12 02:58:04
Question: I am using Red Hat and a PBS queuing system to submit jobs to a finite element analysis code. I typically have a folder containing a .dat file, which is what I want to run, and a .pbs file that submits the .dat file. To submit the .dat file I run the command "qsub *.pbs" in the directory containing both files. How could I submit, or just run, "qsub *.pbs" from outside the directories containing the .dat files? I would typically be two directories up from the .dat files. Thanks.

Answer 1: …
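
Since the answer text is cut off, here is a minimal sketch of the usual approach: locate every directory holding a .dat file and submit from inside it (assumes GNU find, as typically available on Red Hat):

    # From the parent folder: find each directory containing a .dat file,
    # cd into it in a subshell, and submit its .pbs file from there.
    find . -name '*.dat' -printf '%h\n' | sort -u | while read -r dir; do
        (cd "$dir" && qsub ./*.pbs)
    done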

Checking the status of qsub jobs from within a shell script

Submitted by 眉间皱痕 on 2019-12-11 16:07:13
Question: I have been given a C shell script that launches 800 individual qsubs for a sample. I need to run this script on more than 500 samples (listed in samples.txt). To automate the process, I thought about running the script (named SrchDriver) using the following bash shell script:

    #!/bin/sh
    for item in $(cat samples.txt)
    do
        (cd dir_"$item"/MAPGAPS && SrchDriver "$item"_Out 3)
    done

This script would launch the SrchDriver script for all samples one right after another, which would result in too…
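
One way to pace such a driver is to poll qstat between samples. A minimal sketch; the cap of 100 jobs and the line-matching pattern are placeholders to adapt to the local qstat output:

    #!/bin/sh
    while read -r item; do
        (cd "dir_${item}/MAPGAPS" && SrchDriver "${item}_Out" 3)
        # Block until this user's queued/running jobs drop below the cap
        # (assumes qstat lists one job per line, each starting with a job ID).
        while [ "$(qstat -u "$USER" 2>/dev/null | grep -c '^[0-9]')" -ge 100 ]; do
            sleep 60
        done
    done < samples.txt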

SGE submitted job doesn't run

Submitted by 人盡茶涼 on 2019-12-11 12:59:23
Question: I'm using Sun Grid Engine on my Ubuntu 14.04 machine to queue jobs to be run on my multicore CPU. I've installed and set up SGE on my system, but I have a problem when testing it. I created a "hello_world" directory containing two shell scripts named "hello_world.sh" and "hello_world_qsub.sh": the first contains a simple command, and the second contains the qsub command to submit the first script as a job. Here is what "hello_world.sh" contains:

    #!/bin/bash
    echo "Hello world" > /home/theodore/tmp…
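
For reference, a minimal sketch of what such a pair of scripts could look like; the output file name completes the truncated path above and the qsub flags are assumptions, not the poster's originals:

    ## hello_world.sh
    #!/bin/bash
    echo "Hello world" > /home/theodore/tmp/hello_world.out

    ## hello_world_qsub.sh
    #!/bin/bash
    # -cwd: run in the current directory; -j y: merge stderr into stdout
    qsub -cwd -j y -o hello_world.log ./hello_world.sh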

Naive parallelization in a .pbs file

Submitted by 混江龙づ霸主 on 2019-12-11 11:22:50
Question: Is it possible to parallelize across a for loop in a PBS file? Below is my attempt.pbs file. I would like to allocate 4 nodes and simultaneously allocate 16 processes per node. I have successfully done this, but now I have 4 jobs and I would like to send one job to each node. (I need to do this because the queuing algorithm would make me wait a few days if I submitted 4 separate jobs on the cluster I'm using.)

    #!/bin/bash
    #PBS -q normal
    #PBS -l nodes=4:ppn=16:native
    #PBS -l walltime=10:00:00
    #PBS -N…
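
One pattern for this (a sketch, not the poster's solution; `task.sh` is a hypothetical per-node workload, and some clusters require pbsdsh instead of ssh between nodes) is to read the allocated hosts from $PBS_NODEFILE and launch one background task per node, then wait:

    #!/bin/bash
    #PBS -q normal
    #PBS -l nodes=4:ppn=16:native
    #PBS -l walltime=10:00:00
    # Launch one task on each unique allocated node, in parallel.
    for node in $(sort -u "$PBS_NODEFILE"); do
        ssh "$node" "cd \"$PBS_O_WORKDIR\" && ./task.sh" &
    done
    wait   # the job ends only after all four per-node tasks finish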

Shell variables in qsub

Submitted by 若如初见. on 2019-12-11 08:28:18
Question: I have this script, which works fine when I call it with sh but fails when I use qsub. Could someone please help me debug this? I can't seem to find an answer online.

    #!/bin/bash
    #$ -S /bin/bash
    #$ -V
    #$ -cwd
    #$ -l h_vmem=6G
    #$ -N MHCIp

    if [ $# -lt 2 ]
    then
        echo need 2 arguments
        echo "USAGE : qsub run_MHCIprediction.sh <input_peptide_file> <MHCI_allele_file>"
        exit 0
    fi

    input_file=$1
    allele_file=$2
    output_prefix=`echo ${input_file} | awk -F"." '{print $1}'`
    while read -u 10 allele strip…
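
A plausible culprit (an assumption; the excerpt ends before the actual error is shown) is how the arguments reach the job script. With SGE, anything after the script name on the qsub command line is forwarded to the script, and -v is an alternative route; the file names below are examples:

    # 1) Positional arguments: SGE forwards them to the script as $1 and $2.
    qsub run_MHCIprediction.sh peptides.txt alleles.txt

    # 2) Environment variables via -v (the script would then read the
    #    variables directly instead of copying $1/$2).
    qsub -v input_file=peptides.txt,allele_file=alleles.txt run_MHCIprediction.sh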

Redirect the output of my Java program under qsub

Submitted by 杀马特。学长 韩版系。学妹 on 2019-12-11 06:05:49
Question: I am currently running multiple Java executables using qsub. I wrote two scripts: 1) qsub.sh, 2) run.sh.

qsub.sh:

    #!/bin/bash
    echo cd `pwd` \; "$@" | qsub

run.sh:

    #!/bin/bash
    for param in 1 2 3
    do
        ./qsub.sh java -jar myProgram.jar -param ${param}
    done

Given the two scripts above, I submit jobs with sh run.sh. I want to redirect the messages generated by myProgram.jar -param ${param}, so in run.sh I replaced the 4th line with the following:

    ./qsub.sh java -jar myProgram.jar -param ${param}…
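
The snag with appending a plain `> out.txt` there is that the local shell performs the redirection before qsub.sh ever sees it. A minimal sketch of one fix (the output file name is an example): quote the `>` so it survives as a literal word, passes through "$@" and echo into the generated job script, and is interpreted on the compute node instead:

    # run.sh, 4th line: the quoted ">" reaches the job command line untouched
    ./qsub.sh java -jar myProgram.jar -param ${param} ">" "output_${param}.txt"

Alternatively, qsub's own -o and -e options capture the job's stdout and stderr without touching the command line at all.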