pbs

Creating an MPT with PortableBasemapServer and TerraBuilder

南楼画角 submitted on 2020-01-07 04:26:15
PortableBasemapServer download address

Downloading the data

Open PortableBasemapServer and choose [Format Conversion] - [Online Map -> MBTiles]. In the data source type list, pick the source you need, find the area you want to download, then press and drag the right mouse button to draw the download extent (I chose the Bing map here; it has no labels, so you can open Baidu Maps in a browser as a reference when locating the target area). Click the [Start] button to begin downloading.

Publishing the data

In the PortableBasemapServer main window, choose [MBTiles] in the [Data Source Type] drop-down list and point [Data Source Path] at the .mbtiles file you just downloaded. Click to create a new service (double-click the created service to preview the map) and copy the [OGC WMTS URL] for later use. At this point the PBS setup is done.

Generating the MPT with TerraBuilder

Create a Globe Project, insert the layer, and start creating the MPT. When it finishes, an .mpt file is generated in the corresponding directory and can be opened in TerraExplorer. Final result (here the MPT is overlaid with oblique photogrammetry output).

Source: oschina Link: https://my.oschina.net/u/592443/blog/716196

PBS programming

廉价感情. submitted on 2020-01-05 14:06:44
Question: Some short and probably stupid questions about PBS: 1- I submit jobs using qsub job_file; is it possible to submit a (sub)job inside a job file? 2- I have the following script: qsub job_a qsub job_b. For launching job_b, it would be great to have the results of job_a finished beforehand. Is it possible to put some kind of barrier or some other workaround so job_b is not launched until job_a has finished? Thanks. Answer 1: Answer to the first question: Typically you're only allowed to submit jobs from the
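The usual way to express the second requirement is a job dependency rather than a barrier. Below is a minimal sketch assuming Torque/PBS's -W depend=afterok option; the file names job_a.pbs and job_b.pbs are placeholders for the questioner's own job files.

```bash
# Minimal sketch: hold job_b until job_a exits successfully.
JOB_A=$(qsub job_a.pbs)                        # qsub prints the new job's ID
qsub -W depend=afterok:"$JOB_A" job_b.pbs      # released only after job_a finishes with status 0
```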

File can't be found in a small fraction of submitted jobs

Deadly submitted on 2020-01-02 10:17:33
Question: I'm trying to run a very large set of batch jobs on a RHEL5 cluster which uses a Lustre file system. I was getting a strange error with roughly 1% of the jobs: they couldn't find a text file they all use for steering. A script that reproduces the error looks like this:
#!/usr/bin/env bash
#PBS -t 1-18792
#PBS -l mem=4gb,walltime=30:00
#PBS -l nodes=1:ppn=1
#PBS -q hep
#PBS -o output/fit/out.txt
#PBS -e output/fit/error.txt
cd $PBS_O_WORKDIR
mkdir -p output/fit
echo 'submitted from: '
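No answer survives in this excerpt; purely as an illustrative sketch (not from the original thread), a common mitigation for a shared file that is intermittently not visible to freshly started array tasks is to retry briefly before giving up. The file name steering.txt below is a placeholder.

```bash
# Hypothetical mitigation sketch: retry a few times in case the shared file is
# briefly not visible to a newly started array task on the Lustre file system.
STEER=steering.txt
for attempt in 1 2 3 4 5; do
    [ -r "$STEER" ] && break
    sleep 2
done
[ -r "$STEER" ] || { echo "cannot read $STEER" >&2; exit 1; }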

How fast can one submit consecutive and independent jobs with qsub?

那年仲夏 submitted on 2020-01-01 03:36:05
Question: This question is related to "pbs job no output when busy", i.e. some of the jobs I submit produce no output when PBS/Torque is 'busy'. I imagine it is busier when many jobs are submitted one after another, and as it happens, among jobs submitted in this fashion I often get some that produce no output. Here is some code. Suppose I have a Python script called "x_analyse.py" that takes as input a file containing some data and analyses the data stored in the file: ./x
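The excerpt ends before any answer; purely as an illustrative sketch, one crude way to keep a busy server from losing output is to pace consecutive submissions. The job file x_analyse.pbs and the data file pattern below are hypothetical.

```bash
# Sketch: throttle back-to-back qsub calls so the PBS/Torque server is not flooded.
for f in data_*.txt; do
    qsub -v INPUT="$f" x_analyse.pbs   # pass the data file to the job script
    sleep 1                            # crude rate limit between submissions
done
```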

sleep command not found in torque pbs but works in shell

余生颓废 submitted on 2019-12-24 13:02:24
Question: We create a Torque PBS file "testpbs" as follows:
#!/bin/sh
#PBS -N T1272_flt
#PBS -q batch
#PBS -l nodes=1:ppn=1
#PBS -o /data/software/torque-4.2.6.1/testpbs.sh.out
#PBS -e /data/software/torque-4.2.6.1/testpbs.sh.err
sleep 20
Then we submitted the file testpbs: qsub testpbs. We got error messages:
more testpbs.sh.err
/var/spool/torque/mom_priv/jobs/8.centos64.SC: line 9: sleep: command not found
However, when we run sleep 20 on the command line, no error occurs: $ sleep 20. Thanks in advance. We ran
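The excerpt is cut off before any resolution; as a sketch only (not taken from the original answer), the usual cause is that the job's environment lacks the standard PATH entries, so exporting PATH in the script, or calling the command by its full path, works around it.

```bash
#!/bin/sh
#PBS -N T1272_flt
#PBS -q batch
#PBS -l nodes=1:ppn=1
# Sketch of a common fix: make sure the job environment can find standard
# binaries, or reference them by their full path.
export PATH=/bin:/usr/bin:$PATH
sleep 20            # alternatively: /bin/sleep 20
```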

Output file contains nothing before script finishing

怎甘沉沦 submitted on 2019-12-24 12:29:44
Question: I wrote a Python script with several print statements. The printed information helps me monitor the progress of the script. But when I qsub the bash script, which contains python my_script &> output, onto the compute nodes, the output file contains nothing even while the script is running and printing something. The output file only contains the output once the script is done. So how can I get the output in real time through the output file while the script is running? Answer 1:
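The answer text is truncated here; what follows is only a sketch of the standard explanation that Python buffers stdout when it is redirected to a file, so disabling buffering (or flushing) makes the output appear as it is produced.

```bash
# Sketch: run the interpreter unbuffered so redirected print output shows up
# immediately in the output file (my_script is the script from the question).
python -u my_script &> output
# Equivalent alternatives: export PYTHONUNBUFFERED=1 in the job script,
# or use print(..., flush=True) inside the Python code.
```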

Random seed across different PBS jobs

落花浮王杯 submitted on 2019-12-24 04:23:23
Question: I am trying to create random numbers in MATLAB that will be different across multiple PBS jobs (I am using a job array). Each MATLAB job uses a parallel parfor loop in which random numbers are generated, something like this:
parfor k = 1:10
    tmp = randi(100, [1 200]);
end
However, when I plot my results, I see that the results from different jobs are not completely random. I cannot quantify it, e.g. by saying the numbers are exactly the same, since my results are a function of the random
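The excerpt cuts off before any answer; a common approach, shown here only as a sketch, is to derive a per-job seed from the array index the scheduler exposes to each task (PBS_ARRAYID on Torque; the variable name differs on PBS Pro). Note this seeds the MATLAB client process; parfor workers draw from their own streams and may need separate handling.

```bash
#!/bin/sh
#PBS -t 1-20
#PBS -l nodes=1:ppn=4
# Sketch (hypothetical job script): give every array task its own reproducible seed.
matlab -nodisplay -nosplash -r "rng(${PBS_ARRAYID}); my_analysis; exit"
```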

Loading shared library in open-mpi/ mpi-run

二次信任 submitted on 2019-12-21 17:23:05
Question: I'm trying to run my program under the Torque scheduler using mpirun. Although in my PBS file I set the library path with export LD_LIBRARY_PATH=/path/to/library, it still gives an error, i.e. error while loading shared libraries: libarmadillo.so.3: cannot open shared object file: No such file or directory. I guess the problem is that LD_LIBRARY_PATH is not set on all the nodes. How would I make it work? Answer 1: LD_LIBRARY_PATH is not exported automatically to MPI processes spawned by mpirun. You should use
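The answer is truncated, so the following is a sketch of how this is commonly done with Open MPI, whose mpirun can forward selected environment variables to every spawned rank via -x (the library path and process count are illustrative).

```bash
# Forward LD_LIBRARY_PATH to all MPI ranks (Open MPI's mpirun supports -x).
export LD_LIBRARY_PATH=/path/to/library:$LD_LIBRARY_PATH
mpirun -x LD_LIBRARY_PATH -np 16 ./my_program
```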

how to automatically run a bash script when my qsub jobs are finished on a server?

十年热恋 submitted on 2019-12-19 07:55:26
Question: I would like to run a script when all of the jobs that I have sent to a server are done. For example, I send ssh server "for i in config*; do qsub ./run 1 $i; done" and I get back a list of the jobs that were started. I would like to automatically start another script on the server to process the output from these jobs once all are completed. I would appreciate any advice that would help me avoid the following inelegant solution: If I save each of the 1000 job id's from the above call in a
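The excerpt stops before the inelegant polling approach is spelled out; purely as an illustrative alternative, Torque/PBS dependencies can hold a post-processing job until every submitted job ID has finished. The script name post_process.sh below is a placeholder.

```bash
# Sketch: collect every job ID at submission time and make the post-processing
# job depend on all of them (very long dependency lists may hit server limits).
DEPS=""
for i in config*; do
    JID=$(qsub ./run 1 "$i")
    DEPS="$DEPS:$JID"
done
qsub -W depend=afterok"$DEPS" post_process.sh   # runs only after every job succeeds
```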