openmpi

Probe seems to consume the CPU

Submitted by 冷暖自知 on 2019-11-26 20:58:32
Question: I've got an MPI program consisting of one master process that hands off commands to a bunch of slave processes. Upon receiving a command, a slave just calls system() to do it. While the slaves are waiting for a command, they are consuming 100% of their respective CPUs. It appears that Probe() is sitting in a tight loop, but that's only a guess. What do you think might be causing this, and what could I do to fix it? Here's the code in the slave process that waits for a command. Watching the
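A minimal sketch of one way to stop an idle slave from spinning: poll with MPI_Iprobe and sleep briefly between polls instead of blocking in Probe. The message layout assumed here (command strings sent from rank 0 with a dedicated tag, a "quit" shutdown command, the buffer size) is hypothetical, since the original slave loop is not shown.

```c
/* Sketch only: tag, buffer size and "quit" command are assumptions. */
#include <mpi.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>

#define CMD_TAG 1   /* hypothetical tag the master uses for commands */

static void slave_loop(void)
{
    char cmd[1024];
    MPI_Status status;
    int flag;

    for (;;) {
        /* Check for a pending command instead of blocking in MPI_Probe. */
        MPI_Iprobe(0, CMD_TAG, MPI_COMM_WORLD, &flag, &status);
        if (!flag) {
            usleep(10000);   /* yield the CPU for ~10 ms between polls */
            continue;
        }
        MPI_Recv(cmd, sizeof cmd, MPI_CHAR, 0, CMD_TAG, MPI_COMM_WORLD, &status);
        if (strcmp(cmd, "quit") == 0)   /* hypothetical shutdown command */
            break;
        system(cmd);                    /* run the command, as in the question */
    }
}
```

Blocking calls in Open MPI busy-poll by default for latency reasons; if memory serves, the MCA parameter mpi_yield_when_idle makes idle ranks yield the CPU more readily, but polling with a short sleep as above is the more portable fix.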

MPI_Rank returns the same process number for all processes

Submitted by 元气小坏坏 on 2019-11-26 15:31:13
I'm trying to run this sample hello world program with openmpi and mpirun on debian 7.

```c
#include <stdio.h>
#include <mpi/mpi.h>

int main (int argc, char **argv)
{
    int nProcId, nProcNo;
    int nNameLen;
    char szMachineName[MPI_MAX_PROCESSOR_NAME];

    MPI_Init (&argc, &argv);                            // Start up MPI
    MPI_Comm_size (MPI_COMM_WORLD, &nProcNo);           // Find out number of processes
    MPI_Comm_rank (MPI_COMM_WORLD, &nProcId);           // Find out process rank
    MPI_Get_processor_name (szMachineName, &nNameLen);  // Get machine name
    printf ("Hello World from process %d on %s\r\n", nProcId, szMachineName);
    if (nProcId == 0)
        printf (
```
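One detail worth checking when every rank reports 0 is that the program was compiled and launched with the same MPI implementation (Debian ships more than one); mixing them makes each process start as its own singleton MPI_COMM_WORLD. As a hedged sketch, with hello.c standing in for the source above:

```
mpicc hello.c -o hello    # compile with Open MPI's wrapper
mpirun -np 4 ./hello      # launch with the mpirun from that same Open MPI install
```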

Can MPI_Publish_name be used for two separately started applications?

Submitted by 我的梦境 on 2019-11-26 14:33:11
Question: I'm writing an Open MPI application which consists of a server part and a client part that are launched separately: me@server1:~> mpirun server and me@server2:~> mpirun client. The server creates a port using MPI_Open_port. The question is: does Open MPI have a mechanism to communicate the port to the client? I suppose that MPI_Publish_name and MPI_Lookup_name don't work here because the server wouldn't know to which other computer the information should be sent. To me, it looks like only processes which were
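As a sketch of the standard MPI-2 name-service calls involved (the service name "my_service" and the overall structure are illustrative; MPI_Init/MPI_Finalize and error handling are omitted):

```c
#include <mpi.h>

/* server side (mpirun server) */
void server_side(void)
{
    char port[MPI_MAX_PORT_NAME];
    MPI_Comm client;

    MPI_Open_port(MPI_INFO_NULL, port);                       /* obtain a port string */
    MPI_Publish_name("my_service", MPI_INFO_NULL, port);      /* register it under a name */
    MPI_Comm_accept(port, MPI_INFO_NULL, 0, MPI_COMM_SELF, &client);
    /* ... communicate over the inter-communicator 'client' ... */
}

/* client side (mpirun client) */
void client_side(void)
{
    char port[MPI_MAX_PORT_NAME];
    MPI_Comm server;

    MPI_Lookup_name("my_service", MPI_INFO_NULL, port);       /* resolve the published name */
    MPI_Comm_connect(port, MPI_INFO_NULL, 0, MPI_COMM_SELF, &server);
    /* ... communicate over the inter-communicator 'server' ... */
}
```

For two independently launched mpirun jobs, my recollection is that Open MPI only resolves the published name if both jobs share a common name server (e.g. one started with ompi-server and passed to both mpirun invocations via --ompi-server); that detail should be verified against the Open MPI documentation.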

assign two MPI processes per core

Submitted by 自古美人都是妖i on 2019-11-26 14:31:48
Question: How do I assign 2 MPI processes per core? For example, if I do mpirun -np 4 ./application then it should use 2 physical cores to run 4 MPI processes (2 processes per core). I am using Open MPI 1.6. I did mpirun -np 4 -nc 2 ./application but wasn't able to run it. It complains that mpirun was unable to launch the specified application as it could not find an executable:

Answer 1: orterun (the Open MPI SPMD/MPMD launcher; mpirun and mpiexec are just symlinks to it) has some support for process binding, but it
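As a rough sketch of one way to place two ranks on each core with Open MPI 1.6, a rankfile can map several ranks to the same slot. The host name and slot numbers below are placeholders, and the slot=socket:core notation should be checked against the Open MPI 1.6 documentation.

```
# rankfile: two ranks share each core on placeholder host "node1" (slot = socket:core)
rank 0=node1 slot=0:0
rank 1=node1 slot=0:0
rank 2=node1 slot=0:1
rank 3=node1 slot=0:1
```

The job would then be launched with something along the lines of mpirun -np 4 -rf rankfile ./application.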
