Questions tagged [mpi]

MPI is the Message Passing Interface, a library for distributed-memory parallel programming and the de facto standard method for using distributed-memory clusters for high-performance technical computing. Questions about using MPI for parallel programming go under this tag; questions on, e.g., installation problems with MPI implementations are best tagged with the appropriate implementation-specific tag, e.g. MPICH or OpenMPI.

The official documents for MPI can be found at the web pages of the MPI Forum; a useful overview is given on the Wikipedia page for MPI. The current version of the MPI standard is 3.0; the Forum is currently working on version 3.1, which will bring smaller updates and errata fixes, and version 4.0, which will bring significant additions and enhancements.

Open-source MPI libraries that implement the current standard include MPICH and OpenMPI.

Versions for most common platforms can be downloaded from those projects' websites; platform-specific implementations are also available from various vendors.

A number of excellent tutorials for learning the basics of MPI programming can be found online, typically at the websites of supercomputing centres.

Definitive Book Guide

  1. An Introduction to Parallel Programming - Peter Pacheco
  2. Parallel Programming in C with MPI and OpenMP - Michael J. Quinn
  3. MPI: The Complete Reference (Volume 2) - William Gropp, Steven Huss-Lederman, Andrew Lumsdaine, Ewing L. Lusk, Bill Nitzberg, William Saphir, Marc Snir
  4. Using MPI: Portable Parallel Programming with the Message-Passing Interface - William Gropp, Ewing Lusk, Anthony Skjellum
6963 questions
2 votes, 2 answers

MPI segmentation fault when using Bcast, Scatter and Gather with Dynamic Allocation

I am working with some code and I encountered a problem implementing the table size for matrix multiplication from console input. The first version worked with: const int size = 1000; int mat_a[size][size], mat_b[size][size], mat_c[size][size]; To use…
Rag
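A frequent cause of this kind of segfault is that a large `int mat[size][size]` overflows the stack once `size` comes from console input, while the usual pointer-to-pointer replacement allocates non-contiguous rows that MPI_Bcast/MPI_Scatter cannot send as one buffer. A minimal sketch of the contiguous-allocation approach (variable names and the choice of `n = 8` are illustrative, and `n` is assumed divisible by the process count):

```c
#include <mpi.h>
#include <stdlib.h>

/* Allocate an n x n matrix as ONE contiguous block, so that
   MPI_Bcast / MPI_Scatter / MPI_Gather can transfer it whole. */
static int *alloc_matrix(int n)
{
    return malloc((size_t)n * n * sizeof(int));
}

int main(int argc, char **argv)
{
    int rank, nprocs, n = 0;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    if (rank == 0)
        n = 8;                       /* would come from console input */
    MPI_Bcast(&n, 1, MPI_INT, 0, MPI_COMM_WORLD);

    int *mat_a  = alloc_matrix(n);   /* element (i,j) is mat_a[i*n + j] */
    int rows    = n / nprocs;        /* rows handled by each process    */
    int *my_rows = malloc((size_t)rows * n * sizeof(int));

    /* One contiguous buffer means one Scatter call distributes it all. */
    MPI_Scatter(mat_a, rows * n, MPI_INT,
                my_rows, rows * n, MPI_INT, 0, MPI_COMM_WORLD);

    /* ... compute on my_rows, then MPI_Gather the results back ... */

    free(my_rows);
    free(mat_a);
    MPI_Finalize();
    return 0;
}
```

Indexing as `mat_a[i * n + j]` keeps the whole matrix in one heap block, which avoids both the stack overflow and the non-contiguity problem.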
2 votes, 2 answers

Making MPI BMP image comparison more efficient

I made a simple program in which I compare two images pixel by pixel and determine whether the pictures are the same. I'm trying to adapt it to MPI, but I'm afraid the communications are taking too long, making it way more inefficient than its…
2 votes, 0 answers

Specify path to execute from in hostfile for MPI

Note: I'm currently using MPICH, but I'm open to switching to Open-MPI if what I want is easier in that MPI implementation. I have an MPI program that I want to run across several nodes. However, these nodes are unfortunately not uniform with what's…
Georgia S
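Neither MPICH's nor Open MPI's hostfile format takes a per-host execution path, so the usual workaround is the launcher's working-directory flag combined with the colon-separated MPMD syntax, which lets each block name its own directory. A sketch (host names and paths are made up):

```shell
# Hydra (MPICH's launcher) and Open MPI's mpiexec both accept a
# working-directory flag; with MPMD-style colon-separated blocks,
# each group of ranks can start in a different directory:
mpiexec -n 4 -host node1 -wdir /scratch/alice/mpiprog ./mpiprog : \
        -n 4 -host node2 -wdir /home/alice/run       ./mpiprog
```

An alternative that keeps the hostfile simple is launching a tiny wrapper script that does `cd` to the right per-node path before exec'ing the binary.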
2 votes, 0 answers

Compile C++ MPI Code on Windows

I have just installed Microsoft MPI (MS-MPI) which "is a Microsoft implementation of the Message Passing Interface standard for developing and running parallel applications on the Windows platform". The site also contains a link to a featured…
Sandu Ursu
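Assuming both the MS-MPI runtime and the MS-MPI SDK are installed, compilation from a Visual Studio developer command prompt is a matter of pointing the compiler at the SDK's headers and import library. The paths below are the SDK's defaults and may differ on a given machine:

```shell
:: Default MS-MPI SDK locations (verify against your installation):
set MSMPI_INC=C:\Program Files (x86)\Microsoft SDKs\MPI\Include
set MSMPI_LIB=C:\Program Files (x86)\Microsoft SDKs\MPI\Lib\x64

:: Compile and link with the Visual C++ compiler:
cl /EHsc /I"%MSMPI_INC%" hello.cpp /link /LIBPATH:"%MSMPI_LIB%" msmpi.lib

:: Run with MS-MPI's launcher (installed with the runtime):
mpiexec -n 4 hello.exe
```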
2 votes, 2 answers

mpi4py references libmpi.so.12 as libmpi.so.1 (says no such file or directory)

I'm new to MPI, but I have been trying to use it on a cluster with OpenMPI. I'm having the following problem: $ python -c "from mpi4py import MPI" Traceback (most recent call last): File "", line 1, in ImportError: libmpi.so.1:…
John Smith
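This usually means mpi4py was built against a different MPI than the one visible at run time (for example, a prebuilt copy expecting an older Open MPI library version). A troubleshooting sketch; the paths here are placeholders:

```shell
# See which MPI library the compiled extension actually wants
# (locate MPI*.so via: python -c "import mpi4py; print(mpi4py.__file__)"):
ldd /path/to/site-packages/mpi4py/MPI*.so | grep libmpi

# If the cluster's Open MPI lives elsewhere, put its lib directory on
# the loader path (or load the matching environment module):
export LD_LIBRARY_PATH=/opt/openmpi/lib:$LD_LIBRARY_PATH

# The robust fix is rebuilding mpi4py against the MPI that is
# actually installed on the cluster:
pip install --no-cache-dir --force-reinstall mpi4py
```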
2 votes, 0 answers

Compiling with mpicc and including mpi.h, I receive "error: use of undeclared identifier 'MPI_COMM_WORLD'"

I have C code that I'm trying to parallelize using MPI. I'm using standard expressions such as: /* initialize mpi */ MPI_Init( &argc, &argv ); MPI_Comm_rank( MPI_COMM_WORLD, &rank ); MPI_Comm_size( MPI_COMM_WORLD, &size ); However, I receive…
Alex
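mpicc only adds include paths and link flags; it does not pull in the header itself, so this error most often means `#include <mpi.h>` is missing, or appears after the first use of an MPI name. A minimal file that compiles cleanly with mpicc, for comparison:

```c
/* mpi.h must be included before any MPI name such as MPI_COMM_WORLD
   is used; mpicc does not inject it automatically. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size;

    MPI_Init(&argc, &argv);               /* start the MPI runtime   */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank); /* this process's id       */
    MPI_Comm_size(MPI_COMM_WORLD, &size); /* total number of ranks   */
    printf("rank %d of %d\n", rank, size);
    MPI_Finalize();                       /* shut the runtime down   */
    return 0;
}
```

If the include is present and the error persists, check that the `mpi.h` being picked up belongs to the same installation as the `mpicc` in use (`mpicc -show` prints the underlying compile line).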
2 votes, 1 answer

MPI Depth-first search with dynamic spawns in Python

Ok, so I want to do a multi-threaded depth-first search in a tree-like structure. I'm using threads from multiple computers in a cluster for this (localhost quad-core and raspberry pi 2 for this example). The master thread should start the process…
Enzime
2 votes, 1 answer

MPI: Ensure exclusive access to shared memory (RMA)

I would like to know the best way to ensure exclusive access to a shared resource (such as a memory window) among n processes in MPI. I've tried MPI_Win_lock & MPI_Win_fence but they don't seem to work as expected, i.e. I can see that…
Reda94
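Passive-target locking is the standard tool here: MPI_Win_lock with MPI_LOCK_EXCLUSIVE serializes access to the target's window, and the update only becomes visible after MPI_Win_unlock. A sketch under assumed details (the window holds one int owned by rank 0, and every rank increments it once):

```c
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, *base = NULL;
    MPI_Win win;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* Rank 0 exposes a single int; other ranks attach a zero-size window. */
    MPI_Win_allocate(rank == 0 ? sizeof(int) : 0, sizeof(int),
                     MPI_INFO_NULL, MPI_COMM_WORLD, &base, &win);
    if (rank == 0) *base = 0;
    MPI_Barrier(MPI_COMM_WORLD);

    /* Exclusive lock: no other process may access rank 0's window
       until we unlock, so a read-modify-write here is safe. */
    MPI_Win_lock(MPI_LOCK_EXCLUSIVE, 0, 0, win);
    int value;
    MPI_Get(&value, 1, MPI_INT, 0, 0, 1, MPI_INT, win);
    MPI_Win_flush(0, win);     /* complete the get before using value */
    value += 1;
    MPI_Put(&value, 1, MPI_INT, 0, 0, 1, MPI_INT, win);
    MPI_Win_unlock(0, win);    /* update is visible to all after this */

    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}
```

For simple counters, MPI-3's atomic MPI_Fetch_and_op avoids the lock/get/put dance entirely. Note that MPI_Win_fence is active-target synchronization and provides no mutual exclusion, which is a common source of "doesn't work as expected" results.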
2 votes, 0 answers

MPI: MPI_Get not working

The following code creates a window in process 0 (master) and the other processes put some values in it and I'm trying to get the window of the master from other processes each time to print it but I'm getting totally confusing results. Here's the…
Reda94
2 votes, 1 answer

Using a shared array in Message Passing Interface

I would like to use a shared array in an MPI program such that after one process finishes its work, it puts its rank into that array and "updates" it by sending it to the others without waiting. I tried MPI_Bcast, giving as the "root" parameter the…
Reda94
2 votes, 0 answers

CMake: build fails when compiling a C++ shared lib depending on MPI

Problem overview I'm using CMake to build a project which contains a library based on MPI. The lib has to be built as both shared and static. When I build the project on my Fedora 20 system, everything goes without problem. When I build it on a Mac…
M4urice
2 votes, 2 answers

Set of MPI tags

Is it possible to specify a set of tags to the MPI_Recv function? My scenario: I'm working with an application that has multiple threads that simultaneously execute MPI_Recv. I intend to use the MPI tag to control which thread will…
dv_
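MPI_Recv accepts exactly one tag or the wildcard MPI_ANY_TAG; the standard has no notion of a tag set. A common workaround is to probe with the wildcard and dispatch on the tag found in the status; the tag range below is an invented example:

```c
#include <mpi.h>
#include <stdio.h>

/* Wait for any message, then receive it only if its tag falls in the
   range this handler owns; otherwise leave it queued for another one. */
void receive_from_tag_range(void)
{
    MPI_Status status;
    int msg;

    MPI_Probe(MPI_ANY_SOURCE, MPI_ANY_TAG, MPI_COMM_WORLD, &status);

    if (status.MPI_TAG >= 100 && status.MPI_TAG < 200) {
        /* tags 100-199 belong to this consumer (example convention) */
        MPI_Recv(&msg, 1, MPI_INT, status.MPI_SOURCE, status.MPI_TAG,
                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        printf("got %d with tag %d\n", msg, status.MPI_TAG);
    }
}
```

Beware that with several threads receiving concurrently, a plain probe/receive pair races (another thread can steal the probed message); MPI-3's MPI_Mprobe/MPI_Mrecv exist precisely to make this pattern thread-safe. Separate communicators per thread are another standard way to partition traffic.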
2 votes, 0 answers

MPI_Bcast: one of the processes does not receive

I have a problem with MPI_Bcast. I want to send an array holding the count of numbers per process to the other processes, but random processes (rank 2 and last-1) don't receive anything and crash. The number per process can differ…
Przemysław Zamorski
2 votes, 1 answer

Issue setting the displacement array in MPI_Iscatterv()

Hey, I have a problem with setting the displacement array in the MPI_Iscatterv() function. I need to scatter parts of a picture to different processes. I have made my own MPI_Datatype: private: MPI_Datatype createMPI_PixelType(){ MPI_Datatype new_type; int…
Phate P
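A recurring stumbling block with MPI_Scatterv/MPI_Iscatterv is that the displacements are measured in units of the send type's extent, not in bytes. A small helper for the common uneven-division case (plain C, independent of the pixel type in the question):

```c
/* For MPI_Scatterv (and MPI_Iscatterv) the displacement array gives,
   for each rank, the offset of its chunk from the start of the send
   buffer, measured in elements of the send type -- NOT in bytes. */
void make_counts_displs(int total, int nprocs, int *counts, int *displs)
{
    int base = total / nprocs;   /* every rank gets at least this many  */
    int rem  = total % nprocs;   /* leftovers go to the first rem ranks */
    int offset = 0;
    for (int r = 0; r < nprocs; ++r) {
        counts[r] = base + (r < rem ? 1 : 0);
        displs[r] = offset;      /* running offset, in elements */
        offset += counts[r];
    }
}
```

When scattering with a derived datatype, the offsets are multiples of that type's extent, so if the custom pixel type's extent is not exactly one pixel, the displacements land in the wrong place; MPI_Type_create_resized is the usual fix for that.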
2 votes, 1 answer

Term for an algorithm that works in a distributed or sequential fashion

I am working on an algorithm which subdivides a large data problem and performs work on it across many nodes. The local solution to each subdivision of the problem can be modified to match a global solution if each subdivision knows a limited amount…
Richard