Questions tagged [mpi]

MPI is the Message Passing Interface, a library for distributed-memory parallel programming and the de facto standard method for using distributed-memory clusters for high-performance technical computing. Questions about using MPI for parallel programming belong under this tag; questions on, e.g., installation problems with MPI implementations are best tagged with the appropriate implementation-specific tag, e.g. MPICH or OpenMPI.

The official MPI standard documents can be found on the MPI Forum's web pages; a useful overview is given on the Wikipedia page for MPI. The current version of the MPI standard is 3.0; the Forum is currently working on version 3.1, which will bring smaller updates and errata fixes, and version 4.0, which will bring significant additions and enhancements.

Open-source MPI libraries that implement the current standard include MPICH and Open MPI.

Versions for most common platforms can be downloaded from the links above. Platform specific implementations are also available from various vendors.

A number of excellent tutorials for learning the basics of MPI programming can be found online, typically at the websites of supercomputing centres.

Definitive Book Guide

  1. An Introduction to Parallel Programming - Peter Pacheco
  2. Parallel Programming in C with MPI and OpenMP - Michael J. Quinn
  3. MPI: The Complete Reference (Volume 2) - William Gropp, Steven Huss-Lederman, Andrew Lumsdaine, Ewing L. Lusk, Bill Nitzberg, William Saphir, Marc Snir
  4. Using MPI: Portable Parallel Programming with the Message-Passing Interface - William Gropp, Ewing Lusk, Anthony Skjellum
6963 questions
38
votes
1 answer

OpenMPI-bin error after update (K)Ubuntu 18.04 to 20.04

I have just upgraded my Kubuntu from 18.04 to 20.04. Unfortunately there is an error that keeps showing up every time I use apt upgrade or install something with apt. The error is: update-alternatives: error: /var/lib/dpkg/alternatives/mpi…
Muhammad Radifar
  • 1,267
  • 1
  • 7
  • 8
38
votes
6 answers

Why isn't Hadoop implemented using MPI?

Correct me if I'm wrong, but my understanding is that Hadoop does not use MPI for communication between different nodes. What are the technical reasons for this? I could hazard a few guesses, but I do not know enough of how MPI is implemented "under…
artif
  • 907
  • 1
  • 7
  • 12
34
votes
2 answers

mpirun - not enough slots available

Usually when I use mpirun, I can "overload" it, using more processors than there actually are on my computer. For example, on my four-core mac, I can run mpirun -np 29 python -c "print 'hey'" no problem. I'm on another machine now, which is…
kilojoules
  • 9,768
  • 18
  • 77
  • 149
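For the error in the question above: recent Open MPI versions refuse by default to start more ranks than they find slots, where older versions silently oversubscribed. A sketch of common workarounds (the flags are Open MPI specific):

```shell
# Explicitly allow more ranks than detected slots (Open MPI)
mpirun --oversubscribe -np 29 python -c "print('hey')"

# Or declare extra slots on a single host (host:slots syntax)
mpirun --host localhost:29 -np 29 python -c "print('hey')"
```

Other implementations and schedulers control slot counts differently (e.g. via a hostfile or the batch system's allocation).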
34
votes
1 answer

How to get block cyclic distribution?

I am trying to distribute my matrix in block cyclic fashion. I learned a lot from this question (MPI IO Reading and Writing Block Cyclic Matrix), but that is not what I really need. Let me explain my problem. Suppose I have this matrix of dimension…
matiska
  • 525
  • 6
  • 14
34
votes
7 answers

Why is MPI considered harder than shared memory and Erlang considered easier, when they are both message-passing?

There's a lot of interest these days in Erlang as a language for writing parallel programs on multicore. I've heard people argue that Erlang's message-passing model is easier to program than the dominant shared-memory models such as threads.…
Lorin Hochstein
  • 57,372
  • 31
  • 105
  • 141
33
votes
3 answers

When do I need to use MPI_Barrier()?

I wonder when I need to use a barrier. Do I need it before/after a scatter/gather, for example? Or should OMPI ensure all processes have reached that point before scatter/gather-ing? Similarly, after a broadcast can I expect all processes to already…
Jiew Meng
  • 84,767
  • 185
  • 495
  • 805
32
votes
6 answers

How to compile MPI with gcc?

Does anyone know if it is possible to compile MPI code with gcc? I need to use gcc, not mpicc.
user1260391
  • 1,237
  • 2
  • 10
  • 6
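For the compilation question above: mpicc is normally just a thin wrapper around the system C compiler, so you can ask it which flags it would pass and invoke gcc with those flags yourself. A sketch (the query option differs by implementation, and the paths shown are illustrative only):

```shell
# MPICH: print the underlying compile/link command without running it
mpicc -show

# Open MPI: same idea, different spelling
mpicc --showme

# Then call gcc directly with the reported include/library flags, e.g.
gcc hello.c -I/usr/include/mpich -L/usr/lib/mpich -lmpi -o hello
```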
31
votes
2 answers

MPI recv from an unknown source

I am implementing in MPI a program in which the main process (with rank=0) should be able to receive requests from the other processes, which ask for values of variables that are only known by the root. If I call MPI_Recv(...) on rank 0, I have to…
shkk
  • 313
  • 1
  • 3
  • 4
31
votes
5 answers

What are some scenarios for which MPI is a better fit than MapReduce?

As far as I understand, MPI gives me much more control over how exactly different nodes in the cluster will communicate. In MapReduce/Hadoop, each node does some computation, exchanges data with other nodes, and then collates its partition of…
Igor ostrovsky
  • 7,282
  • 2
  • 29
  • 28
30
votes
3 answers

When to use tags when sending and receiving messages in MPI?

I'm not sure when I have to use different numbers for the tag field in MPI send and receive calls. I've read this, but I can't understand it. Sometimes there are cases when A might have to send many different types of messages to B. Instead of B…
FrancescoN
  • 2,146
  • 12
  • 34
  • 45
30
votes
5 answers

Why does MPI_Init accept pointers to argc and argv?

This is how we use the MPI_Init function: int main(int argc, char **argv) { MPI_Init(&argc, &argv); … } Why does MPI_Init take pointers to argc and argv instead of their values?
Rohit Banga
  • 18,458
  • 31
  • 113
  • 191
28
votes
6 answers

MPI: cores or processors?

Hi, I am kind of an MPI noob so please bear with me on this one. :) Say I have an MPI program called foo.c and I run the executable with mpirun -np 3 ./foo Now this means the program will be run in parallel using 3 processors (1 process per…
smilingbuddha
  • 14,334
  • 33
  • 112
  • 189
28
votes
6 answers

MPICH2 gethostbyname failed

I don't understand the error message. What I am trying to do is run an MPICH2 application after installing mpich2 version 1.4 or 1.5 to /opt/mpich2 (both versions failed with the same error). My MPI application was compiled with 1.3 but I am able to…
biocyberman
  • 5,675
  • 8
  • 38
  • 50
28
votes
1 answer

How are MPI_Scatter and MPI_Gather used from C?

So far, my application is reading in a txt file with a list of integers. These integers need to be stored in an array by the master process, i.e. the processor with rank 0. This is working fine. Now, when I run the program I have an if statement…
Force444
  • 3,321
  • 9
  • 39
  • 77
28
votes
4 answers

MPI vs openMP for a shared memory

Let's say there is a computer with 4 CPUs each having 2 cores, so 8 cores in total. With my limited understanding I think that all processors share the same memory in this case. Now, is it better to directly use OpenMP or to use MPI to make it general so…
Shibli
  • 5,879
  • 13
  • 62
  • 126