
For example, I have 6 MPI processes forming a 1D grid. On the master process I have some values for the edges of the grid:

[1 2 3 4 5]

I want to distribute these values so that each value ends up on both processes adjacent to the corresponding edge. That is, I want the following data distribution among the processes:

1 | 1 2 | 2 3 | 3 4 | 4 5 | 5

What is the best way to perform this? It seems that this cannot be done with a single MPI_Scatter call.
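For reference, the obvious fallback I can think of is to scatter the two sides separately with two MPI_Scatterv calls, so that no location on the root is read more than once within a single call. A minimal sketch of that idea (the variable names and the hard-coded 6-rank/5-edge setup are my own):

```c
/* Sketch: 5 edge values on the root, 6 ranks in a 1D grid.
 * Pass 1 delivers each rank its left edge, pass 2 its right edge,
 * so each MPI_Scatterv reads every root location at most once.
 * Error checking omitted; run with exactly 6 ranks. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int edges[5] = {1, 2, 3, 4, 5};   /* significant on the root only */
    int counts[6], displs[6];
    int left = -1, right = -1;        /* -1 marks "no edge on this side" */

    /* Pass 1: rank i (i > 0) receives edge i-1, shared with its left neighbour. */
    for (int i = 0; i < size; ++i) {
        counts[i] = (i > 0) ? 1 : 0;
        displs[i] = (i > 0) ? i - 1 : 0;
    }
    MPI_Scatterv(edges, counts, displs, MPI_INT,
                 &left, counts[rank], MPI_INT, 0, MPI_COMM_WORLD);

    /* Pass 2: rank i (i < size-1) receives edge i, shared with its right neighbour. */
    for (int i = 0; i < size; ++i) {
        counts[i] = (i < size - 1) ? 1 : 0;
        displs[i] = (i < size - 1) ? i : 0;
    }
    MPI_Scatterv(edges, counts, displs, MPI_INT,
                 &right, counts[rank], MPI_INT, 0, MPI_COMM_WORLD);

    printf("rank %d: left=%d right=%d\n", rank, left, right);

    MPI_Finalize();
    return 0;
}
```

This works, but it takes two collective calls for what is conceptually a single scatter, which is what I am hoping to avoid.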

Nikolai
  • Take a look at http://stackoverflow.com/q/25813593/620382 – Zulan Sep 08 '16 at 12:07
  • @Zulan, thanks for the link. Though that's exactly what I'm trying to avoid. – Nikolai Sep 08 '16 at 12:50
  • I just wonder if it is possible to somehow tamper with MPI_Datatypes and create a self-overlapping type which could be scattered in an overlapping manner. – Nikolai Sep 08 '16 at 12:50
  • You will be interested in the DMDA structure of the PETSc library. In particular, tools related to structured grids are detailed on pages 49+ of the [manual](http://www.mcs.anl.gov/petsc/petsc-current/docs/manual.pdf). Take a look at the concept of ghost nodes and see if it helps. See [DMDACreate1d](http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/DMDACreate1d.html). – francis Sep 08 '16 at 18:59
  • Thanks, @francis! It seems that the DMDA structure has this sort of overlap. It would be interesting to have a look at the PETSc source code and see how they have implemented the communication. – Nikolai Sep 08 '16 at 19:59

0 Answers