
I'm trying to perform an MPI_Reduce operation, but I'm getting the following errors:

Caught signal 11 (Segmentation fault: invalid permissions for mapped object at address 0x533ea5)
==== backtrace (tid: 105696) ====
 0 0x000000000004b7f1 mpi_reduce_()  ???:0
 1 0x0000000000533edf itrdrv_()  *Some address*/itrdrv.f:1072
=================================
 *** Process received signal ***
Signal: Segmentation fault (11)
Signal code:  (-6)

The snippet of the code is shown below:

  include "mpif.h"
  include "auxmpi.h"

  real*8, dimension(2,4) :: wtforce
  real*8, dimension(4) :: tanf
  real*8, dimension(4) :: normf
  real*8, dimension(4) :: torque
  real*8, dimension(4) :: thrust
  real*8, dimension(4) :: pow_wt
  real*8, dimension(4) :: thr_wt
  normf = 0.0d0
  tanf = 0.0d0
  torque = 0.0d0
  thrust = 0.0d0
  pow_wt = 0.0d0
  thr_wt = 0.0d0
  
  call Solflow(.....,wtforce)

  do i = 1, 4
    normf(i) = wtforce(1,i)
    tanf(i) = wtforce(2,i)
  end do

  write(*,*) "Tangent Force, for Rank ",myrank," =",tanf(1)

  call MPI_REDUCE(tanf(1),torque(1),1,MPI_DOUBLE_PRECISION,
 &                     MPI_SUM,0, MPI_COMM_WORLD)

   if(myrank==0) then

       pow_wt(1) = torque(1)*2.285714285714d0
       write(*,*) "Power on top right rotor = ",pow_wt(1)

   end if

The write statement just before the MPI_Reduce call prints the correct values. What am I doing wrong with the Reduce call?

  • at first glance the snippet looks good, but this is not what you used when the program crashed. Please trim your application down to a [MCVE] so you can get some help here. – Gilles Gouaillardet Jul 05 '20 at 23:27
  • I tried editing the snippet to make it more informative, but this is the code I have used when I got the error. The code actually is extensive and this is the best I can think of as a minimal example. – ScarletKnight3 Jul 05 '20 at 23:42
  • I am not arguing this is not minimal, I am arguing this is **not** reproducible since the code does not compile. Your code would likely work with `use mpi_f08`, but I have no way to tell whether you did that or not. – Gilles Gouaillardet Jul 06 '20 at 00:02
  • Oh I'm sorry, lemme try making it reproducible. – ScarletKnight3 Jul 06 '20 at 00:49
  • That part looks good to me. You should really avoid `include 'mpif.h'` and use `use mpi_f08` instead (or, if your compiler does not support it yet, `use mpi`). That might point you to an issue that could be the root cause. – Gilles Gouaillardet Jul 06 '20 at 01:17
  • So the code worked by adding `ierr` at the end of `MPI_REDUCE`. For consistency I replaced `MPI_DOUBLE_PRECISION` with `MPI_REAL8`: `call MPI_REDUCE(tanf(1),torque(1),1,MPI_REAL8, MPI_SUM,0, MPI_COMM_WORLD,ierr)` – ScarletKnight3 Jul 06 '20 at 01:25
  • As well as use mpi_f08 you should also avoid the non-standard and so potentially non-portable real*8 – Ian Bush Jul 06 '20 at 06:38
  • Forgetting the `ierr` argument is the reason for probably more than 50% of the programming errors when writing MPI code in Fortran and the #1 reason to never use `include 'mpif.h'`. – Hristo Iliev Jul 06 '20 at 06:58
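As the comments above conclude, the crash comes from the missing `ierr` argument, which `include 'mpif.h'` cannot catch at compile time because it provides no interface checking. A minimal sketch of the corrected reduction using the `mpi_f08` module (a standalone toy program for illustration, not the asker's `itrdrv` code; the variable names mirror the question):

```fortran
program reduce_demo
  use mpi_f08          ! explicit interfaces: argument mistakes fail at compile time
  implicit none

  integer :: myrank
  real(kind(1.0d0)) :: tanf(4), torque(4)

  call MPI_Init()
  call MPI_Comm_rank(MPI_COMM_WORLD, myrank)

  ! Give each rank a distinct contribution so the sum is easy to check.
  tanf   = real(myrank + 1, kind(1.0d0))
  torque = 0.0d0

  ! With mpi_f08 the ierror argument is optional, so the call from the
  ! question is legal as written here; with mpif.h it is mandatory and
  ! omitting it corrupts the stack, as in the reported segfault.
  call MPI_Reduce(tanf(1), torque(1), 1, MPI_DOUBLE_PRECISION, &
                  MPI_SUM, 0, MPI_COMM_WORLD)

  if (myrank == 0) write(*,*) "Reduced value on root = ", torque(1)

  call MPI_Finalize()
end program reduce_demo
```

On `n` ranks the root prints `n*(n+1)/2` as the reduced value. The key design point is that `use mpi_f08` gives every MPI routine an explicit interface, so a wrong argument count is a compile error instead of a runtime segfault.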

0 Answers