
Blocking and Non-blocking Communication in MPI

Afzal Badshah, PhD
3 min read · Mar 27, 2024

In parallel computing with MPI (Message Passing Interface), communication between processes plays a crucial role in achieving efficient parallelization of algorithms. Two common approaches to communication are blocking and non-blocking communication. You can visit the detailed tutorial on MPI with Python here.

Blocking Communication

Blocking communication halts a process's execution until the communication operation is complete. In MPI, blocking functions like comm.send() and comm.recv() do not return until it is safe to proceed: recv() returns only once the message has arrived, and send() returns once the outgoing buffer can be safely reused (which may be before the receiver has actually received it). Blocking communication is often used when processes need to synchronize their execution or when the sender and receiver must coordinate closely. While it simplifies program logic and synchronization, it can create performance bottlenecks if processes spend significant time waiting for communication to complete. Let's look at the code below:

from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

if rank == 0:
    data = {'a': 7, 'b': 3.14}
    comm.send(data, dest=1, tag=11)     # blocks until the buffer is safe to reuse
elif rank == 1:
    data = comm.recv(source=0, tag=11)  # blocks until the message arrives

Explanation

  • Import MPI: The code begins by importing the MPI module from mpi4py…


