MPI part 2 - Lab
Min/Max
The goal of this exercise is to experiment with MPI collective communication.
The program min_max creates an array of integers initialized with random values. It then searches for the minimum and maximum values.
In this exercise, we want to parallelize the application with MPI, using only collective communication. In this program, rank 0 broadcasts the array; each MPI rank then searches for the minimum and maximum values in its subset of the array. Finally, all the ranks send their results back to rank 0.
Modify min_max.c to initialize MPI, and compute, for each MPI rank, the lower and upper bounds of the array to be processed.
First, print the MPI rank and the bounds of the array. Make sure that all the array entries will be processed.
Project
The goal of this exercise is to work on parallelizing your project with MPI. To do so, analyze the application and identify the values that need to be synchronized.
Depending on the program to parallelize, several approaches are possible. Estimate the frequency of the communication and the quantity of data to transfer. Is it possible to "hide" the cost of communication using non-blocking communication? Is it better to use point-to-point communication or collective communication?