MPICH


Overview

MPICH is a free implementation of MPI (Message Passing Interface) for distributed memory applications. The latest version, MPICH2, implements the MPI-2 standard.

NOTE: The MPICH2 package is under construction and will be announced soon on this page.

The MPICH2 Pascal package contains the bindings (that is, Pascal translations of the C header files), which allow Pascal programs to use the same variables and functions as C programs do. So you can easily convert C examples for MPI to Pascal.
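
For example, the C call MPI_Comm_rank(MPI_COMM_WORLD, &myid); becomes MPI_Comm_rank(MPI_COMM_WORLD, @myid); in Pascal - the constants and function names stay the same, and Pascal's @ operator takes the place of C's address-of operator (see the HelloWorld example further down this page).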

Tutorial

Installation

First you need to install the MPICH2 library on all computers of your cluster. A cluster can be any set of computers or virtual machines; they do not need to be homogeneous. For example, your cluster can contain Windows and Linux machines with different numbers of CPUs and different amounts of memory. On your development machines, where you compile your application, you must also install the development libraries and FPC.

Ubuntu / Debian

Under Ubuntu/Debian you can install the following packages: libmpich-mpd1.0-dev mpich-mpd-bin
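
For example (the package names can differ between releases; check your distribution's package list if these are not found):

 sudo apt-get install libmpich-mpd1.0-dev mpich-mpd-bin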

There is a how-to for installing MPICH2 on Ubuntu/Debian in the Ubuntu wiki: https://wiki.ubuntu.com/MpichCluster

Install from source

Download mpich2-1.0.6.tar.gz (or any newer version) from http://www-unix.mcs.anl.gov/mpi/mpich2/ and unpack it.

Read the README carefully; it describes what you need to compile MPICH. Under Ubuntu Feisty: sudo apt-get install build-essential.

You need a shared directory for all nodes. In the following steps it is assumed that the home directory is shared.

 ./configure --prefix=/home/you/mpich-install
 make
 sudo make install

This will install the libraries in /home/you/mpich-install/lib. This path must be added to the library search path of FPC. There are two common possibilities:

  • add the following line to /etc/fpc.cfg

-Fl/home/you/mpich-install/lib

  • add the path to IDE menu / Project / Compiler Options / Paths / Libraries

Otherwise you will get errors like /usr/bin/ld: cannot find -lmpich.

Configuration

Make sure your PATH contains the directory with the MPICH binaries. If not, extend your PATH:

 export PATH=/home/you/mpich-install/bin:$PATH

Check that everything works:

 which mpd
 which mpiexec
 which mpirun

The mpd daemon expects a configuration file named /home/you/.mpd.conf in your home directory (/etc/mpd.conf when running as root). It should contain one line:

 secretword=<secretword>

where <secretword> is a password that should not be your user password. Make the file readable/writable only by you:

 chmod 600 .mpd.conf
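
Putting both steps together, one way to create the file is (mysecret is only a placeholder, choose your own word):

 echo "secretword=mysecret" > ~/.mpd.conf
 chmod 600 ~/.mpd.conf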

If your home directory is not shared, the file must be copied to all cluster nodes. Check that you can log in via ssh to all cluster nodes without a password:

 ssh othermachine date

should not ask for a password and should print only the date - nothing else.
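
If ssh still asks for a password, one common way to set up key based login is (this assumes OpenSSH; press Enter when ssh-keygen asks for a passphrase to create a key without one):

 ssh-keygen -t rsa
 ssh-copy-id othermachine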

Create a file named /home/you/mpd.hosts with one line per node, containing its hostname. For example:

 host1
 host2

Test MPD

MPD is the MPICH daemon, which controls/runs/stops the processes on the cluster nodes. Bring up the ring:

  mpd &
  mpdtrace
  mpdallexit

mpdtrace should output the hostname of your current working host. mpdallexit stops the daemon.

Start the mpd on several machines:

 mpdboot -n <number to start> -f /home/username/mpd.hosts

If the current machine is not part of the cluster (i.e. not listed in mpd.hosts), then you need one additional mpd (add 1 to the number).
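
For example, with the two hosts host1 and host2 from the mpd.hosts above and the current machine not listed there, the call would be:

 mpdboot -n 3 -f /home/username/mpd.hosts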

Test:

 mpdtrace
 mpdringtest
 mpdringtest 100
 mpiexec -l -n 30 hostname

Test an MPI program

Don't forget to start mpd via mpdboot.

Then copy the cpi example to a shared location:

 cp mpich2-1.0.6/examples/cpi ~/cpi
 mpiexec -n 5 /home/you/cpi

The number of processes (here: 5) can exceed the number of hosts. mpiexec has lots of options; see mpiexec -help and read the README.
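
For example, with only the two hosts from the mpd.hosts above you can still start ten processes; mpd spreads them over the available nodes:

 mpiexec -n 10 /home/you/cpi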

Compile an MPI program

Download the MPICH bindings as a Lazarus package from [1].

Extract the zip file, use IDE / Components / Open package file (.lpk) to open the mpich2.lpk file, and compile the package.

Create a new project (custom project, not an application). Save the project as /home/username/helloworld.lpi.

Open the mpich2 package (e.g. IDE / Components / Open recent package / somepath/mpich2.lpk). This opens the package editor. Then do More / Add to project. The project can now use the mpich2 bindings.

Here is a very small MPI program:

program HelloWorld1;

{$mode objfpc}{$H+}
{$Linklib c}

uses
  MPI;
  
var myid: integer;
begin
  MPI_Init(@argc,@argv);
  MPI_Comm_rank(MPI_COMM_WORLD,@myid);
  writeln('id=',myid);
  MPI_Finalize;
end.

You can find this program in the examples directory too.
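
Here is a slightly extended sketch that additionally asks for the number of started processes with MPI_Comm_size. It is not one of the shipped examples, but it uses the same calling convention as above:

program HelloWorld2;

{$mode objfpc}{$H+}
{$Linklib c}

uses
  MPI;

var
  myid, numprocs: integer;
begin
  // argc and argv are provided by the FPC system unit
  MPI_Init(@argc, @argv);
  // rank of this process within MPI_COMM_WORLD
  MPI_Comm_rank(MPI_COMM_WORLD, @myid);
  // total number of processes started by mpiexec
  MPI_Comm_size(MPI_COMM_WORLD, @numprocs);
  writeln('id=', myid, ' of ', numprocs);
  MPI_Finalize;
end.

It is compiled and run exactly like HelloWorld1.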

Compile the program to create the helloworld executable.
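
If you prefer to compile outside the IDE, a rough command line sketch looks like this (both the unit path to the compiled mpich2 package and the main source file name helloworld.lpr are assumptions, adjust them to your setup):

 fpc -Fu/path/to/mpich2/lib -Fl/home/you/mpich-install/lib helloworld.lpr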

Now run it like the cpi example above. That means: first check with mpdtrace whether the mpd ring is still running. If not, cd to the directory with your mpd.hosts file and use mpdboot -n <number of hosts> to start it.
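
Then run the program, for example (assuming the compiled executable ended up in your shared home directory):

 mpiexec -n 5 /home/you/helloworld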