Create a directory that will contain all libraries, source code, tools, and documentation for magpar, and set the environment variable MAGPAR_HOME:
# change into any directory, where you want to install magpar
# for example in your $HOME/work directory
cd $HOME; mkdir work; cd work
# download the magpar source archive by hand or using wget:
lib=magpar
wget http://www.magpar.net/$lib/download/$lib.tar.gz
# unpack the archive
tar xzvf $lib.tar.gz
cd $lib
MAGPAR_HOME=$PWD; export MAGPAR_HOME  # sh/bash syntax (use "setenv" for csh)
PD=$MAGPAR_HOME/libs; export PD
If you are upgrading from a previous version of magpar, you can usually reuse the libraries that you have already compiled and installed. However, please check the ChangeLog and upgrade libraries as required or recommended.
The current version of magpar has been developed and tested with the configuration and library versions defined in Makefile.in.defaults.
Links to the websites of the libraries can be found in the list of Required Software.
Create Makefile.in.$HOSTNAME using Makefile.in.defaults (or one of the other Makefile.in.host_*):
cp Makefile.in.defaults Makefile.in.$HOSTNAME
and edit it:
It should no longer be necessary to modify Makefile or Makefile.in at all!
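As a rough sketch, the settings you typically adjust in Makefile.in.$HOSTNAME are the paths to the installed libraries. The variable names and version strings below are illustrative only; verify them against your copy of Makefile.in.defaults:

```makefile
# illustrative fragment only - check the actual variable names in
# Makefile.in.defaults and adjust the paths to your installation
PD        = $(MAGPAR_HOME)/libs
PETSC_DIR = $(PD)/petsc-2.3.3-p15
TAO_DIR   = $(PD)/tao-1.9
```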
Please refer to the FAQ for tips and suggestions for the installation of magpar on specific systems and software environments.
The (manual) installation procedures described below are now conveniently combined in Makefile.libs.
Simply run
cd $MAGPAR_HOME/src
make -f Makefile.libs
All libraries will be downloaded automatically using "wget", configured, compiled, and installed in $PD in the following order:
atlas lapack mpi parmetis sundials petsc tao zlib libpng
It is also possible to install the libraries one at a time, using the names listed above (e.g. PETSc):
cd $MAGPAR_HOME/src
make -f Makefile.libs petsc
This makes it easier to use precompiled packages (e.g. Precompiled packages on Ubuntu/Debian) and then install just the remaining ones with the convenience of Makefile.libs.
If this does not work, then please follow the manual installation instructions below.
Once all libraries are compiled and installed, compile magpar as described below.
Check for Required Software which is already preinstalled on your machine. For example, there are Precompiled packages on Ubuntu/Debian available. Download, unpack, compile and install all other required libraries in the order given below in the directory $PD. Some libraries are optional, and only required if you want to try different linear solvers (e.g. BlockSolve95, hypre, SuperLU). It is highly recommended to use any machine specific (vendor provided and highly optimized) libraries. On most high performance machines there are optimized BLAS, LAPACK, and MPI libraries available (cf. FAQ: Optimized BLAS libraries). In this case you just have to set the paths to your libraries properly when you configure and compile the various packages.
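Before downloading anything, it can help to check which of the basic build tools are already available. The following sketch (tool list is just an example, extend it as needed) uses the POSIX `command -v` builtin:

```shell
# check which of the required build tools are already on this machine
for tool in gcc gfortran wget tar make; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: missing"
  fi
done
```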
If you have trouble installing any of the required libraries, please check their respective installation guides/documentation/FAQs/website first. The URLs of their websites can be found on the Required Software page.
For your convenience, get one of the binary packages for your hardware platform from the stable branch of ATLAS (unless you already have Optimized BLAS libraries).
cd $PD
# set lib with the name of your ATLAS library
lib=atlas3.6.0_Linux_PIIISSE1.tar.gz
tar xzvf $lib
# create a symbolic link to the directory with the ATLAS binaries:
ln -s Linux_* atlas
lapacklib=$PD/atlas/lib/liblapack.a
# rename (incomplete) lapack library provided by ATLAS (cf. LAPACK below)
mv $lapacklib $lapacklib.atlas
Debian, RedHat and other distributors provide precompiled binaries of LAPACK. Try the Precompiled packages on Ubuntu/Debian or check the web for availability.
http://www.debian.org/distrib/packages
http://rpmfind.net/
http://www.redhat.com/
You may also recompile it from source:
# set Fortran compiler
# GNU GCC >=4.0 Fortran 77/95: gfortran
FC=gfortran; TIMER=INT_ETIME
#
# GNU GCC < 4.0 Fortran 77: g77
#FC=g77; TIMER=EXT_ETIME
#
# check that Fortran compiler works
$FC --version
#
cd $PD
lib=lapack.tgz
wget http://www.netlib.org/lapack/$lib
tar xzvf $lib
cd lapack-*
cp INSTALL/make.inc.LINUX make.inc
#
# add CPU specific options to OPTS, e.g. -march=pentium4 -msse2 (cf. man gcc)
# set correct Fortran compiler (check additional options in make.inc!):
make "FORTRAN=$FC" "LOADER=$FC" \
  "TIMER=$TIMER" \
  "BLASLIB=$PD/atlas/lib/libf77blas.a $PD/atlas/lib/libatlas.a" \
  "OPTS=-funroll-all-loops -O3 $OOPTS" \
  lapacklib
#
# run tests (optional)
make lapack_testing
Now we have to add the missing LAPACK functions to the ATLAS library:
(cf. $PD/atlas/README, Building a complete LAPACK library)
cd $PD/atlas/lib
cp $lapacklib.atlas $lapacklib
mkdir tmp; cd tmp
ar x $PD/lapack-*/lapack_LINUX.a
ar r ../liblapack.a *.o
cd ..; rm -rf tmp
MPICH, OpenMPI, LAM/MPI, or any other MPI library that implements the MPI standard (version 1 or 2) may be used.
You need rsh (recommended) or ssh to be installed and configured properly! Don't forget to create a ".rhosts" file with the names of all machines (including your local machine!) in your home directory (cf. "man rhosts"). The configuration can be tested with "rsh $HOSTNAME uname -a". You may also use ssh for encrypted communication between processors.
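As an illustration, a minimal $HOME/.rhosts file might look like the following (the host names are placeholders; list every machine that should be allowed to connect, including your local one, one per line, optionally followed by a user name):

```
# hypothetical example of $HOME/.rhosts
node01.example.com
node02.example.com
localhost
```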
The directory $PD/mpi/bin should be added to the $PATH variable. Update your login scripts, e.g. .bashrc, .login, .profile, to make this permanent by appending the following code snippet:
PATH=$PD/mpi/bin:$PATH
export PATH
Alternatively, copy the programs installed in $PD/mpi/bin to $HOME/bin or any other directory within your $PATH, so that mpirun and the other MPI tools can be called from the command line.
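The effect of prepending the directory can be sketched as follows (the path below is an assumed example location for $PD; the shell searches $PATH left to right, so the prepended directory takes precedence):

```shell
# hypothetical example path - adjust $PD to your actual installation
PD=$HOME/work/magpar/libs
PATH=$PD/mpi/bin:$PATH
export PATH
# the first colon-separated entry is searched first
echo "$PATH" | cut -d: -f1   # prints $PD/mpi/bin
```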
If ssh is to be used instead of rsh for logging in on remote machines, use "./configure -rsh=ssh" when compiling MPICH. In this case public key authentication should be configured for ssh to enable login without passwords (cf. "man ssh").
The configure script of MPICH will try both rsh and ssh, and print a warning if neither service is configured properly.
cd $PD
lib=mpich2.tar.gz
wget -N --retr-symlinks ftp://ftp.mcs.anl.gov/pub/mpi/$lib
# better download the latest version from
# http://www.mcs.anl.gov/research/projects/mpich2/
tar xzvf $lib
#
# change into mpich2 subdirectory (adjust to the downloaded version)
cd mpich2-*
#
# use "ssm" (sockets and shared memory) for use on clusters of SMPs
# (communication on the same machine goes through shared memory;
# communication between different machines goes over sockets)
# instead of default "sock"
./configure --prefix=$PD/mpich2 --with-device=ch3:ssm 2>&1 | tee configure.log
#
make -j 1 2>&1 | tee make.log
make install 2>&1 | tee install.log
#
# set symbolic link to MPICH installation directory
ln -s mpich2 $PD/mpi
#
# Please refer to $PD/mpi/README or
# $PD/mpi/doc on how to use MPICH2 and
# start a ring of MPI's process managers mpd!
Installation instructions for MPICH1 and LAM/MPI have been moved to the FAQ.
cd $PD
lib=ParMetis-3.1.1
wget -N http://glaros.dtc.umn.edu/gkhome/fetch/sw/parmetis/$lib.tar.gz
tar xzvf $lib.tar.gz
cd $lib
make "CC=$PD/mpi/bin/mpicc" "LD=$PD/mpi/bin/mpicc"
#
# run tests (optional)
cd Graphs
$PD/mpi/bin/mpirun -np 4 ptest rotor.graph
# more tests in ParMetis-3.1.1/INSTALL
(SUNDIALS version 2.3.0)
Download the library from the SUNDIALS website (registration required).
cd $PD
lib=sundials-2.3.0
tar xzvf $lib.tar.gz
cd $PD/$lib
# set compiler options (modify for your setup!)
# add CPU specific options, e.g. -march=pentium4 -msse2 (cf. man gcc)
CFLAGS="-O3"
export CFLAGS
./configure --prefix=$PD/$lib --with-mpi-root=$PD/mpi
make && make -i install
# (generates static libraries and installs libraries and include files
# in $PD/$lib/libs and $PD/$lib/include)
(PETSc version 2.3.0 and later)
Starting with PETSc version 2.3.0 you have to use the automatic Python-based configure system, which requires Python 2.2 or later. Please refer to the FAQ Installing Python if you need to install Python by hand.
cd $PD
lib=petsc-2.3.3-p15
wget ftp://ftp.mcs.anl.gov/pub/petsc/release-snapshots/$lib.tar.gz
tar xzvf $lib.tar.gz
cd $lib
#
# set environment variables
# (here: bash style - use "setenv" in csh)
#
PETSC_DIR=$PD/$lib
export PETSC_DIR
PETSC_ARCH=PETSc-config-magpar
export PETSC_ARCH
PRECISION=double
export PRECISION
#
# edit PETSc-config-magpar.py
# (select MPI, optional libraries, optimization options, etc.)
# use the templates in $PETSC_DIR/config/ for platforms other than Linux
#
# copy PETSc configuration script
# (needs to be a copy - must not be a symbolic link!)
#
cp $MAGPAR_HOME/src/PETSc-config-magpar.py $PETSC_DIR/config/
#
# Run
# ./config/configure.py --help
# to see all command line options for configure.py.
#
# for static binaries edit
# $PETSC_DIR/bmake/PETSc-config-magpar/petscconf (recommended):
# remove all occurrences of "-lgcc_s" and add "-static" to the linker flags:
# CC_LINKER_FLAGS = -Wall -O3 -static
#
./config/PETSc-config-magpar.py
make all
#
# run tests (optional)
make test
Also refer to the installation instructions on the PETSc homepage!
magpar requires
(PETSc version 2.3.3 and TAO 1.9) (highly recommended) or
(PETSc version 2.3.2 and TAO 1.8.2) or
(PETSc version 2.3.0 and TAO 1.8) or
(PETSc version 2.2.1 and TAO 1.7) or
(PETSc version 2.2.0 and TAO 1.6)
lib=tao-1.9
wget -N http://www.mcs.anl.gov/research/projects/tao/download/$lib
tar xzvf $lib
TAO_DIR=$PD/$lib; export TAO_DIR
cd $TAO_DIR
make
cd $PD
lib=zlib-1.2.3
wget -N http://downloads.sourceforge.net/libpng/$lib.tar.gz
tar xzvf $lib.tar.gz
ln -s $lib zlib
cd $lib
make CFLAGS="-O -fPIC" && make test
cd $PD
lib=libpng-1.2.33
wget -N http://downloads.sourceforge.net/libpng/$lib.tar.gz?download
tar xzvf $lib.tar.gz
ln -s $lib libpng
cd $lib
instdir=$PD/$lib
CFLAGS="-I$PD/zlib"; export CFLAGS
LDFLAGS="-L$PD/zlib"; export LDFLAGS
./configure --prefix=$instdir --enable-shared=no 2>&1 | tee configure.log
make 2>&1 | tee make.log
make install 2>&1 | tee makeinst.log
make check 2>&1 | tee makecheck.log
# alternatively use the old method with a static Makefile:
cp scripts/makefile.linux Makefile
make ZLIBLIB=../zlib ZLIBINC=../zlib && make test
Once all libraries are compiled and installed, compile magpar with
cd $MAGPAR_HOME/src
make
If everything compiled ok, you should get the executable magpar.exe.
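A quick sanity check for the build result can be sketched as follows (assumes $MAGPAR_HOME was set as described above; the fallback path is just an example):

```shell
# minimal sanity check after compilation
MAGPAR_HOME=${MAGPAR_HOME:-$HOME/work/magpar}
if [ -x "$MAGPAR_HOME/src/magpar.exe" ]; then
  echo "magpar.exe is ready"
else
  echo "magpar.exe not found - check the make output for errors"
fi
```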