Molpro 2024.1.0
Webpage
https://www.molpro.net/
Version
2024.1.0
Build Environment
- GCC 12.1.1 (gcc-toolset-12)
- Intel MPI 2021.11 / Open MPI 4.1.6
- Eigen 3.4.0
- MKL 2024.0
Files Required
- molpro-2024.1.0.tar.gz
- ga-5.8.2.tar.gz
- work.patch
- patch-argos-binput.F
- patch-cic-ItfFortranInt.h
- patch-common_modules-common_cconf1
- These patches change some parameters for huge CI calculations and modify the default path for temporary files.
- Patch files are placed in the /apl/molpro/2024.1.0/patches directory.
- token
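Before running the build script below, it may help to confirm that the files listed above are in place. A minimal pre-flight check, assuming the same ${BASEDIR} layout used by the build script (the loop and paths here are only an illustration):
#!/bin/sh
BASEDIR=/home/users/${USER}/Software/Molpro/2024.1.0
# ga-5.8.2.tar.gz is read from a separate GlobalArrays directory in the script
for f in molpro-2024.1.0.tar.gz work.patch patch-argos-binput.F \
         patch-cic-ItfFortranInt.h patch-common_modules-common_cconf1 token; do
  [ -f ${BASEDIR}/$f ] || echo "missing: ${BASEDIR}/$f"
done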
Build Procedure
#!/bin/sh
GA_VERSION=5.8.2
GA_ARCHIVE=/home/users/${USER}/Software/GlobalArrays/${GA_VERSION}/ga-${GA_VERSION}.tar.gz
MOLPRO_VERSION=2024.1.0
MOLPRO_DIRNAME=molpro-${MOLPRO_VERSION}
PARALLEL=12
BASEDIR=/home/users/${USER}/Software/Molpro/${MOLPRO_VERSION}
MOLPRO_TARBALL=${BASEDIR}/${MOLPRO_DIRNAME}.tar.gz
PATCH0=${BASEDIR}/work.patch
PATCH1=${BASEDIR}/patch-argos-binput.F
PATCH2=${BASEDIR}/patch-cic-ItfFortranInt.h
PATCH3=${BASEDIR}/patch-common_modules-common_cconf1
TOKEN=${BASEDIR}/token
WORKDIR=/gwork/users/${USER}
GA_INSTALLDIR=${WORKDIR}/ga-temporary
INSTALLDIR=/apl/molpro/${MOLPRO_VERSION}
#------------------------------------------
umask 0022
ulimit -s unlimited
export LANG=
export LC_ALL=C
export OMP_NUM_THREADS=1
cd $WORKDIR
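# remove leftovers of previous build attempts; directories are renamed first
# so that the actual deletion can proceed in the background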
if [ -d ga-${GA_VERSION} ]; then
  mv ga-${GA_VERSION} ga_tmp
  rm -rf ga_tmp &
fi
if [ -d ga-temporary ]; then
  mv ga-temporary ga_tmp_tmp
  rm -rf ga_tmp_tmp &
fi
if [ -d ${MOLPRO_DIRNAME} ]; then
  mv ${MOLPRO_DIRNAME} molpro_tmp
  rm -rf molpro_tmp &
fi
module -s purge
module -s load gcc-toolset/12
module -s load intelmpi/2021.11
#module -s load openmpi/4.1.6/gcc12
module -s load eigen/3.4.0
tar zxf ${GA_ARCHIVE}
cd ga-${GA_VERSION}
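# build GA with the MPI compiler wrappers; -mpc80 sets x87 floating-point
# precision to 80 bits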
export CFLAGS="-mpc80"
export FFLAGS="-mpc80"
export FCFLAGS="-mpc80"
export CXXFLAGS="-mpc80"
export F77=mpif90
export F90=mpif90
export FC=mpif90
export CC=mpicc
export CXX=mpicxx
export MPIF77=mpif90
export MPICC=mpicc
export MPICXX=mpicxx
export GA_FOPT="-O3"
export GA_COPT="-O3"
export GA_CXXOPT="-O3"
./autogen.sh
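# --enable-i8: use 8-byte integers; --with-mpi-pr: use the MPI progress-rank runtime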
./configure --enable-i8 \
--with-mpi-pr \
--prefix=${GA_INSTALLDIR}
make -j ${PARALLEL}
make check
make install
# mkl for molpro
module -s load mkl/2024.0
cd ${WORKDIR}
tar zxf ${MOLPRO_TARBALL}
cd ${MOLPRO_DIRNAME}
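# apply the local patches listed under "Files Required"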
patch -p0 < ${PATCH0}
patch -p0 < ${PATCH1}
patch -p0 < ${PATCH2}
patch -p0 < ${PATCH3}
export PATH="${GA_INSTALLDIR}/bin:$PATH" # where ga-config exists
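# point Molpro's configure at the temporary GA installation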
CPPFLAGS="-I${GA_INSTALLDIR}/include" \
LDFLAGS="-L${GA_INSTALLDIR}/lib64" \
./configure --prefix=${INSTALLDIR} \
--enable-slater
make -j ${PARALLEL}
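# copy the licence token into place before running the tests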
cp $TOKEN lib/.token
make tuning
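# quick validation first, then the full test suite on 2 MPI processes (-n2)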
MOLPRO_OPTIONS="" make quicktest
MOLPRO_OPTIONS="-n2" make test
make install
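# keep the test jobs and benchmark inputs alongside the installed tree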
cp -a testjobs ${INSTALLDIR}/molpro*
cp -a bench ${INSTALLDIR}/molpro*
Test Results
Failed on the h2o_rvci_dip test. The values of RVCI_DIP(4:6) may be the problem.
SETTING RVCI_TEST(1) = 140.24404946 CM-1
SETTING RVCI_TEST(2) = 2040.72148410 CM-1
SETTING RVCI_TEST(3) = 4309.43539874 CM-1
SETTING RVCI_DIP(1) = 6.6023777D-20
SETTING RVCI_DIP(2) = 6.9439373D-20
SETTING RVCI_DIP(3) = 5.6819299D-19
SETTING RVCI_DIP(4) = 9.6702224D-20
SETTING RVCI_DIP(5) = 1.2687290D-18
SETTING RVCI_DIP(6) = 4.3975422D-19
SETTING RVCI_DIP(7) = 4.5351150D-20
SETTING RVCI_DIP(8) = 3.5013711D-19
SETTING RVCI_DIP(9) = 1.3565481D-20
RVCI_TEST(1:3) = [ 140.24404946 2040.72148410 4309.43539874] CM-1
SETTING RVCIFREQ_REF(1)= 140.24000000
SETTING RVCIFREQ_REF(2)= 2040.72000000
SETTING RVCIFREQ_REF(3)= 4309.44000000
SETTING RVCIDIP_REF(1) = 6.6023777D-20
SETTING RVCIDIP_REF(2) = 6.9439373D-20
SETTING RVCIDIP_REF(3) = 5.6819299D-19
SETTING RVCIDIP_REF(4) = 3.2234075D-20
SETTING RVCIDIP_REF(5) = 4.2290966D-19
SETTING RVCIDIP_REF(6) = 1.3192627D-18
SETTING RVCIDIP_REF(7) = 4.5351150D-20
SETTING RVCIDIP_REF(8) = 3.5013711D-19
SETTING RVCIDIP_REF(9) = 1.3565481D-20
SETTING DRVCI = 0.00460126 CM-1
SETTING DRVCI_DIP = 2.00000008
This error occurs with both Open MPI and Intel MPI. Builds with GCC 11, without parallelism (no -n2 option), and with OpenBLAS also failed this test.
Notes
- (The Open MPI version was built by loading "openmpi/4.1.6/gcc12" instead of the intelmpi module line in the script above.)
- So far, the Intel MPI version has been free from the disk-option problem.
- The Open MPI version occasionally hung during the calculation when the disk option was enabled.
- (Regarding performance, the disk option can be faster or slower than the GA option, depending on the input.)
- On average, the Open MPI version is slightly faster than the Intel MPI version, but it has the disk-option issue described above. We therefore chose the Intel MPI version as the standard version.
- Eigen 3.4 appears to be necessary; we failed to build with Eigen 3.3 (the Rocky Linux rpm).
- MKL + Eigen 3.3 failed at the configure stage; OpenBLAS + Eigen 3.3 failed during compilation.