MPI bug: internal error in: vhdf5_base.F at line: 2270
Posted: Tue Feb 07, 2023 4:03 pm
Dear developers and users,
I am trying to run some energy calculations with VASP/6.2.3 for organic small molecules, and I get the following error when running array jobs. This is the slurm.out file:
Code:
cp: -r not specified; omitting directory 'VASP_AC-2990894-11'
-----------------------------------------------------------------------------
| _ ____ _ _ _____ _ |
| | | | _ \ | | | | / ____| | | |
| | | | |_) | | | | | | | __ | | |
| |_| | _ < | | | | | | |_ | |_| |
| _ | |_) | | |__| | | |__| | _ |
| (_) |____/ \____/ \_____| (_) |
| |
| internal error in: vhdf5_base.F at line: 2270 |
| |
| HDF5 call in vhdf5_base.F:2270 produced error: 1 |
| |
| If you are not a developer, you should not encounter this problem. |
| Please submit a bug report. |
| |
-----------------------------------------------------------------------------
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
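For context, each array task is submitted with a script roughly along the lines of the sketch below; the module name, paths, and resource numbers are placeholders rather than my exact setup, and the real script is in the attached files:

Code:
#!/bin/bash
#SBATCH --job-name=vasp_array
#SBATCH --array=1-20
#SBATCH --ntasks=48
#SBATCH --time=12:00:00

# Placeholder environment setup; the real module name may differ
module load VASP/6.2.3

# Each array task works in its own structure directory, named like
# VASP_AC-<array job id>-<task id> (e.g. VASP_AC-2990894-11 above)
WORKDIR="VASP_AC-${SLURM_ARRAY_JOB_ID}-${SLURM_ARRAY_TASK_ID}"

# Copy the input directory to the run location; the
# "cp: -r not specified" line in the output comes from
# copying a directory without the -r flag
cp -r "${WORKDIR}" "${TMPDIR}/"
cd "${TMPDIR}/${WORKDIR}"

# Launch VASP through MPI
srun vasp_std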
I have run similar job arrays for these structures (Gamma-centred k-point mesh) with different INCAR settings and they completed fine. I have attached the input files used for this batch calculation. Please let me know if you need any further information. Many thanks for all your help and advice.
Best,
Amrita