MPI bug: internal error in: vhdf5_base.F at line: 2270

Problems running VASP: crashes, internal errors, "wrong" results.



amritachatto
Newbie
Posts: 1
Joined: Fri Nov 11, 2022 11:16 am

MPI bug: internal error in: vhdf5_base.F at line: 2270

#1 Post by amritachatto » Tue Feb 07, 2023 4:03 pm

Dear developers and users,

I am trying to run some energy calculations with VASP/6.2.3 for small organic molecules, and I get the following error when running array jobs. This is the slurm.out file:

Code:

cp: -r not specified; omitting directory 'VASP_AC-2990894-11'
 -----------------------------------------------------------------------------
|                     _     ____    _    _    _____     _                     |
|                    | |   |  _ \  | |  | |  / ____|   | |                    |
|                    | |   | |_) | | |  | | | |  __    | |                    |
|                    |_|   |  _ <  | |  | | | | |_ |   |_|                    |
|                     _    | |_) | | |__| | | |__| |    _                     |
|                    (_)   |____/   \____/   \_____|   (_)                    |
|                                                                             |
|     internal error in: vhdf5_base.F  at line: 2270                          |
|                                                                             |
|     HDF5 call in vhdf5_base.F:2270 produced error: 1                        |
|                                                                             |
|     If you are not a developer, you should not encounter this problem.      |
|     Please submit a bug report.                                             |
|                                                                             |
 -----------------------------------------------------------------------------

--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
I have run similar job arrays for these structures (Gamma-centred k-point mesh) with different INCAR values and they worked fine. I have attached the input files used for this batch calculation; the job script follows roughly the pattern sketched below. Please let me know if you need any further information. Many thanks for all your help and advice.
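
For reference, the array part of the job script looks roughly like this (a simplified sketch, not the exact script from the attached files; the resource requests and the scratch path are placeholders):

Code:

#!/bin/bash
#SBATCH --job-name=vasp_array
#SBATCH --array=1-12
#SBATCH --ntasks=40

# One input directory per array element; the naming is assumed to follow
# the VASP_AC-<array job id>-<task id> pattern seen in the slurm.out above.
DIR="VASP_AC-${SLURM_ARRAY_JOB_ID}-${SLURM_ARRAY_TASK_ID}"

# Plain 'cp' skips directories, which is what produces the
# "omitting directory" warning at the top of the slurm.out;
# 'cp -r' copies the whole input directory tree.
cp -r "$DIR" "$SCRATCH/" && cd "$SCRATCH/$DIR"

srun vasp_std > vasp.out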

Best,
Amrita

fabien_tran1
Global Moderator
Posts: 418
Joined: Mon Sep 13, 2021 11:02 am

Re: MPI bug: internal error in: vhdf5_base.F at line: 2270

#2 Post by fabien_tran1 » Wed Feb 08, 2023 3:13 pm

Hi,

We would need more information:
-Have you tried to run this particular array job several times? If so, did it always crash?
-Have you identified which calculation of the array job produces the error message? Is it the one corresponding to the input files that you provided? If so, does it crash when you run it separately (see the example below)?
-You mentioned that other array jobs did not crash. In what respect do those array jobs differ? A different setting in the slurm job script?
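
For instance, assuming the failing element is index 11, as the directory name in your slurm.out suggests, you could resubmit only that element with something like the following (job_script.sh stands for your actual submission script):

Code:

# Re-run a single element of the array job (index 11 assumed here)
sbatch --array=11 job_script.sh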
