aiida-pyscf
AiiDA plugin for the Python-based Simulations of Chemistry Framework (PySCF).
Development status: alpha
Compatible AiiDA versions: >=2.5,<3.0
General information

Registry checks: all checks passed
Plugins provided: 1 calculation, 1 parser, 1 workflow
Entry points
aiida.calculations: pyscf.base
`aiida_pyscf.calculations.base:PyscfCalculation`

`CalcJob` plugin for PySCF.
Inputs:

| Input | Required | Valid types | Description |
|---|---|---|---|
| `code` | yes | `AbstractCode`, `NoneType` | The `Code` to use for this job. Required unless the `remote_folder` input is specified, which means an existing job is being imported and no code will actually be run. |
| `structure` | yes | `StructureData` | Input structure with the molecular structure definition. |
| `checkpoint` | no | `SinglefileData`, `NoneType` | Checkpoint of a previously completed calculation that failed to converge. |
| `metadata` | no | | |
| `monitors` | no | `Dict` | Monitoring functions that can inspect output files while the job is running and decide to prematurely terminate the job. |
| `parameters` | no | `Dict`, `NoneType` | Input parameters used to render the PySCF script template. |
| `remote_folder` | no | `RemoteData`, `NoneType` | Remote directory containing the results of a calculation job already completed without AiiDA. The inputs should be passed to the `CalcJob` as normal, but instead of launching the actual job, the engine recreates the input files and proceeds straight to the retrieve step, where the files of this `RemoteData` are retrieved as if the job had actually been launched through AiiDA. If a parser is defined in the inputs, the results are parsed and attached as output nodes as usual. |

Outputs:

| Output | Required | Valid types | Description |
|---|---|---|---|
| `remote_folder` | yes | `RemoteData` | Input files necessary to run the process will be stored in this folder node. |
| `retrieved` | yes | `FolderData` | Files retrieved by the daemon will be stored in this node. By default the stdout and stderr of the scheduler are added, but more can be specified in `CalcInfo.retrieve_list`. |
| `checkpoint` | no | `SinglefileData` | The checkpoint file in case the calculation did not converge. Can be used as an input for a restart. |
| `cubegen` | no | | |
| `fcidump` | no | `SinglefileData` | Computed fcidump files. |
| `hessian` | no | `ArrayData` | The computed Hessian. |
| `model` | no | `PickledData` | The model in serialized form. Can be deserialized and used without having to run the kernel again. |
| `parameters` | no | `Dict` | Various computed properties parsed from the `FILENAME_RESULTS` output file. |
| `remote_stash` | no | `RemoteStashData` | Contents of the `stash.source_list` option are stored in this remote folder after job completion. |
| `structure` | no | `StructureData` | The optimized structure, if the input parameters contained the `optimizer` key. |
| `trajectory` | no | `TrajectoryData` | The geometry optimization trajectory, if the input parameters contained the `optimizer` key. |

Exit codes:

| Exit status | Message |
|---|---|
| 1 | The process has failed with an unspecified error. |
| 2 | The process failed with legacy failure mode. |
| 10 | The process returned an invalid output. |
| 11 | The process did not register a required output. |
| 100 | The process did not have the required `retrieved` output. |
| 110 | The job ran out of memory. |
| 120 | The job ran out of walltime. |
| 131 | The specified account is invalid. |
| 140 | The node running the job failed. |
| 150 | {message} |
| 302 | The stdout output file was not retrieved. |
| 303 | The results JSON file was not retrieved. |
| 410 | The electronic minimization cycle did not reach self-consistency. |
| 500 | The ionic minimization cycle did not converge for the given thresholds. |
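To illustrate how the inputs above fit together, the sketch below assembles a plain dictionary mirroring the input ports of `PyscfCalculation`. The keys inside `parameters` (`mean_field`, `structure`, the basis and method values) are assumptions about the script-template schema, not confirmed by this page; check the aiida-pyscf documentation for the accepted keys.

```python
# Sketch: assembling inputs for a PyscfCalculation.
# The keys inside `parameters` are assumed, not confirmed from this page.
def build_pyscf_inputs(code, structure):
    """Return a plain dictionary mirroring the input ports listed above."""
    parameters = {
        'structure': {'basis': 'sto-3g'},  # assumed key: basis set choice
        'mean_field': {'method': 'RHF'},   # assumed key: SCF method
    }
    return {
        'code': code,            # required: the `Code` that runs PySCF
        'structure': structure,  # required: molecular `StructureData`
        'parameters': parameters,
        'metadata': {'options': {'resources': {'num_machines': 1}}},
    }
```

With a configured AiiDA profile, a dictionary like this would be passed to the engine, e.g. `aiida.engine.submit(PyscfCalculation, **inputs)`, with `code` and `structure` being actual AiiDA nodes rather than plain values.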
aiida.parsers: pyscf.base
`aiida_pyscf.parsers.base:PyscfParser`
aiida.workflows: pyscf.base
`aiida_pyscf.workflows.base:PyscfBaseWorkChain`

Work chain to run a PySCF calculation with automated error handling and restarts.
Inputs:

| Input | Required | Valid types | Description |
|---|---|---|---|
| `pyscf` | yes | `Data` | |
| `clean_workdir` | no | `Bool` | If `True`, work directories of all called calculation jobs will be cleaned at the end of execution. |
| `handler_overrides` | no | `Dict`, `NoneType` | Mapping where keys are process handler names and values are dictionaries that can define the `enabled` and `priority` keys, used to toggle the values set on the original process handler declaration. |
| `max_iterations` | no | `Int` | Maximum number of iterations the work chain will restart the process to finish successfully. |
| `metadata` | no | | |

Outputs:

| Output | Required | Valid types | Description |
|---|---|---|---|
| `remote_folder` | yes | `RemoteData` | Input files necessary to run the process will be stored in this folder node. |
| `retrieved` | yes | `FolderData` | Files retrieved by the daemon will be stored in this node. By default the stdout and stderr of the scheduler are added, but more can be specified in `CalcInfo.retrieve_list`. |
| `checkpoint` | no | `SinglefileData` | The checkpoint file in case the calculation did not converge. Can be used as an input for a restart. |
| `cubegen` | no | | |
| `fcidump` | no | `SinglefileData` | Computed fcidump files. |
| `hessian` | no | `ArrayData` | The computed Hessian. |
| `model` | no | `PickledData` | The model in serialized form. Can be deserialized and used without having to run the kernel again. |
| `parameters` | no | `Dict` | Various computed properties parsed from the `FILENAME_RESULTS` output file. |
| `remote_stash` | no | `RemoteStashData` | Contents of the `stash.source_list` option are stored in this remote folder after job completion. |
| `structure` | no | `StructureData` | The optimized structure, if the input parameters contained the `optimizer` key. |
| `trajectory` | no | `TrajectoryData` | The geometry optimization trajectory, if the input parameters contained the `optimizer` key. |

Exit codes:

| Exit status | Message |
|---|---|
| 1 | The process has failed with an unspecified error. |
| 2 | The process failed with legacy failure mode. |
| 10 | The process returned an invalid output. |
| 11 | The process did not register a required output. |
| 300 | The calculation failed with an unrecoverable error. |
| 301 | The sub process excepted. |
| 302 | The sub process was killed. |
| 310 | The calculation failed and did not retrieve a checkpoint file from which it can be restarted. |
| 401 | The maximum number of iterations was exceeded. |
| 402 | The process failed for an unknown reason, twice in a row. |
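The restart behaviour described above can be pictured as a bounded retry loop: run the calculation, and if it fails with a recoverable exit status (such as 410, unconverged SCF) restart it from the retrieved checkpoint until `max_iterations` is exhausted. The snippet below is a conceptual sketch of that pattern in plain Python, not the actual `BaseRestartWorkChain` implementation; `run_calculation` is a hypothetical stand-in for launching a `PyscfCalculation`, and the returned codes mirror the exit statuses tabulated above.

```python
def run_with_restarts(run_calculation, max_iterations=5):
    """Conceptual sketch of the work chain's restart loop.

    `run_calculation(checkpoint)` is a hypothetical stand-in for launching
    one PyscfCalculation; it returns a tuple (exit_status, checkpoint).
    """
    RECOVERABLE = {410}  # unconverged SCF can restart from a checkpoint
    checkpoint = None
    for _ in range(max_iterations):
        exit_status, checkpoint = run_calculation(checkpoint)
        if exit_status == 0:
            return 0    # calculation finished successfully
        if exit_status not in RECOVERABLE:
            return 300  # unrecoverable error (cf. exit status 300)
        if checkpoint is None:
            return 310  # no checkpoint to restart from (cf. exit status 310)
    return 401          # maximum number of iterations exceeded
```

In the real work chain this logic is expressed through process handlers, which is why the `handler_overrides` input can enable, disable, or reprioritise individual handlers.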