Update Colvars library to version 2025-04-18

The following is a list of pull requests relevant to LAMMPS in the Colvars repository since 2024-08-06:

- 752 New tool poisson_integrator_conv
  https://github.com/Colvars/colvars/pull/752 (@jhenin)

- 733 Custom grids for all biases
  https://github.com/Colvars/colvars/pull/733 (@giacomofiorin, @jhenin)

- 776 Avoid error in acos and asin with fast-math
  https://github.com/Colvars/colvars/pull/776 (@jhenin)

- 773 fix: fix the clang build test failure of OPES
  https://github.com/Colvars/colvars/pull/773 (@HanatoK)

- 768 fix: clamp the input values of asin and acos in case of fast math on aarch64
  https://github.com/Colvars/colvars/pull/768 (@HanatoK)
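  (a sketch of this clamping approach follows the list below)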

- 761 Add debug code for the Jacobi failure
  https://github.com/Colvars/colvars/pull/761 (@HanatoK)

- 759 min_image fix; saves long runs from crashes
  https://github.com/Colvars/colvars/pull/759 (@PolyachenkoYA)

- 757 Fix MSVC OpenMP issue
  https://github.com/Colvars/colvars/pull/757 (@HanatoK)

- 755 Fix indentation of 'Init CVC' message in standard output
  https://github.com/Colvars/colvars/pull/755 (@jhenin)

- 750 Optimize and simplify the calculation of dihedral gradients
  https://github.com/Colvars/colvars/pull/750 (@HanatoK)

- 749 Add references to new Colvars paper
  https://github.com/Colvars/colvars/pull/749 (@jhenin, @giacomofiorin)

- 740 Report the specific C++ standard at init time, stop warning about C++98/03
  https://github.com/Colvars/colvars/pull/740 (@giacomofiorin)

- 731 Improve detection of hard/mathematical boundaries
  https://github.com/Colvars/colvars/pull/731 (@giacomofiorin)

- 729 Optimize the fit gradients
  https://github.com/Colvars/colvars/pull/729 (@HanatoK, @jhenin)

- 728 Fix undefined behavior when getting the current working directory from std::filesystem
  https://github.com/Colvars/colvars/pull/728 (@giacomofiorin)

- 727 Add patchversion scripting command
  https://github.com/Colvars/colvars/pull/727 (@giacomofiorin)

- 724 Fix gradients and metric functions of distanceDir
  https://github.com/Colvars/colvars/pull/724 (@giacomofiorin)

- 715 Add missing rotation in orientation component
  https://github.com/Colvars/colvars/pull/715 (@giacomofiorin)

- 713 fix: try to solve #87 for non-scalar components
  https://github.com/Colvars/colvars/pull/713 (@HanatoK)

- 709 Implementation of OPES in Colvars
  https://github.com/Colvars/colvars/pull/709 (@HanatoK, @giacomofiorin, @jhenin)

- 706 BUGFIX for Segmentation fault in colvarbias_meta::calc_energy() with useGrids off
  https://github.com/Colvars/colvars/pull/706 (@alphataubio)

- 570 Enable use of CVs defined by PyTorch neural network models
  https://github.com/Colvars/colvars/pull/570 (@zwpku, @giacomofiorin, @HanatoK, @jhenin)
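
PRs 776 and 768 above address the same numerical pitfall: with fast-math
optimizations enabled, a dot product between unit vectors can fall slightly
outside [-1, 1], and passing such a value to std::acos() or std::asin()
yields NaN. A minimal sketch of the clamping idea follows; the helper names
are illustrative only and are not the actual Colvars API:

  #include <algorithm>
  #include <cmath>

  // Hypothetical helpers (not Colvars functions): clamp the argument to the
  // valid domain [-1, 1] so that round-off introduced by fast-math cannot
  // push acos()/asin() outside their domain and produce NaN.
  static inline double safe_acos(double x)
  {
    return std::acos(std::max(-1.0, std::min(1.0, x)));
  }

  static inline double safe_asin(double x)
  {
    return std::asin(std::max(-1.0, std::min(1.0, x)));
  }

Clamping at the call site keeps the rest of the build free to use fast-math
rather than requiring it to be disabled for the whole translation unit.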

Authors: @alphataubio, @EzryStIago, @giacomofiorin, @HanatoK, @jhenin, @PolyachenkoYA, @zwpku

@@ -261,7 +261,6 @@ int colvar::cvc::init_dependencies() {
    require_feature_children(f_cvc_explicit_gradient, f_ag_explicit_gradient);
    init_feature(f_cvc_inv_gradient, "inverse_gradient", f_type_dynamic);
    require_feature_self(f_cvc_inv_gradient, f_cvc_gradient);
    init_feature(f_cvc_debug_gradient, "debug_gradient", f_type_user);
    require_feature_self(f_cvc_debug_gradient, f_cvc_gradient);
@@ -525,7 +524,7 @@ void colvar::cvc::calc_force_invgrads()
void colvar::cvc::calc_Jacobian_derivative()
{
-  cvm::error("Error: calculation of inverse gradients is not implemented "
+  cvm::error("Error: calculation of Jacobian derivatives is not implemented "
             "for colvar components of type \""+function_type()+"\".\n",
             COLVARS_NOT_IMPLEMENTED);
}
@@ -533,8 +532,10 @@ void colvar::cvc::calc_Jacobian_derivative()
void colvar::cvc::calc_fit_gradients()
{
-  for (size_t ig = 0; ig < atom_groups.size(); ig++) {
-    atom_groups[ig]->calc_fit_gradients();
+  if (is_enabled(f_cvc_explicit_gradient)) {
+    for (size_t ig = 0; ig < atom_groups.size(); ig++) {
+      atom_groups[ig]->calc_fit_gradients();
+    }
  }
}