Hi everyone!
I'm running NVT AIMD simulations on pristine tantalum and tungsten cells at several temperatures ranging from 300 K to 1800 K. I've observed that decreasing the Nosé–Hoover thermostat effective mass (SMASS), corresponding to shorter thermal fluctuation periods (e.g., ~20 and ~10 time steps), leads to smaller temperature fluctuations.
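For context, the thermostat-related part of my INCAR looks roughly like this (a sketch with representative values, not my exact inputs):

```
IBRION = 0        ! ab initio molecular dynamics
MDALGO = 2        ! Nosé–Hoover thermostat
SMASS  = 0        ! default: Nosé mass chosen so the oscillation period is ~40 time steps
TEBEG  = 1800     ! target temperature (K)
POTIM  = 2.0      ! time step (fs), representative value
NSW    = 10000    ! number of MD steps, representative value
```

For the runs with shorter periods I lowered SMASS from this default.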
From my understanding, a smaller SMASS increases the coupling strength between the thermostat and the system, which should suppress temperature fluctuations. If that’s the case, why not simply reduce SMASS as much as possible to minimize these fluctuations? How should one determine an appropriate SMASS value for a given simulation? Are there other factors that might contribute to large temperature fluctuations in AIMD?
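For reference, here is how I estimate the baseline temperature fluctuation that the canonical (NVT) ensemble itself predicts, using the standard result σ_T/⟨T⟩ = √(2/N_dof) with N_dof = 3N; the 128-atom cell size below is just an illustrative number, not my actual cell:

```python
import math

def canonical_temp_sigma(temp, n_atoms, dof_per_atom=3):
    """Expected standard deviation of the instantaneous temperature
    in the canonical ensemble: sigma_T / <T> = sqrt(2 / N_dof)."""
    n_dof = dof_per_atom * n_atoms
    return temp * math.sqrt(2.0 / n_dof)

# Example: a hypothetical 128-atom cell at 1800 K
print(canonical_temp_sigma(1800.0, 128))  # roughly 130 K
```

My worry is that if the observed fluctuations drop well below this baseline as I shrink SMASS, the thermostat is no longer sampling the canonical ensemble correctly.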
I’ve attached plots of temperature vs. simulation step for both Ta and W at 1800 K, showing the reduction in temperature fluctuation with decreasing SMASS. I’ve also included the corresponding project directories for the default SMASS (period of ~40 time steps).
Any insight or recommendations would be greatly appreciated!
