[WIP] Update constants to use values from CODATA2022 #5661
base: development

Conversation
Thanks, Dave. If there is a large number of benchmarks that need to be reset, this could be a good opportunity to test our tool Tools/DevUtils/update_benchmarks_from_azure_output.py, and its instructions in our documentation, once again. In theory, I updated and tested the tool manually in #5372. However, it is not tested automatically yet.
Thanks @EZoni ! It worked and was easy to do. BTW, to download the raw log file, I copied the URL from the location bar and pasted it in. Note that almost all of the changes in the benchmarks are small, as expected, ~1.e-9 or smaller. One exception is the test discussed below.
I agree. I think that test has relatively large tolerances anyway, if I remember correctly. @aeriforme, what do you think?
Thanks for pointing this out! I added this hint to our documentation in #5663.
Thanks @dpgrote ! I think that updating constants to CODATA 2022 is a good idea.

I've reviewed the logs of the tests that have failed.

I agree that the issues with test_3d_beam_beam_collision.checksum (and also with test_2d_collision_xz_picmi.checksum) are likely due to the fact that these tests are relatively long.

We need to investigate a bit better the cases test_1d_ohm_solver_ion_beam_picmi and test_3d_qed_schwinger_2, which show non-negligible discrepancies (I will have a look at the QED-related one).
We need to increase the tolerance of several analysis scripts, since they seem to be a bit too strict:

- test_2d_theta_implicit_jfnk_vandb
- test_2d_theta_implicit_jfnk_vandb_filtered
- test_2d_theta_implicit_jfnk_vandb_picmi
- test_2d_theta_implicit_strang_psatd
- test_2d_pec_field_insulator_implicit_restart
- test_3d_particle_boundaries
- test_3d_load_external_field_grid_picmi
- test_3d_particle_fields_diags
- test_3d_reduced_diags
I can provide a possible explanation for the QED-related test. Looking at the analysis script, I found a comment noting that the analysis script uses constants from CODATA 2014 (I think), while PICSAR-QED uses constants from CODATA 2018.
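To get a feel for how much the CODATA mismatch can matter here, the sketch below computes the Schwinger critical field E_s = m_e² c³ / (e ħ), which governs the QED pair-production tests, with constants from both releases. The numerical values are hand-copied CODATA 2014/2018 numbers and should be treated as illustrative, not as the exact values PICSAR-QED uses.

```python
# Schwinger critical field computed with two CODATA releases, to illustrate
# the size of the discrepancy between the analysis script and PICSAR-QED.
# Constant values below are hand-copied and assumed correct for illustration.

C = 299_792_458.0  # speed of light [m/s], exact in both releases


def schwinger_field(m_e, e, hbar):
    """Critical field E_s = m_e^2 c^3 / (e hbar), in V/m."""
    return m_e**2 * C**3 / (e * hbar)


E_2014 = schwinger_field(9.10938356e-31, 1.6021766208e-19, 1.054571800e-34)
E_2018 = schwinger_field(9.1093837015e-31, 1.602176634e-19, 1.054571817e-34)
rel_diff = abs(E_2018 - E_2014) / E_2014

print(f"E_s (CODATA 2014): {E_2014:.6e} V/m")
print(f"E_s (CODATA 2018): {E_2018:.6e} V/m")
print(f"relative difference: {rel_diff:.2e}")
```

The relative shift is tiny (order 1e-8), but that can still exceed checksum tolerances that are tighter than the constants' changes.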
@lucafedeli88 Thanks for looking over this PR. I think a number of the issues are related to the use of scipy.constants in the analysis scripts, since those constants are inconsistent, i.e. still the CODATA 2018 values. Until this issue is resolved, I don't think this PR should be merged. Unfortunately, there is no easy solution. The simplest is probably to use the most recent version of Ubuntu for the CI tests, which ships the most recent scipy with updated constants. A more robust, longer-term solution would be to create a new light-weight Python module that has the constants with values consistent with the ones in C++, and then use it everywhere instead of relying on scipy.constants. This would guarantee that the values are always consistent.
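A minimal sketch of what such a light-weight constants module could look like, assuming CODATA 2022 values (the module name, the symbol names, and the choice of which constants to include are all hypothetical; the measured values are hand-copied and would need to be checked against the C++ header):

```python
# Hypothetical dependency-free constants module (e.g. "warpx_constants.py"),
# with CODATA 2022 values. The four SI defining constants are exact by
# definition; the measured ones are hand-copied CODATA 2022 values.

c = 299_792_458.0           # speed of light in vacuum [m/s] (exact)
q_e = 1.602_176_634e-19     # elementary charge [C] (exact)
h = 6.626_070_15e-34        # Planck constant [J s] (exact)
kb = 1.380_649e-23          # Boltzmann constant [J/K] (exact)
m_e = 9.109_383_7139e-31    # electron mass [kg] (CODATA 2022, measured)
m_p = 1.672_621_925_95e-27  # proton mass [kg] (CODATA 2022, measured)
mu0 = 1.256_637_061_27e-6   # vacuum permeability [N/A^2] (CODATA 2022, measured)
ep0 = 1.0 / (mu0 * c**2)    # vacuum permittivity, derived for internal consistency
```

Deriving ep0 from mu0 and c (rather than listing it independently) keeps the module self-consistent, which is the point of having a single source of truth shared by Python and C++.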
We could maybe discuss this in the upcoming developers' meeting. Using Ubuntu 24.04 in CI tests should be rather straightforward. |
Yes, we can do that soon. There had been discussions about the "need" to have less strict tolerances for the checksums to be compatible with version upgrades like this one. The work done in #5456 has set up things so that we can do that easily (see point "Add logic to reset tolerances based on environment variables" in the follow-up list of that PR description). This said, Ubuntu LTS version upgrades come once every two years, so I personally think that the tolerance fine tuning is not a real roadblock for this particular update, we could simply upgrade the Ubuntu version and reset the checksums that need to be reset. I will get to this, one way or another, as soon as possible. |
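The "reset tolerances based on environment variables" idea from the #5456 follow-up list could look roughly like the sketch below. The variable name and function are assumptions for illustration, not WarpX's actual implementation:

```python
import os


def checksum_tolerance(default_rtol: float) -> float:
    """Return the relative tolerance for a checksum comparison.

    If the (hypothetical) CHECKSUM_RTOL_OVERRIDE environment variable is set,
    it takes precedence over the per-test default, so a CI run for a version
    upgrade can temporarily relax all tolerances at once.
    """
    override = os.environ.get("CHECKSUM_RTOL_OVERRIDE")
    return float(override) if override is not None else default_rtol


# Example: relax tolerances for a constants-update CI run.
os.environ["CHECKSUM_RTOL_OVERRIDE"] = "1e-8"
print(checksum_tolerance(1e-12))  # the override wins over the strict default
```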
Ideally, I thought it could be good to have the Ubuntu upgrade run through first, without prior checksum changes, and record how tests fail, like I did in #5731 (comment), to make it easier to discuss if/when a tolerance upgrade could be appropriate. But let's see if we can still assess this despite the pre-existing checksum changes.
As mentioned in one of the last comments in #5731, we can try to merge the following workaround:

```shell
# remove system copy of Matplotlib to avoid conflict
# with version set in the requirements file - see, e.g.,
# https://github.com/matplotlib/matplotlib/issues/28768
sudo apt remove python3-matplotlib
```
If I'm not mistaken, I still see errors which occur only with older versions of Matplotlib. There is a conflict between the system version (old) and the version set in the requirements file. #5736 does not seem to have resolved the issue, so I think we need the workaround in #5661 (comment), unless you have other solutions in mind.
```diff
@@ -80,7 +80,7 @@
 ## Checks whether this is the 2D or the 3D test
 with open("./warpx_used_inputs") as warpx_used_inputs:
-    is_2D = re.search("geometry.dims\s*=\s*2", warpx_used_inputs.read())
+    is_2D = re.search(r"geometry.dims\s*=\s*2", warpx_used_inputs.read())
```
Here I think it would also be safe to remove the \s*, since warpx_used_inputs is generated automatically and uses only one white space, as in key = value, but up to you. It might make it more readable overall:

```diff
-is_2D = re.search(r"geometry.dims\s*=\s*2", warpx_used_inputs.read())
+is_2D = re.search("geometry.dims = 2", warpx_used_inputs.read())
```
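For context, a quick check that both variants match the single-space form of the generated file (the sample line is assumed from the review comment above). The raw-string prefix matters because "\s" in a plain string triggers Python's invalid-escape DeprecationWarning:

```python
import re

# Form of the line written to warpx_used_inputs, per the review comment.
line = "geometry.dims = 2"

# Raw string avoids the invalid-escape warning for "\s" in a normal string.
flexible = re.search(r"geometry.dims\s*=\s*2", line)
strict = re.search("geometry.dims = 2", line)

print(bool(flexible), bool(strict))  # prints: True True
# Note: the unescaped '.' matches any character, so r"geometry\.dims" would
# be slightly stricter still; both patterns happen to match here either way.
```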
The values of the physical constants were from CODATA 2018. These should be updated to the currently accepted values, as specified in CODATA 2022.

This breaks many CI benchmarks, since the checks are at a higher precision than the changes in the constants.

Note that scipy recently updated its constants to use CODATA 2022 (in version 1.15.0, released January 3). This may cause problems in the CI tests. However, we use Ubuntu 20.04 to run the tests, and there the version of scipy is 1.3.3, which uses constants from CODATA 2014!
These CI tests needed to be updated.
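To illustrate the scale of the benchmark changes mentioned above (~1.e-9), the snippet below compares hand-copied electron-mass values across the three CODATA releases involved; the values are assumptions copied in for illustration:

```python
# Hand-copied electron mass [kg] from three CODATA releases (illustrative).
m_e = {
    "2014": 9.10938356e-31,
    "2018": 9.1093837015e-31,
    "2022": 9.1093837139e-31,
}

# Relative change between consecutive releases used by the various tools.
for old, new in [("2014", "2018"), ("2018", "2022")]:
    rel = abs(m_e[new] - m_e[old]) / m_e[old]
    print(f"CODATA {old} -> {new}: relative change {rel:.1e}")
```

The 2018 → 2022 shift is about 1e-9, consistent with the size of most benchmark changes in this PR; the 2014 → 2018 shift (relevant for the old scipy on Ubuntu 20.04) is roughly an order of magnitude larger.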