Running inversion multiple times gives different results #712
Comments
Thank you for reporting. This behaviour should, of course, not happen. Please make sure it also occurs with the newest version, v1.5.0. Could you please provide the full code and all required files so that we can reproduce the problem?
I don't understand your second point. Everything works well for me.
A core update (1.4 to 1.5) often requires updates to other libraries, which are sometimes not done unless explicitly requested.
For me, both plots are shown in both JupyterLab and VS Code.
It is strange, since I installed the new pygimli version in a new environment using
whereas the output of
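When the conda listing and the running interpreter seem to disagree, it can help to ask the interpreter itself which package versions it actually sees. A minimal sketch using only the standard library (the package names below are just examples, not part of this issue):

```python
from importlib.metadata import version, PackageNotFoundError

def pkg_version(name):
    """Return the installed version string for a package, or None if absent."""
    try:
        return version(name)
    except PackageNotFoundError:
        return None

# Query the interpreter's own environment directly, rather than
# relying on what a conda/pip listing in another shell reports.
for name in ("pygimli", "numpy"):
    print(name, "->", pkg_version(name))
```

Running this inside the same kernel or script that performs the inversion removes any doubt about which environment is active.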
As suggested, changing the parameters and the error part of the inversion still gives inconsistent results, as shown below.
I followed the reference, where I understood
The inverse solver just solves the system of equations, but does not determine the solution. Some of the solvers in my PhD thesis simply do not apply here, and others just incorporate additional weightings. It is all explained there.
As discussed in #689, both values of 0 and 10 make sense. Nevertheless, the results of two runs with the same parameters should be identical, at least within numerical accuracy. We don't know where the difference comes from, but we'll keep digging. Looking at the ratio between two runs, the difference grows as the impact of regularization decreases (low lambda or lambdaFactor, blocky model), i.e. the more we are inverting noise.
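The role of the regularization weight can be illustrated with a small damped least-squares sketch in plain NumPy (this is a stand-alone toy, not pyGIMLi's actual solver): with fixed inputs, a deterministic solve should give bit-identical results on repeated runs, regardless of the value of `lam`.

```python
import numpy as np

def invert(d, G, lam):
    """Damped least squares: minimize ||G m - d||^2 + lam * ||m||^2."""
    A = G.T @ G + lam * np.eye(G.shape[1])
    return np.linalg.solve(A, G.T @ d)

rng = np.random.default_rng(42)            # fixed seed -> reproducible noise
G = rng.normal(size=(50, 20))
m_true = rng.normal(size=20)
d = G @ m_true + 0.1 * rng.normal(size=50)

m1 = invert(d, G, lam=10.0)
m2 = invert(d, G, lam=10.0)                # identical inputs and parameters
print(np.allclose(m1, m2))                 # two runs agree within numerical accuracy
```

If a real inversion fails this kind of two-run comparison, some input (starting model, random jitter, cached state) must differ between the runs.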
Problem description
Your environment
Python 3.9.18 | packaged by conda-forge | (main, Dec 23 2023, 16:29:04) [MSC v.1929 64 bit (AMD64)]
Steps to reproduce
Expected behavior
Running
velInv
should generate the same inverted result every time I run it, but it gives different results. It seems as if the inversion is updating the earlier model without clearing the memory.
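One way to test the "earlier model is carried over" hypothesis is to rebuild the inversion object from scratch for each run and compare. The sketch below uses a toy manager class with an internal starting model (a hypothetical stand-in, not pyGIMLi's API): fresh instances reproduce exactly, while reusing one instance continues from its last model and gives a different result.

```python
import numpy as np

class ToyInversion:
    """Stand-in for an inversion manager; keeps the current model as state."""
    def __init__(self, n):
        self.model = np.ones(n)            # fresh starting model

    def run(self, d, G, lam, iters=3, step=0.5):
        # Damped Newton-style updates starting from the *current* self.model.
        # step < 1 means a single call does not fully converge, so state matters.
        A = G.T @ G + lam * np.eye(G.shape[1])
        for _ in range(iters):
            r = d - G @ self.model
            dm = np.linalg.solve(A, G.T @ r - lam * self.model)
            self.model = self.model + step * dm
        return self.model.copy()

rng = np.random.default_rng(0)
G = rng.normal(size=(30, 10))
d = G @ rng.normal(size=10)

m_fresh1 = ToyInversion(10).run(d, G, lam=1.0)   # new object each run
m_fresh2 = ToyInversion(10).run(d, G, lam=1.0)
print(np.allclose(m_fresh1, m_fresh2))           # identical: no hidden state

inv = ToyInversion(10)
a = inv.run(d, G, lam=1.0)
b = inv.run(d, G, lam=1.0)                       # continues from previous model
print(np.allclose(a, b))                         # differs: state carried over
```

If re-instantiating the manager between runs makes the real results identical, the discrepancy is state carry-over rather than the solver itself.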