Start with the last failed scenario #139

Closed
ndkoval opened this issue Nov 30, 2022 · 7 comments
Comments

@ndkoval
Collaborator

ndkoval commented Nov 30, 2022

When a Lincheck test fails, users usually fix the bug and re-run the test. The same scenario will likely fail again if the bug has not been resolved. To detect that faster, Lincheck could save the number of the failed scenario somewhere and start with it on the next run.
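
For illustration, a minimal sketch of what such persistence could look like; the `FailedIterationStore` object and the file location are hypothetical and not part of Lincheck:

```kotlin
import java.nio.file.Files
import java.nio.file.Path

// Hypothetical helper (not part of Lincheck): persist the index of the last
// failed iteration so the next run can start from it.
object FailedIterationStore {
    private val file: Path = Path.of("build", "lincheck", "last-failed-iteration.txt")

    fun save(iteration: Int) {
        Files.createDirectories(file.parent)
        Files.writeString(file, iteration.toString())
    }

    fun load(): Int? =
        if (Files.exists(file)) Files.readString(file).trim().toIntOrNull() else null
}
```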

@alefedor
Contributor

alefedor commented Nov 30, 2022

A similar issue was raised in #111.

Do you propose that Lincheck should not only report the number of the failed iteration and provide a way to skip the iterations before it, but also save that number on disk for automatic re-running in the future?

@eupp
Collaborator

eupp commented Nov 30, 2022

Perhaps it would make sense to save the scenario itself, not just its number?
Saving a number requires re-running the generator and skipping the unused iterations. It also assumes that the scenario generator is deterministic.

Also, what about saving not only the scenario but also the interleaving that led to the failure (in the case of the model-checking strategy)?

@alefedor
Contributor

Hi @eupp!

I agree. Lincheck has a way to run custom scenarios, but maybe it would make sense to add an interface for running saved scenarios (e.g., running scenarios from Lincheck's text output).

Also it imposes an assumption that scenario generator is deterministic.

The scenario generator should be deterministic because otherwise reproducing errors becomes more complicated. In my opinion, this should be clearly emphasized in the interface by obliging generator constructors to receive (and use) a Random instance. This would also fix the issue with correlated generators if Lincheck passes the same Random to all generators.
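
For illustration, a minimal Kotlin sketch of such a contract; the `SeededGenerator` interface below is hypothetical and not Lincheck's actual parameter generator API:

```kotlin
import kotlin.random.Random

// Illustrative sketch only: every generator draws from an explicitly passed
// Random, so a fixed seed reproduces the same values and all generators share
// a single, decorrelated randomness source.
interface SeededGenerator<T> {
    fun generate(random: Random): T
}

class IntRangeGenerator(private val range: IntRange) : SeededGenerator<Int> {
    override fun generate(random: Random): Int = range.random(random)
}

fun main() {
    val random = Random(42) // the same seed reproduces the same sequence
    val generator = IntRangeGenerator(0..10)
    println(List(5) { generator.generate(random) })
}
```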

Also, what about saving not only the scenario, but also the interleaving that led to failure (in case of model-checking strategy)?

I believe @ndkoval has a prototype for this.

@ndkoval
Collaborator Author

ndkoval commented Nov 30, 2022

As @alefedor has highlighted, it is critical to have a deterministic generator. Given that, it is easier to re-generate the scenario than to serialize and deserialize it, especially with the class-loader magic involved. As for storing the interleaving, it is unclear in which cases it would still work.

I would suggest sticking to the original cheap and efficient solution.

@ndkoval
Collaborator Author

ndkoval commented Mar 16, 2023

For now, I suggest providing an internal API via system properties, which we will later use in the IDEA integration. Let's introduce the following system property: -Dlincheck.startingIteration=<iteration_number>. Please remember that users can provide custom scenarios; we should count them too.
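
For illustration, a sketch of how the proposed property could be consumed on the Lincheck side; the helper functions below are hypothetical, only the property name comes from the proposal above:

```kotlin
// Hypothetical sketch: read the proposed -Dlincheck.startingIteration property
// and skip the iterations before it. User-provided custom scenarios count as
// iterations too, so the starting index applies to the combined sequence.
fun startingIteration(): Int =
    System.getProperty("lincheck.startingIteration")?.toIntOrNull() ?: 0

fun runIterations(totalIterations: Int, runOne: (Int) -> Unit) {
    for (iteration in startingIteration() until totalIterations) {
        runOne(iteration)
    }
}
```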

@eupp
Collaborator

eupp commented Apr 20, 2023

I should warn you that once we have automatic parameter tuning based on testing time (#158),
the problem of scenario generator determinism becomes relevant again.

Imagine the following scenario.

  • The user runs a Lincheck test, but another computation-intensive process is running on their machine. Because of that process, iterations take more time, so the parameter tuning algorithm decides to generate smaller scenarios (with fewer threads and fewer operations per thread). So, for example, iteration #2 generates a scenario of size 2x2.
  • The user runs the same Lincheck test again, but this time there is no competing process in the background. Iterations now take less time, so the parameter tuning algorithm decides to generate larger scenarios. Now iteration #2 generates a different scenario of size 3x3.

The point here is that with automatic parameter tuning, the scenario generation process becomes non-deterministic and dependent on the scenarios' running time (even if the scenario generator itself is deterministic).

@ndkoval
Collaborator Author

ndkoval commented Apr 25, 2023

We have recently discussed that #168 covers the use cases of this issue. As we are not going to provide deterministic scenario generation, the "start with the last failed scenario" feature is no longer needed.

ndkoval closed this as not planned on Apr 25, 2023