wallet_reorgsrestore.py failed #29392

glozow opened this issue on February 6, 2024
  1. glozow commented at 2:06 pm on February 6, 2024: member

    https://cirrus-ci.com/task/5616118744743936

    test  2024-02-06T11:27:14.369000Z TestFramework (ERROR): Assertion failed
                                      Traceback (most recent call last):
                                        File "/ci_container_base/ci/scratch/build/bitcoin-x86_64-pc-linux-gnu/test/functional/test_framework/test_framework.py", line 131, in main
                                          self.run_test()
                                        File "/ci_container_base/ci/scratch/build/bitcoin-x86_64-pc-linux-gnu/test/functional/wallet_reorgsrestore.py", line 75, in run_test
                                          assert_equal(conflicted["confirmations"], -9)
                                        File "/ci_container_base/ci/scratch/build/bitcoin-x86_64-pc-linux-gnu/test/functional/test_framework/util.py", line 57, in assert_equal
                                          raise AssertionError("not(%s)" % " == ".join(str(arg) for arg in (thing1, thing2) + args))
                                      AssertionError: not(-8 == -9)
    

    I wonder if we just need a syncwithvalidationinterfacequeue?
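    Something along these lines in wallet_reorgsrestore.py, right before the check on line 75 (a minimal sketch only; the node index and variable names are approximate, not taken from the test):

    # Sketch: flush the validation interface queue so the wallet has processed
    # all pending block (dis)connect notifications before inspecting the
    # conflicted transaction. Node index and variable names are approximate.
    node.syncwithvalidationinterfacequeue()
    conflicted = node.gettransaction(conflicted_txid)
    assert_equal(conflicted["confirmations"], -9)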

  2. maflcko added the label Wallet on Feb 6, 2024
  3. maflcko added the label CI failed on Feb 6, 2024
  4. maflcko added the label Tests on Feb 6, 2024
  5. araujo88 commented at 5:04 pm on February 8, 2024: contributor

    @maflcko I’ll continue our conversation here since #29395 was closed.

    I’m not sure I understood your idea clearly. Would we add a sleep in the source code and then run the related functional tests to try to force a race condition, or is the issue within the functional tests themselves? Is the source of the issue not yet clear enough for us to pin it down?

    If you can point me in the right direction, I’ll make some tweaks to the source code locally, then post the results here for us to discuss.

  6. maflcko commented at 5:20 pm on February 8, 2024: member

    Isn’t the source of the issue clear

    So far I haven’t seen the source of the issue.

  7. mzumsande commented at 8:05 pm on February 8, 2024: contributor

    If you can point me in the right direction

    I don’t think anyone has actually managed to reproduce the failure (i.e. managed to get the test to fail locally on master in the same way it did in the failed CI run, possibly after putting a sleep somewhere). Or, if you have managed that, you should describe exactly what you did. Therefore, at this point I don’t think anyone knows the right direction, and the main part of solving this issue, which is not necessarily easy, is figuring that out somehow. Coming up with the right fix after that is often trivial in comparison.
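    For example (purely illustrative, since the right place for such a sleep is exactly what is unknown here), one could stretch the gap between two test steps that are suspected to race on the node side, so the unlucky interleaving shows up locally:

    import time

    # Illustrative only: widen the window between two steps that might race,
    # using the framework's generate helper without the usual block sync.
    self.generate(self.nodes[0], 1, sync_fun=self.no_op)   # one step...
    time.sleep(5)                                           # ...widen the suspected window...
    self.generate(self.nodes[2], 1, sync_fun=self.no_op)   # ...before the next step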

  8. araujo88 commented at 8:40 pm on February 8, 2024: contributor

    If you can point me in the right direction

    I don’t think anyone has actually managed to reproduce the failure (i.e. managed to get the test to fail locally on master in the same way it did in the failed CI run, possibly after putting a sleep somewhere). Or, if you have managed that, you should describe exactly what you did. Therefore, at this point I don’t think anyone knows the right direction, and the main part of solving this issue, which is not necessarily easy, is figuring that out somehow. Coming up with the right fix after that is often trivial in comparison.

    Understood. I’ll try to reproduce the failure locally then report back here if I managed to pinpoint it.

  9. mzumsande commented at 8:44 pm on February 12, 2024: contributor

    I found the issue by analyzing the logs. The source of the failure is in an earlier part of the test, at height 205: there is a race at node2 between connecting the block produced by node0 and using -generate to create new blocks itself. In the failed run, the latter happened first, so there are two conflicting blocks at height 205:

    node2 2024-02-06T11:27:13.136590Z [msghand] [validation.cpp:4015] [AcceptBlockHeader] Saw new header hash=00ed47f85a3d5292b6ed6358f1d1fdee61e54e146fd65ed96a6c9b667edbc9f5 height=205

    node2 2024-02-06T11:27:13.176319Z [httpworker.0] [validation.cpp:4015] [AcceptBlockHeader] Saw new header hash=74bed2cf73825d931f48180463d2a58d4113142cdb604e148f04783514be2231 height=205

    As a result, the final block height is one smaller than expected (the block at height 214 is never mined, as it would be in successful runs), which is why the conflicted transaction reports -8 confirmations instead of the expected -9.

    See #29425 for a simple fix.
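    The idea is simply to remove the ordering ambiguity by making sure node2 has connected node0's block before it starts mining its own; roughly (a sketch of the idea, not the literal patch, and the block count is illustrative):

    # Sketch: make node2's view of the chain deterministic before it mines,
    # so it can no longer produce a competing block at the same height.
    self.sync_blocks()                 # wait until the connected nodes agree on the tip
    self.generate(self.nodes[2], 10)   # only then let node2 extend the chain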

  10. fanquake closed this on Feb 13, 2024

  11. fanquake referenced this in commit 3054416f62 on Feb 13, 2024
