This fuzz target is largely the same as the timedata target, but both are kept for now to allow continued testing of each. In the future, the timedata input folder may be removed.
maflcko
commented at 7:06 am on March 19, 2024:
contributor
For git, renaming and creating a copy is the same cost (close to zero). So I think it would be better to create a copy, to allow the old fuzz target and fuzz inputs to continue to exist for a while, as there is only one branch in this repo.
Moreover, better than creating a copy of a single folder would be to pick fuzz inputs from all other folders with the coverage algorithm. This way the cost is still close to zero, but you’ll likely get more coverage.
Finally, the best would be to do the previous step and then, on top, let the fuzz engine run for a bit on an empty folder as well as the populated one, and then merge both results into a fresh folder and submit that.
In any case, probably doesn’t matter for this fuzz target, so lgtm.
stickies-v force-pushed
on Mar 19, 2024
stickies-v renamed this from
[WIP] Rename timedata folder to timeoffsets
to
[WIP] Add timeoffsets inputs folder
on Mar 19, 2024
stickies-v
commented at 12:27 pm on March 19, 2024:
contributor
Thanks for your comprehensive explanation, maflcko.
So I think it would be better to create a copy
Updated my approach to create a copy instead.
Moreover, better than creating a copy of a single folder would be to pick fuzz inputs from all other folders with the coverage algorithm. This way the cost is still close to zero, but you’ll likely get more coverage.
I’d be happy to try this but unfortunately am a bit out of my depth here and could not find relevant instructions in the documentation in this repo, or online. If you have any pointers for me to look at, that would be helpful, but I’m also happy to keep the current approach if that’s not worth the hassle.
maflcko
commented at 12:31 pm on March 19, 2024:
contributor
lgtm
murchandamus
commented at 1:52 pm on March 19, 2024:
contributor
Moreover, better than creating a copy of a single folder would be to pick fuzz inputs from all other folders with the coverage algorithm. This way the cost is still close to zero, but you’ll likely get more coverage.
I’d be happy to try this but unfortunately am a bit out of my depth here and could not find relevant instructions in the documentation in this repo, or online. If you have any pointers for me to look at, that would be helpful, but I’m also happy to keep the current approach if that’s not worth the hassle.
It’s possible to provide multiple source directories to the fuzzer. The first directory you provide is both a source and also the destination of newly added seeds. When you create a new target, you can scrounge up the seeds from all targets to see if any of them improve the coverage of your new target, e.g. like this:
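A sketch of such an invocation, with hypothetical paths (libFuzzer’s -set_cover_merge=1 computes a minimal subset of the supplied inputs that preserves all observed coverage features, and writes any coverage-adding inputs into the first directory):

```shell
# Hypothetical paths: merge coverage-adding seeds from every target's corpus
# into the (initially empty) timeoffsets corpus. The first directory is the
# destination; the remaining directories are read-only seed sources.
FUZZ=timeoffsets src/test/fuzz/fuzz -set_cover_merge=1 \
    qa-assets/fuzz_seed_corpus/timeoffsets \
    qa-assets/fuzz_seed_corpus/*
```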
murchandamus
commented at 2:21 pm on March 19, 2024:
contributor
On trying this, it might be better to go with merge=1 instead of set_cover_merge=1 when sourcing that many seeds for an initial corpus. set_cover_merge keeps running out of memory for me towards the end and progresses at single-digit executions per second, whereas a regular merge progresses much more quickly. The set could then be reduced further via a set_cover_merge afterwards, but that’s probably not necessary for an initial corpus.
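A sketch of the lighter-weight variant, with hypothetical paths (libFuzzer’s -merge=1 copies any input from the source directories that adds coverage into the first directory, without computing a minimal covering set, so it needs far less memory):

```shell
# Hypothetical paths: plain merge from all corpora into the timeoffsets
# corpus. The result can later be minimized with -set_cover_merge=1 once
# the corpus is small enough to process.
FUZZ=timeoffsets src/test/fuzz/fuzz -merge=1 \
    qa-assets/fuzz_seed_corpus/timeoffsets \
    qa-assets/fuzz_seed_corpus/*
```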
murchandamus
commented at 2:37 pm on March 19, 2024:
contributor
With set_cover_merge, I got: MERGE-OUTER: 65 new files with 1945 new features added; 62 new coverage edges
stickies-v force-pushed
on Mar 21, 2024
stickies-v
commented at 2:41 pm on March 21, 2024:
contributor
Thanks for the guidance @murchandamus, that was very helpful. I’ve updated my approach to start with seeds from all targets, then run the fuzzer on an empty dir, and then finally merge both together.
From the existing seeds, I got:
MERGE-OUTER: 109 new files with 5720 new features added; 726 new coverage edges
After merging with a fresh run, I got:
MERGE-OUTER: 37 new files with 277 new features added; 12 new coverage edges
I’ve summarized my actions (and outputs) below.
Starting with an empty timeoffsets dir:
$ FUZZ=timeoffsets src/test/fuzz/fuzz -set_cover_merge=1 -shuffle=0 -prefer_small=1 -use_value_profile=0 ../../bitcoin-core-qa-assets/fuzz_seed_corpus/timeoffsets ../../bitcoin-core-qa-assets/fuzz_seed_corpus/*
...
MERGE-OUTER: successful in 1 attempt(s)
MERGE-OUTER: the control file has 387517107 bytes
MERGE-OUTER: consumed 193Mb (830Mb rss) to parse the control file
MERGE-OUTER: 109 new files with 5720 new features added; 726 new coverage edges
$ FUZZ=timeoffsets src/test/fuzz/fuzz -set_cover_merge=1 -shuffle=0 -prefer_small=1 -use_value_profile=0 ../../bitcoin-core-qa-assets/fuzz_seed_corpus/timeoffsets ../../bitcoin-core-qa-assets/fuzz_seed_corpus/timeoffsets-empty
...
MERGE-OUTER: successful in 1 attempt(s)
MERGE-OUTER: the control file has 4547882 bytes
MERGE-OUTER: consumed 2Mb (82Mb rss) to parse the control file
MERGE-OUTER: 37 new files with 277 new features added; 12 new coverage edges
Add timeoffsets input folder 722b5b4d6a
stickies-v force-pushed
on Mar 21, 2024
stickies-v
commented at 7:21 pm on March 21, 2024:
contributor
One more force-push, thanks for bearing with me here.
Turns out the timeoffsets harness had a memory leak. That’s now fixed, so I’ve recreated the inputs folder using the same process.
murchandamus
commented at 4:51 pm on March 22, 2024:
contributor
Sounds like a good initial set to me.
stickies-v
commented at 9:26 pm on March 22, 2024:
contributor
Would you like me to schedule a cronjob to fuzz that for 12 hours with 12 processors early next week?
This is a metadata mirror of the GitHub repository
bitcoin-core/qa-assets.
Content is generated from a GitHub metadata backup.
generated: 2024-10-30 01:25 UTC