Describe the issue
Can you reliably reproduce the issue?
If so, please list the steps to reproduce below:
In `prevector.h` (lines 345-347):

```cpp
if (capacity() < new_size) {
    change_capacity(new_size + (new_size >> 1));
}
```
This growth formula is prone to overflow of `uint32_t` (the default size type prevector uses) when `new_size` is very large: `new_size + (new_size >> 1)` wraps around, so the requested capacity is silently undersized. A loop that appends a very large number of elements to a prevector will trigger this. `std::vector` does not appear to exhibit problems when dealing with equally large amounts of data.
Expected behaviour
Nothing (or an exception being thrown?)
Actual behaviour
Memory corruption
Screenshots.
What version of bitcoin-core are you using?
bitcoin-0.14.2 self-compiled
Machine specs:
- OS: Linux
- CPU:
- RAM:
- Disk size:
- Disk Type (HD/SDD):
Any extra information that might be useful in the debugging process.
My prevector fuzzer ( https://github.com/guidovranken/bitcoin/blob/fuzzing/fuzzers/fuzzer-prevector.cpp ) can be used to find this. Compile it with the flag `-DSIZETYPE_UINT16_T` so that prevector uses `uint16_t` rather than the default `uint32_t` as its internal size type. Then, instead of having to operate on buffers gigabytes in size to trigger any bugs, buffers of roughly 64K suffice.