`char` -> `uint8_t` in serialization where a sign doesn’t make sense (`char` might be signed/unsigned).
Concept ACK
Explicit is better than implicit.
```diff
@@ -34,7 +34,7 @@ int CAddrInfo::GetNewBucket(const uint256& nKey, const CNetAddr& src, const std:

 int CAddrInfo::GetBucketPosition(const uint256 &nKey, bool fNew, int nBucket) const
 {
-    uint64_t hash1 = (CHashWriter(SER_GETHASH, 0) << nKey << (fNew ? 'N' : 'K') << nBucket << GetKey()).GetCheapHash();
+    uint64_t hash1 = (CHashWriter(SER_GETHASH, 0) << nKey << (fNew ? uint8_t{'N'} : uint8_t{'K'}) << nBucket << GetKey()).GetCheapHash();
```
`char` type?
Not sure what you mean. This tells the compiler to select the `uint8_t` serialization for the ASCII character `N` (or `K`). None of the ASCII characters use the “sign” bit, so their serialization is identical regardless of whether they are serialized as `int8_t`, `char`, `unsigned char`, `uint8_t`, or `signed char`.
I mean that char literals have type `char`. And it is not clear to me whether `uint8_t{'N'}` involves any type conversion.
> None of the ASCII characters use the “sign”-bit…

Right. Is this a philosophical question?
Previously the compiler produced code to serialize the value 78 (which is 'N') by selecting the `char` serialization template. Now, the compiler produces code to serialize the value 78 (which is also 'N') by selecting the `uint8_t` serialization template. Both templates map to the same code, so the binary should be identical (modulo some debug symbols, maybe).
This holds with `gcc` and `clang` at `-O2`.