Dear Conduition,
You did a really nice job! I was wondering: would it be hard to add the different modifications to your implementation?
As for lattice-based schemes and other assumptions, we have also thought about investigating the possibilities there.
With the derivation technique you propose, am I understanding correctly that if the user signs with the hash-based scheme, they would reveal that the different public keys are linked?
I think it is true that limiting the number of signatures is the main optimisation we should look at. But if we use different parameter sets, don't we already lose compatibility with the standardized schemes? And if we already deviate from the standards, why not add the modifications that can save us an extra couple of hundred bytes? As for implementation complexity, this is of course subjective, but I think these modifications are pretty straightforward.
Best,
Mike
On Wed, 10 Dec 2025 at 09:48, Mikhail Kudinov <mkud...@blockstream.com> wrote:

Dear Greg,
Thank you for your feedback. Your points are important, and I appreciate the opportunity to continue the discussion and clarify a few aspects of your response.
On public-key and signature sizes:
My main point was that when we compare with other PQ alternatives (such as lattice-based schemes), we should take their public key sizes into account. For ML-DSA, the public key is more than 1 KB.
On verification costs:
I agree that it is important to consider how the verification cost scales with block size. At the same time, I believe it is still important to highlight the ratio between signature size and verification cost. For certain parameter sets, this ratio is significantly more favorable than for Schnorr signatures; for some it can be more than 10 times better. If we look at the parameter sets in Table 1, we can achieve 4480-byte signatures (under the 2^40 signature limit) with a verification ratio almost 9 times better. I would welcome further feedback here. Specifically, would it be reasonable to choose larger signatures if they offer lower verification costs?
On stateful vs. stateless security:
Regarding your comment, “I think schemes with weakened stateless security could be improved to ~full repeated-use security via statefulness (e.g., grinding the message to avoid revealing signatures that leak key material),” I did not fully grasp your argument. Could you please elaborate?
On combining threshold Schnorr with a PQ scheme:
You mentioned that “there may be advantages to using a threshold Schnorr in series with a single PQ scheme.” My current thinking is that such constructions could already be implemented at the scripting layer; in that sense, users could assemble them without additional opcodes (beyond the PQ signature opcode itself). While I see the potential benefits, I am also worried that such an approach risks introducing loosely-defined security models, which can lead to vulnerabilities.
Best,
Mike
On Tue, 9 Dec 2025 at 19:20, Boris Nagaev <bna...@gmail.com> wrote:

Hi Mikhail, Jonas and all!

> If we look at multi/distributed/threshold-signatures, we find that current approaches either don't give much gains compared to plain usage of multiple signatures, or require a trusted dealer, which drastically limits the use-cases.

I think there's room to explore N/N multiparty computation (MPC) for hash-based signatures. In principle you can secret-share the seed and run MPC to (a) derive the pubkey and (b) do the per-signature WOTS/FORS work, so the chain sees a single normal hash-based signature even though it is the result of cosigning by all N parties. The output stays a single standard-size signature (e.g., a 2-of-2 compressed to one sig), so you save roughly a factor of N versus N separate signatures, but the cost is a heavy MPC protocol to derive the pubkey and to produce each signature. There's no linearity to leverage (unlike MuSig-style Schnorr), so generic MPC is heavy, but it could be interesting to quantify the overhead versus just collecting N independent signatures.
As a small reference point, here's a two-party SHA-256 MPC demo I recently wrote (not PQ-safe, EC-based oblivious transfer, semi-honest): https://github.com/markkurossi/mpc/tree/master/sha2pc . The protocol moves about 700 KB of messages and completes in three rounds while privately computing SHA256(XOR(a, b)) for two 32-byte inputs. The two-party restriction, quantum-vulnerable OT, and semi-honest model could all be upgraded, but it shows the shape of the protocol.

With a malicious-secure upgrade and PQ OT, sha2pc would already be enough for plain Lamport signatures by repeating it 256x2 times. For WOTS-like signatures you'd need another circuit, but the same repo has tooling for arbitrary circuits, and WOTS is just a hash chain, so it is doable; circuit and message sizes should grow linearly with the WOTS chain depth.
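For reference, the function the sha2pc demo evaluates privately can be written in the clear in a few lines. This is only a plaintext sketch of the target computation and of the 2-of-2 XOR sharing of a seed, not the MPC protocol itself:

```python
import hashlib
import secrets

def mpc_target(a: bytes, b: bytes) -> bytes:
    """In-the-clear version of what the MPC computes privately:
    SHA256(XOR(a, b)) for two 32-byte inputs."""
    assert len(a) == 32 and len(b) == 32
    seed = bytes(x ^ y for x, y in zip(a, b))
    return hashlib.sha256(seed).digest()

# Toy 2-of-2 XOR sharing of a signing seed: each party holds one share;
# the MPC would compute the hash of the reconstructed seed without
# either party learning the other's share.
seed = secrets.token_bytes(32)
share_a = secrets.token_bytes(32)
share_b = bytes(x ^ y for x, y in zip(seed, share_a))
assert mpc_target(share_a, share_b) == hashlib.sha256(seed).digest()
```

In the real protocol, of course, neither party ever reconstructs `seed`; the point of the sketch is just to pin down the function being garbled.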
Curious to hear thoughts on whether N/N MPC with hash-based sigs is worth prototyping, and what overhead targets would make it compelling versus plain multisig.

Best,
Boris

--

On Monday, December 8, 2025 at 5:47:49 PM UTC-3 Mikhail Kudinov wrote:

Hi everyone,
We'd like to share our analysis of post-quantum options for Bitcoin, focusing specifically on hash-based schemes. The Bitcoin community has already discussed SPHINCS+ adoption in previous mailing list threads, and we also looked at this option. A detailed technical report exploring these schemes, parameter selections, security analysis, and implementation considerations is available at https://eprint.iacr.org/2025/2203.pdf. This report can also serve as a gentle introduction to hash-based schemes, covering recent optimization techniques. The scripts that support this report are available at https://github.com/BlockstreamResearch/SPHINCS-Parameters .
Below, we give a quick summary of our findings.
We find hash-based signatures to be a compelling post-quantum solution for several reasons. They rely solely on the security of hash functions (Bitcoin already depends on the collision resistance of SHA-256) and are conceptually simple. Moreover, these schemes have undergone extensive cryptanalysis during the NIST post-quantum standardization process, adding confidence in their robustness.
One of the biggest drawbacks is signature size. Standard SPHINCS+ signatures are almost 8 KB. An important observation is that SPHINCS+ is designed to support up to 2^64 signatures; we argue that this bound can be set lower for Bitcoin use-cases. Moreover, there are several optimizations (such as WOTS+C, FORS+C, PORS+FP) of the standard SPHINCS+ scheme that can reduce the signature size even further.
For example, with these optimizations and a bound of 2^40 signatures, we can get signatures of 4036 bytes. For 2^30 signatures we can achieve 3440 bytes, and for 2^20, 3128 bytes, while keeping the signing time reasonable.
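To give a feel for the hash-chain mechanism underlying WOTS-style components of these schemes, here is a toy one-digit Winternitz sketch (illustrative only; parameter names and structure are simplified and not taken from any standard):

```python
import hashlib
import secrets

def H(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def chain(x: bytes, steps: int) -> bytes:
    """Apply the hash function `steps` times in a row."""
    for _ in range(steps):
        x = H(x)
    return x

W = 16  # toy Winternitz parameter: one chain covers one base-16 digit

# Key generation for a single digit: the secret key is the chain start,
# the public key is the chain end after W-1 hash applications.
sk = secrets.token_bytes(32)
pk = chain(sk, W - 1)

def sign_digit(sk: bytes, d: int) -> bytes:
    # Signing digit d reveals the d-th element of the chain.
    return chain(sk, d)

def verify_digit(pk: bytes, d: int, sig: bytes) -> bool:
    # The verifier finishes the chain and compares against the public key.
    return chain(sig, W - 1 - d) == pk

sig = sign_digit(sk, 7)
assert verify_digit(pk, 7, sig)
```

A real WOTS instance signs many digits in parallel and appends checksum chains, since revealing the signature for digit d would otherwise let anyone forge signatures for any d' > d; optimizations like WOTS+C replace the explicit checksum by grinding for messages whose checksum is fixed, which is one source of the size savings above.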
We should not forget that for Bitcoin, it is important that the combined size of the public key and the signature remains small. Hash-based schemes have among the smallest public keys, which can be around 256 bits. For comparison, the ML-DSA pk+sig size is at least 3732 bytes.
Verification cost per byte is comparable to current Schnorr signatures, alleviating concerns about blockchain validation overhead.
As for security targets, we argue that NIST Level 1 (128-bit security) provides sufficient protection. Quantum attacks require not just O(2^64) operations but approximately 2^78 Toffoli depth operations in practice, with limited parallelization benefits.
One of the key design decisions for Bitcoin is whether to rely exclusively on stateless schemes (where the secret key need not be updated for each signature) or whether stateful schemes could be viable. Stateful schemes introduce operational complexity in key management but can offer better performance.
We explored the possibility of using hash-based schemes with Hierarchical Deterministic Wallets. Public (non-hardened) child key derivation does not seem to be efficiently achievable, while hardened derivation is naturally possible for hash-based schemes.
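The reason hardened derivation comes naturally is that a child signing seed can simply be hashed out of the parent secret seed. A minimal sketch, assuming a hypothetical domain-separation tag and a 4-byte index encoding (neither taken from any standard):

```python
import hashlib

def hardened_child_seed(parent_seed: bytes, index: int) -> bytes:
    """Hypothetical hardened derivation for a hash-based scheme: the
    child seed is a hash of the parent *secret* seed and the index, so
    only the holder of the parent secret can derive children (which is
    exactly what 'hardened' means)."""
    # "hbs-hardened" is an illustrative domain-separation tag.
    data = b"hbs-hardened" + parent_seed + index.to_bytes(4, "big")
    return hashlib.sha256(data).digest()

parent = bytes(32)  # toy fixed parent seed; use a random seed in practice
child0 = hardened_child_seed(parent, 0)
child1 = hardened_child_seed(parent, 1)
assert child0 != child1  # distinct indices yield independent-looking seeds
```

Public derivation, by contrast, would require computing a child *public* key from the parent public key alone, and there is no known way to do that efficiently for a hash-based public key, which is itself just a hash root.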
If we look at multi/distributed/threshold-signatures, we find that current approaches either don't offer much gain compared to plain usage of multiple signatures, or require a trusted dealer, which drastically limits the use-cases.
We welcome community feedback on this approach and hope to contribute to the broader discourse on ensuring Bitcoin's long-term security in the post-quantum era. In particular, we are interested in your thoughts on the following questions:
1) What are the concrete performance requirements across various hardware, including low-power devices?
2) Should multiple schemes with different signature limits be standardized?
3) Is there value in supporting stateful schemes alongside stateless ones?
Best regards,
Mikhail Kudinov and Jonas Nick
Blockstream Research
You received this message because you are subscribed to the Google Groups "Bitcoin Development Mailing List" group.
To unsubscribe from this group and stop receiving emails from it, send an email to bitcoindev+...@googlegroups.com.
To view this discussion visit https://groups.google.com/d/msgid/bitcoindev/492feee7-e0da-4d4d-bb7a-e903b321a977n%40googlegroups.com.