BIP-360: QuBit - Pay to Quantum Resistant Hash #1670
Conversation
Interesting (the question of resistance to quantum computing may have resurged lately with the publication of https://scottaaronson.blog/?p=8329, see also https://x.com/n1ckler/status/1839215426091249778).
@cryptoquick Can you begin to write up the sections currently marked as TBD, along with a backwards compatibility section (to describe incompatibilities, severity, and suggest mitigations, where applicable/relevant)? We've begun to reserve a range of BIP numbers for this topic, pending continued progress here.
@cryptoquick ping for an update here. Have you seen https://groups.google.com/g/bitcoindev/c/p8xz08YTvkw / https://github.com/chucrut/bips/blob/master/bip-xxxx.md? It may be interesting to review each other and possibly collaborate.
Most post-quantum algorithms that haven't yet been cracked are lattice-based (unless we want signature sizes of several megabytes). Their mathematical basis does not allow for tweaks the way elliptic curves do; these are different mathematical structures (RSA, for example). The problem with Dilithium and FALCON is that there is no analogue of the generator point G, and even the potential components f and g are secrets. Also, this is not a native function of the clean implementation; I created my own based on an existing function (you can see it via the added memory cleanse; the native code does not clear temporary buffers).
@mraksoll4 I'm not sure I understand the point you're trying to make.
I meant that the capabilities of elliptic curves cannot and should not be directly projected onto post-quantum algorithms. A different approach is needed. BIP32 and the ability to derive public keys from public keys is a unique feature of elliptic curves. Even isogeny-based cryptography, which no longer holds against certain attacks, doesn’t support such functionality. The key difference lies in the underlying mathematical structures. But we can still build a key tree for private keys using abstractions on top of the seed. The problem lies specifically with public keys.
No, it's not. BIP-32 doesn't rely on key tweaking. It just produces entropy (private keys) in a deterministic way. I think you're confused because you're only working from one implementation. For example, this implementation of FALCON would support BIP-32: |
As far as I understand, this also works directly with key pairs. Earlier, I mentioned that we can manipulate the seed to generate child key pairs. The issue lies in the fact that we cannot derive public keys from other public keys.
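For illustration, a minimal sketch of the seed-only derivation being described on both sides: child seeds come from hardened, HMAC-based derivation over private material only, and feed a deterministic keygen. `falcon_keygen` is a hypothetical binding and the 48-byte seed size is an assumption, not something specified anywhere above.

```python
import hmac
import hashlib

def derive_child_seed(parent_seed: bytes, index: int) -> bytes:
    # Hardened-style derivation: only private material (the parent seed)
    # enters the HMAC, so there is no public-from-public derivation,
    # which is exactly the limitation under discussion.
    data = b"\x00" + parent_seed + index.to_bytes(4, "big")
    return hmac.new(parent_seed, data, hashlib.sha512).digest()[:48]

# Any PQ scheme with a deterministic, seed-driven keygen slots in here:
# pk, sk = falcon_keygen(derive_child_seed(master_seed, 0))
```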
I see your point now. For example, using an xpub alone to generate more keys for, say, a watch-only wallet, might not be possible with FALCON. I'll need to think about that.
Yes, but nothing prevents us from using single keys for watch-only wallets. For example, if a user tries to export an xpub for a post-quantum algorithm, we could display a message like: "Post-quantum algorithm [name] does not support xpub keys. You need to explicitly export the key using a command to retrieve it from the descriptor or key cache." Alternatively, we could provide a list of existing keys in a serialized format. For instance: Generate 2000 keys on-demand when the user requests a key list.
There is another option where multiple addresses can be linked to a single key pair through key packing and unpacking using additional parameters, with the index influencing the final address. However, this approach has limited practical value. While it makes it impossible to determine any connection between the addresses before a transaction is conducted, it resembles the concept of non-hardened derivation.
…erns. Refactor language from long-range attack to long-exposure so as to not be confused with the language around block re-org attacks.
Perhaps it would make sense to describe the general output type mechanism, and then to have a separate BIP per signature algorithm to plug in?
That's an interesting idea. Do you think it would make sense to showcase the new transaction structure and commitment scheme, which by default still requires Schnorr signatures, just without the additional signature algorithms? Then the test vectors for this BIP would only cover secp256k1, and we'd split the requirements for specific PQC algorithms out into a separate BIP...? This would have the advantage of simplifying the large sections of the BIP dedicated to signature algorithm selection, and we could have those discussions separately. The only problem is that it really takes the fangs out of BIP-360 as a quantum hardening BIP. It does make the improvement of securing Taproot against long-exposure attacks, so we wouldn't need a separate BIP for, say, Pay to Taproot Hash. Do you think it might make sense to implement just one algorithm here as well, maybe FALCON? Maybe we stipulate that FALCON is provisional and that a more authoritative set of signature algorithms will be provided in a separate BIP? One argument for having a separate P2TRH BIP is that it's a much simpler implementation, since it doesn't require a change to transaction structure and commitment scheme. I've already worked on a draft of that, but it's not polished, and I'm not sure I want to announce it since we might go with an abbreviated BIP-360.
I believe the solution would be to use multithreading for signing and verification to mitigate the slowness of post-quantum algorithms. In my opinion, the most promising algorithms so far are lattice-based ones. Ideally, they should be used in combination with hash-based approaches, such as sponge constructions like Keccak (e.g., SHA-3, SHAKE256). In the case of hybrid public keys, each ECDSA-derived child address generation should automatically trigger a parallel derivation of a corresponding post-quantum public key (if the attestation field method is used). I'm talking about real-world scenarios that are already available now. So, while there might be something new in the future, it's likely that the evolution of post-quantum addresses will follow a similar path as P2SH -> Taproot. Regarding old addresses, there could be a separate rule with flexible timeframes where the attestation field would be required.
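As a rough sketch of the parallel verification being suggested here (`pq_verify` is a hypothetical binding to a post-quantum verifier such as a FALCON implementation; the speedup assumes the native call releases the GIL):

```python
from concurrent.futures import ThreadPoolExecutor

def verify_batch(pq_verify, items, workers=4):
    # items: iterable of (pubkey, message, signature) tuples.
    # Returns True only if every signature verifies.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return all(pool.map(lambda t: pq_verify(*t), items))
```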
Some things are not easy to multithread, such as WASM implementations, like that used by BDK web wallets, so we can't rely on that to address those concerns. SQIsign's performance was particularly terrible, and so it was removed.
Given that the threat Grover's poses is more remote than the threat posed by Shor's, I'd really rather restrict this BIP to only changes to signature algorithms and not to any hash functions used by Bitcoin.
Absolutely not. It's fine to support secp256k1 even in a post-quantum regime so long as it's combined with PQC.
It's more convenient to store a separate descriptor with a separate seed for post-quantum addresses, keeping generation separate; besides security, this is convenient if we later want to add more signature algorithms. In a combined scheme, it also matters what we use as the seed for the quantum algorithm: if we take the keys as they are and convert them to 48 bytes of seed, what's the point? It would be enough for an attacker to crack ECDSA, because the two would be connected. If instead each algorithm has its own keygen with its own seed, connected almost exclusively by derivation paths, that is an additional level of protection. SHA-3 algorithms are less susceptible to SHA-256's problems, and SHAKE256 is simply convenient as a converter for obtaining good cryptographic bytes from combined paths with a base seed.

This is a simplified scheme without scripts; in a real scheme we also include scripts. In essence, the two signature algorithms converge through the use of one path relative to the ECDSA child keys, simultaneously requesting a PQ public key for the attestation field, or a private key when we need a signature. Caching is also possible, but because the keys are huge it should be optional for anyone who needs it; for everyday use we store only the master and the paths associated with it.
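A minimal sketch of that separation, assuming SHAKE256 as the converter and a made-up path scheme: the post-quantum seed is derived from the base seed and a derivation path, never from the ECDSA keys, so breaking ECDSA reveals nothing about the PQ key material.

```python
import hashlib

def pq_seed_from_master(master_seed: bytes, path: str, out_len: int = 48) -> bytes:
    # SHAKE256 as a converter: arbitrary-length cryptographic bytes
    # from the combined base seed and derivation path.
    return hashlib.shake_256(master_seed + path.encode()).digest(out_len)

# e.g. the PQ counterpart of the ECDSA child at index 5 (path scheme hypothetical):
seed = pq_seed_from_master(master_seed=b"\x02" * 64, path="pq/0'/0/5")
```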
@mraksoll4 That makes sense. Just so you're aware, this BIP is for the consensus layer. A separate BIP will be needed to describe wallet behavior in the application layer.
Honestly, this descriptor logic has completely broken my brain... Need to create not just a separate descriptor but an entirely separate seed storage. In principle, there's no major issue in implementing an HMAC-like derivation using SHA3 and Shake256, where it will be necessary to operate with seeds (though FIPS prohibits any form of derivation if certification is required). The optimal version remains the padded Falcon 512 with a fixed signature size; 1024 already seems excessive to me.
Falcon 1024 corresponds to NIST Level V, which corresponds to 256 bits of security. I originally thought that would be analogous to the security level provided by secp256k1, so there would be no regression in security assumptions. Technically that's not true due to Pollard's rho attack (ideally a cryptographer could help demystify this assumption), but the intention to use 256-bit security was clearly there when Satoshi chose the curve.
As far as I remember, Falcon 512 ≈ secp256k1 security as an acceptable minimum ≈ AES-128; that is, for a classical system they are approximately equal, but secp256k1 is vulnerable in a quantum setting, which post-quantum algorithms try to solve. Most likely the choice fell on secp256k1's 256 bits (it seems even slightly less in practice) as a balance between security and performance when Satoshi chose what to use. The primary security relied on the system of change addresses, since it was assumed that no one would reuse the same address and that the address space would be sufficient even considering the change addresses. In fact, when executing a transaction there should be at least two outputs: one for the intended payment and one for the change. However, it turned out that many people needed to reuse the same address.
2. The attestation must include:
* The quantum-resistant public key(s) whose HASH256 concatenated and hashed again matches the <nowiki><hash></nowiki> in the <code>scriptPubKey</code>.
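Read literally, the quoted rule amounts to something like the sketch below, where hash256 is Bitcoin's double SHA-256; the exact key serialization and ordering are assumptions, since the text above doesn't pin them down.

```python
import hashlib

def hash256(data: bytes) -> bytes:
    # Bitcoin-style double SHA-256.
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def p2qrh_hash(pubkeys: list[bytes]) -> bytes:
    # HASH256 each key, concatenate, then hash again; the result must
    # match the <hash> committed in the scriptPubKey.
    return hash256(b"".join(hash256(pk) for pk in pubkeys))
```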
Can you clarify whether all leaves (all public keys) must be revealed or whether it is sufficient to reveal only the leaf hashes (and at least a single public key, I suppose).
Ideally, if an address is being reused, to save space we can store a virtual link to the existing public key if it has already been revealed.
For greater address security, it makes sense to also hash the key from the post-quantum algorithm with SHA-3.
Regarding the level of security, it is possible to consider a system where multi-signature addresses use a more secure version of the algorithm, while regular addresses, where new change addresses can be created, use a lighter version of the algorithm.
Using Falcon as an example:
* Falcon 512 for regular addresses.
* Falcon 1024 for multisig (since such addresses will not change, and here it will also be useful to use a link if the public key has already been revealed).
When I tried to make a test solution, I got stuck on organizing the storage of additional keys in the db. I even had an idea of completely separate storage, up to a separate wallet.dat: when creating a wallet, you choose whether to transition, and the wallet would then only generate post-quantum addresses, accepting the old address types without generating them and handling them all in a legacy format. If you need old address types, you would have to use a separate wallet.dat.
@jonasnick The point of using a Merkle tree was to make it sufficient to reveal only the leaf hashes. However, without committing to multisig semantics in the output, there is a flaw in the security here. So, I think I'll need to add quorum and total bytes to the hash as well. One question is whether this will also need to include a byte for the key type, so that multisigs can be committed to separately based on which signature algorithm is used.
@mraksoll4 I don't think we'll need to go back to using wallet.dat. Keys can be deterministically generated from the seed and then provided to libbitcoinpqc. There won't be full BIP-32 compatibility, but partial support.
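A sketch of the amended commitment described above; the single-byte quorum/total encoding, the field order, and the power-of-two leaf count are all assumptions rather than settled BIP text.

```python
import hashlib

def hash256(data: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    # Plain binary Merkle tree over leaf hashes; assumes len(leaves) is
    # a power of two (the NPOT case is discussed further down).
    level = list(leaves)
    while len(level) > 1:
        level = [hash256(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def multisig_commitment(quorum: int, total: int, key_hashes: list[bytes]) -> bytes:
    # Binding quorum and total into the hash commits the multisig
    # semantics in the output, closing the flaw discussed above.
    return hash256(bytes([quorum, total]) + merkle_root(key_hashes))
```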
We still need to cache keys; for classic descriptors on ECC we cache them. We also need to store the seed somewhere...
Ideally, there would be an algorithmic (pseudocode) description of the algorithm. Right now, it's not clear how exactly the Merkle tree is built. For example, the BIP appears to require revealing all leaves for the Merkle tree instead of using regular Merkle inclusion proofs whose size (for a single element proof) is only logarithmic in the number of elements. Also, does the BIP support a number of public keys that's not a power of two?
> does the BIP support a number of public keys that's not a power of two?
That's a good point. The BIP doesn't specify how NPOT merkle trees should be treated. That will be one scenario I'd want to think through when working on the test vectors. Do you have a suggestion on how that should be handled?
> the BIP appears to require revealing all leaves for the Merkle tree instead of using regular Merkle inclusion proofs whose size (for a single element proof) is only logarithmic in the number of elements.
I'm not sure I understand this. The public keys don't need to be revealed, only the hash of the public key.
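On the non-power-of-two question, one familiar option is what Bitcoin's own block Merkle tree does: duplicate the last hash on odd-length levels. A sketch, with the known caveat noted:

```python
import hashlib

def hash256(data: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def merkle_root_npot(leaves: list[bytes]) -> bytes:
    # Duplicating the last hash, as Bitcoin's block Merkle tree does, has
    # a known duplicate-leaf malleability (CVE-2012-2459); padding with a
    # distinct marker leaf instead would avoid it.
    level = list(leaves)
    while len(level) > 1:
        if len(level) % 2 == 1:
            level.append(level[-1])
        level = [hash256(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]
```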
Bitcoin only achieves 128-bit security in both its curve and its choice of hash function (SHA-256 only has 128-bit security against arbitrary collisions). While I won't comment on Satoshi's intent, I don't believe it's reasonable to seek a 256-bit security level, solely a level expected to at-worst decay to 128-bit.
bip-0360.mediawiki
[https://web.archive.org/web/20240715101040/https://www2.deloitte.com/nl/nl/pages/innovatie/artikelen/quantum-computers-and-the-bitcoin-blockchain.html this Deloitte report]. The report estimates that in 2020 approximately 25% of the Bitcoin supply is held within addresses vulnerable to quantum attack. As of the time of writing, that number is now closer to 20%. Independently, Bitcoin developer Pieter Wuille [https://x.com/pwuille/status/1108085284862713856 reasons] even more addresses might be vulnerable, representing
Just learned this link is now dead, as that developer deleted his account.
Seems to be available through archive.org if needed https://web.archive.org/web/20220531184542/https://twitter.com/pwuille/status/1108085284862713856
Reviewed only the new “Motivation” section so far.
I think it's a good idea to indeed dissociate the output type itself (P2QRH) from the supported post-quantum signature algorithms (FALCON, SPHINCS+, etc.) and to ensure the output type can be committed indifferently in a new witness program, as a new taproot leaf version, or in the future as a g'root / grafroot style construction.
bip-0360.mediawiki
=== Motivation ===
The primary threat to Bitcoin from Cryptoanalytically-Relevant Quantum Computers (CRQCs) is their potential to break
It can be valuable to have a footnote pointing out that a Cryptoanalytically-Relevant Quantum Computer is an object which is only loosely characterized in quantum physics as of today. In the context of this BIP / bitcoin, it could be understood as a hardware-agnostic computer supposed to have the architecture to keep a sufficient number of logical qubits coherent to be able to run Shor's algorithm in an efficient fashion.
For the context of bitcoin, simplifying a bit, I do not think we have to reason further than a basic quantum computer being able to compute Shor's algorithm or Grover's algorithm. Even if the energy consumed starts to matter, especially for comparison w.r.t. miners, I think it matters less from a cryptanalysis viewpoint.
bip-0360.mediawiki
relies on PQC signature algorithms. By adopting PQC, Bitcoin can enhance its quantum resistance without requiring a hard fork or block size increase.
The vulnerability of existing Bitcoin addresses is investigated in
I think it can be valuable to come up with a practical definition of a vulnerable bitcoin address and not only give a quantitative estimate of the exposed scriptpubkeys. E.g. a vulnerable bitcoin address is a scriptpubkey type exposing, as raw bytes in a block, an elliptic curve public key solvable by a run of Shor's algorithm.
Ordinarily, when a transaction is signed, the public key is explicitly stated in the input script. This means that the public key is exposed on the blockchain when the transaction is spent, making it vulnerable to quantum attack until it's mined. One way to mitigate this is to submit the transaction directly to a mining pool, bypassing the mempool.
I think this is a good description of why revealing the public key only to a mining pool does not solve quantum attack exposure. Though this is a simplification: if the mining pool is in open access, the block template has to be distributed to all pool miners as a job (unless you assume a miner only gets a candidate header from the pool, which is SPV-mining). A quantum attacker would just have to register with the mining pool to get a view of the public key.
Agreed. I'll also include a section on why block reorg attacks don't make this a secure solution either.
the authors estimate that a CRQC with 28 million superconducting physical qubits would take 8.3 seconds to calculate a 256-bit key, while a CRQC with 6.9 million physical qubits would take 58 seconds. This implies that a CRQC with 4x as many qubits would be roughly 7 times faster.
</ref>
I don't think the distinction between long-exposure and short-exposure quantum attacks in the bitcoin context is that relevant, for a few reasons:
- Communicating bip-32 xpubs with all their standardized derivation paths is a reality for all kinds of consumer and user wallets. If you give an xpub to a third party, that third party can now exploit it for a quantum attack, even if there is never a coin transfer to the derived address on the public chain.
- This treats the "mempool" as a black box, where in fact there are at least 2 major components: the main memory buffer of transactions (i.e. CTxMempool in Bitcoin Core) and the transaction-relay stack, where inv, getdata, and tx messages flow among nodes. As soon as the tx message leaves the original broadcasting node, the hash-committed spending script (i.e. witnessScript or redeemScript) can be discovered either by a network listener (if no bip324 encryption is respected on the link) or by the peer node, while the transaction might never get into the mempool (e.g. too low fees, not standard, etc).
- As you're pointing out, the objective criterion by which to dissociate short-term and long-term exposure appears to be the "block time", and this sounds more like a chain-versus-mempool distinction. The average block time in bitcoin is ~600 seconds, and while the exact hashrate during a retarget period cannot be measured (finding a block is a kind of random walk), the probabilistic measure of the network hashrate can be linked back to an energy consumption estimate that can be roughly analyzed in equivalence to the energy consumed to run a quantum computer.
So I think a better long-exposure vs short-exposure heuristic definition in the bitcoin world would be grounded on the computational difficulty to mine a block according to the consensus rules.
Hmm. I'm not sure I understand the point you're making. The separation of long-exposure and short-exposure was developed to highlight that keys exposed for a long time are more vulnerable, and additionally, that a more powerful quantum computer is required to carry out short-exposure attacks, as explained in the "short-exposure" ref. This also addresses a point in several papers I've seen that assume a quantum attacker must be able to solve the key within minutes or hours, when in fact many keys are already exposed.
bip-0360.mediawiki
commitments. Specifically, [https://arxiv.org/pdf/quant-ph/0301141 Shor's algorithm] enables a CRQC to solve the Discrete Logarithm Problem (DLP) exponentially faster than classical methods<ref name="shor">Shor's algorithm is believed to need 10^8 operations to break a 256-bit elliptic curve public key.</ref>, allowing the derivation of private keys from public keys—a process referred to as quantum key decryption. Importantly, simply doubling the public
nit: I tried to look up "quantum key decryption" because it struck me as a very confusing term, but I didn't find any references.
I'll change it to, "a process referred to here"
bip-0360.mediawiki
In the distant future, following the implementation of the P2QRH output type in a QuBit soft fork, there will likely be a need for Pay to Quantum Secure (P2QS) addresses. A distinction is made between cryptography that's merely resistant to quantum attack, and cryptography that's secured by specialized quantum hardware. P2QRH is resistant to quantum attack, while P2QS is quantum secure. These will require specialized quantum hardware for signing, while still [https://quantum-journal.org/papers/q-2023-01-19-901/ using public keys that are verifiable via classical means]. Additional follow-on BIPs will be needed to implement P2QS. However, until specialized quantum cryptography hardware is widespread, quantum resistant addresses should be an adequate intermediate solution.
nit: I'd remove that paragraph because it's highly speculative and doesn't affect the design of this BIP as far as I can tell.
I might argue it just needs an update, to capture discussions here and on the mailing list. I'd like to add a rationale for why this is an imperfect solution, and why it doesn't need to be perfect. Signature aggregation for example can be added in a separate BIP and output type.
however is that P2QRH will encode a hash of the public key. This is a significant deviation from how Taproot works by itself, but it is necessary to avoid exposing public keys on-chain where they are vulnerable to attack.
P2QRH uses a 32-byte HASH256 (specifically SHA-256 twice-over) of the public key to reduce the size of new outputs and
Is there a rationale for double SHA-256?
I'm just assuming it could increase the difficulty of running Grover's algorithm. It's hard to know whether that assumption would hold up in practice since solving Grover's for SHA-256 in a quantum circuit is still theoretical. I figured it also couldn't hurt to use the same function as is used for PoW. It just might make things more secure in ways we can't foresee.
I don't see how it would make a difference. It would just result in running SHA256 twice per Grover iteration instead of once. Why not hash 10 times then.
Anyway, in my experience, you will get the same questions about various design choices (like this one) over and over. The best way I know to deal with this to document them early and thoroughly, for example using footnotes as in BIP 341 or BIP 327.
bip-0360.mediawiki
If the parent private key of an extended public key (xpub) is recovered by a CRQC, the attacker also recovers the entire extended private key, whether it uses hardened or unhardened derivation.
This is a bit unclear to me. Do you mean "private key corresponding to an extended public key" (and not the parent of the public key)?
And such an attack not only allows recovering the extended private key (which is trivial, as it's the same as the private key of the extended public key + the chaincode), but more importantly, it allows computing any of the child private keys.
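To spell the attack out, here is BIP-32's non-hardened child-key relation: everything in it except parent_priv is contained in the xpub, so a CRQC that recovers parent_priv from the xpub's public key immediately yields every non-hardened child private key.

```python
import hmac
import hashlib

# secp256k1 group order
N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141

def ckd_priv_nonhardened(parent_priv: int, parent_pub: bytes,
                         chaincode: bytes, index: int) -> int:
    # parent_pub (33-byte compressed) and chaincode both ship with the
    # xpub; index < 0x80000000 for non-hardened derivation.
    il = hmac.new(chaincode, parent_pub + index.to_bytes(4, "big"),
                  hashlib.sha512).digest()[:32]
    return (int.from_bytes(il, "big") + parent_priv) % N
```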
bip-0360.mediawiki
* '''FALCON-1024:'''
* Public Key Length: 1,793 bytes
* Signature Length: 1,280 bytes
* '''SQIsign NIST-V:'''
This proposal uses SQIsign signatures, an isogeny-based quantum-resistant signature scheme, to enhance the protocol's quantum security. While SQIsign has shown promising performance and compact signature sizes, it is worth noting that SQIsign is not part of the NIST PQC standardization process. This choice reflects a consideration of alternatives beyond NIST's selected algorithms, aiming to balance quantum resistance with efficiency and practical implementation constraints. Future analysis and community feedback will determine its suitability for adoption. All other PQ algorithms of Groups 1 & 2 highlighted in the pqcBitcoin repo are confirmed NIST standards and tested.
Greetings all, can we have a virtual clarification meeting? Thanks
This proposal spent several months gathering feedback from the mailing list and from other advisors. It is hopefully polished enough to submit upstream.
Let me know if you have any questions or feedback, and of course feel free to submit suggestions.
Thank you for your time.