Improve auth and ioauth signature format. #170
Conversation
The previous format used a weird non-standard signature scheme. Use the standard DER encoding instead. This is better for interoperability.
Note that this breaks backwards compatibility: yield-generator and sendpayment users need to upgrade. A note on the change: before, the hash to be signed was decorated with useless text, using the Electrum (?) signature scheme.
Please test this before merging, to double-check correctness.
Thanks for the code. We've got a few ideas of stuff that needs fixing that would break backwards compatibility. This PR can be merged (after testing) into the new protocol #171 that we'll all switch to one day.
Thanks for this! Great to see a new person looking at this part of the code. "weird non-standard signature scheme" - Well, I'm not sure about that. ecdsa_sign instead of ecdsa_raw_sign is using the old "Bitcoin Signed Message" Bitcoin Core style of signing with Bitcoin keys (not sure about the "WTF Electrum" comment from VB, that's been in the codebase since the beginning :) ). For example you can see Mike Hearn using the same formatting here: http://plan99.net/~mike/bitcoinj/0.8/src-html/com/google/bitcoin/core/Utils.html#line.476 As for the encoding - I'm a lot more inclined to agree. But I want to research and find out what the origin is of the encoding in encode_sig before switching.
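For readers unfamiliar with the "Bitcoin Signed Message" convention being discussed: the message is wrapped in a magic prefix plus a length before being double-SHA256 hashed, and it is that envelope hash which gets signed. A minimal sketch of the hash construction (function name is mine, not from the codebase; the envelope format itself is the standard Bitcoin Core one):

```python
import hashlib

def _varint(n: int) -> bytes:
    # Minimal Bitcoin varint, enough for short messages (n < 0xfd)
    if n < 0xfd:
        return bytes([n])
    raise ValueError("message too long for this sketch")

def signed_message_hash(message: bytes) -> bytes:
    """Double-SHA256 of the 'Bitcoin Signed Message' envelope.

    The 0x18 byte is the length (24) of the magic string that follows;
    the message itself is also length-prefixed with a varint.
    """
    prefix = b"\x18Bitcoin Signed Message:\n"
    payload = prefix + _varint(len(message)) + message
    return hashlib.sha256(hashlib.sha256(payload).digest()).digest()
```

This "useless decoration" is what lets a signature produced here be pasted into Bitcoin Core's or Electrum's verify-message dialog, since those wallets build exactly the same envelope before verifying.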
So Bitcoin transaction signatures use the DER encoding. In any case, I do not care too much about the decoration (I can revert that in the PR), but I would very much like the signature encoding to change. Why do you want to know the origin of that particular encoding? Is there a downside to using the standard DER encoding?
Yes, it's a fair point that we don't need the 'Bitcoin Signed Message' text; you may have misread my intention, I was merely countering the idea that it was a 'weird nonstandard signature scheme'; I'm trying to say it's a signature scheme used in Bitcoin, see e.g. https://github.com/bitcoin/bitcoin/blob/675d2feffa84a6ffeabac32aeed37f6a7f74bee3/src/main.cpp#L100 Not appropriate here (because not human readable etc.)? I could argue that it would be nice for a person to be able to import the BTC signature into their Bitcoin Core, Electrum, or other wallet, and use the BTC address provided by the counterparty to verify the public key used in the E2E encryption. To do this, the "weird and useless decorating" would suddenly not be so weird and useless. So precisely for the reason you originally mentioned - interoperability - it is not necessarily true that this decoration is useless. You're thinking such an idea is irrelevant because we're not looking at human readable messages; I'm thinking that signing any message which is not a Bitcoin transaction is done using the 'standard' Bitcoin message signature, which includes that text decoration. Re: the encoding ... don't you think it's nice to know what something is before you remove it? :)
OK, so looking at it again, there's nothing fancy or interesting in the encode_sig function, just base64 fixed-length fields. So agreed, let's just use the DER encoding.
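For context on what "the standard DER encoding" means here: an ECDSA signature is a pair of integers (r, s), and DER wraps them as a SEQUENCE of two INTEGERs. A minimal sketch (my own illustration, not code from either library; real implementations also enforce low-s and strict length rules):

```python
def der_encode_sig(r: int, s: int) -> bytes:
    """DER-encode an ECDSA (r, s) pair as SEQUENCE { INTEGER r, INTEGER s }."""
    def der_int(x: int) -> bytes:
        b = x.to_bytes((x.bit_length() + 7) // 8 or 1, "big")
        if b[0] & 0x80:
            # DER INTEGERs are signed: prepend 0x00 so the value stays positive
            b = b"\x00" + b
        return b"\x02" + bytes([len(b)]) + b  # 0x02 = INTEGER tag
    body = der_int(r) + der_int(s)
    return b"\x30" + bytes([len(body)]) + body  # 0x30 = SEQUENCE tag
```

Because the field lengths are explicit in the encoding, any standard ASN.1 or Bitcoin library can parse it, which is the interoperability win the PR description is after.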
Good points on the text decoration. I was not considering that anyone would want to manually verify the auth and ioauth messages. Still not sure it would make sense, but I don't think the text decoration is very important. It is easily added or removed in interop, so either way is fine. The purist in me just thinks that it should go away when it is useless ;)
Sure, that is always good to know. I think I found the reason: https://github.com/bitcoin/bitcoin/blob/v0.7.1/src/key.cpp#L299-L303 It seems to be the compact signing format (65 bytes). I guess having a more compact sig encoding is also valuable. I guess I will start writing a patch to include that format in Haskoin, the Bitcoin signature library I am currently using. Thanks for looking at this PR with critical eyes :) I am closing this, then. Feel free to keep the discussion going though, if you have anything to add.
Yes, like so many things it's a 'can of worms' when you start looking into it more.... I think this might be interesting too: https://github.com/spesmilo/electrum/blob/master/lib/bitcoin.py#L481-L522 Plus I'll try to find out the encoding used by Bitcoin (my guess is ThomasV has tried to make it compatible) (edit: oh you've already had a look, thanks).
This is funny, we each convinced the other of the other's view 😆
There is also something to be said for a simpler encoding function for interop: it can be coded by hand more easily.
Investigating a little further, the answer from karimkorun here is a useful starting point: http://bitcoin.stackexchange.com/questions/10759/how-does-the-signature-verification-feature-in-bitcoin-qt-work-without-a-public I feel like vbuterin hasn't addressed all the details in his encoding; there are supposed to be 4 different possibilities (2 for odd/even y coord, 2 for +/- R value, I think, according to the sec1-v2 doc), and you should also encode the compressed/uncompressed nature of the pubkey. The signatures are to Bitcoin addresses as noted. So yeah, I think we're in agreement that the type of encoding and format don't need to be changed, but it's possible we might need to edit the encode_sig function to bring it in line with 'standards'.
I was wondering about the same thing. I am not really fond of spending time patching pybitcointools. So maybe we should move to the …
Apparently, the compact signature, with the help of the first byte, encodes the public key as well. pybitcointools, however, completely ignores the first byte. I don't know what to make of that.
He gets away with ignoring the first byte because he's ignoring the y-coordinate of the pubkey :) So he doesn't care if the key is compressed or not, and he doesn't care if the y-coordinate is odd/even (+/-). The only thing I didn't figure out (haven't looked into it) is why he doesn't have to check for one of 2 pubkeys; it might be to do with the way he implemented the deterministic signature (which is where you use hmac to generate the k-value/R-value deterministically), maybe that cuts out one of the possibilities. Sorry, been a bit lazy about looking into it further. Obviously it's self-consistent (otherwise we would have seen loads of errors) but possibly not consistent with other implementations.
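To make the header-byte discussion above concrete, here is a sketch of how the 65-byte compact signature splits apart. The layout (header byte = 27 + recovery id, plus 4 if the pubkey is compressed, then 32-byte r and 32-byte s) is the standard Bitcoin Core one; the function name and return shape are my own illustration:

```python
def parse_compact_sig(sig65: bytes):
    """Split a 65-byte compact 'Bitcoin Signed Message' signature.

    Header byte: 27 + recid (0..3), plus 4 if the signing pubkey is
    compressed. The recid is what lets a verifier recover the pubkey
    without being given it, which is the first byte pybitcointools ignores.
    """
    if len(sig65) != 65:
        raise ValueError("compact signature must be 65 bytes")
    header = sig65[0]
    if not 27 <= header <= 34:
        raise ValueError("invalid header byte")
    compressed = header >= 31
    recid = (header - 27) % 4
    r = int.from_bytes(sig65[1:33], "big")
    s = int.from_bytes(sig65[33:65], "big")
    return recid, compressed, r, s
```

The recid encodes exactly the "4 different possibilities" mentioned earlier: which of the candidate curve points recovers the correct public key.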