Inconsistency in `str` type input format:
The `UINT32` type accepts `str` inputs in base-10, whereas the `UINT64` type accepts `str` inputs in base-16 only. This is enforced by the differing `HEX_REGEX` and `.isdigit()` checks in their respective `UInt<bits>.from_value()` methods.
Furthermore, the documentation states that only the `UINT64` type is represented as a `str`, to avoid any loss of precision. This raises the question: why does the `UINT32` type accept `str` inputs at all?
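For illustration, here is a minimal sketch of the two acceptance rules described above. The function names and the regex are hypothetical stand-ins for xrpl-py's internals, not its actual implementation:

```python
import re

# Hypothetical stand-in for xrpl-py's HEX_REGEX check on UInt64 inputs.
HEX_REGEX = re.compile(r"^[A-Fa-f0-9]{1,16}$")

def uint32_from_value(value):
    """Sketch of the UInt32.from_value() rule: ints and base-10 digit strings."""
    if isinstance(value, int):
        return value
    if isinstance(value, str) and value.isdigit():  # base-10 only
        return int(value)
    raise ValueError("UInt32 expects an int or a base-10 str")

def uint64_from_value(value):
    """Sketch of the UInt64.from_value() rule: ints and base-16 strings only."""
    if isinstance(value, int):
        return value
    if isinstance(value, str) and HEX_REGEX.match(value):  # base-16 only
        return int(value, 16)
    raise ValueError("UInt64 expects an int or a base-16 str")

# The same string "10" is read differently by the two types:
print(uint32_from_value("10"))  # 10 (parsed as base-10)
print(uint64_from_value("10"))  # 16 (parsed as base-16)
```

The sketch makes the sharp edge visible: an identical `str` input yields different numeric values depending on which type consumes it.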
Inconsistency in the `.to_json()` method output:
The `UINT32` type returns an `int` in base-10, whereas the `UINT64` type returns a `str` in base-16. While I understand the need for a `str` (to preserve precision), why is base-16 the preferred output format?
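A minimal sketch of the asymmetric serialization described above (again hypothetical helpers; the 16-digit zero-padding of the hex output is an assumption about the wire format, not confirmed here):

```python
def uint32_to_json(value: int) -> int:
    """Sketch of the UInt32.to_json() behavior: a base-10 int comes back."""
    return value

def uint64_to_json(value: int) -> str:
    """Sketch of the UInt64.to_json() behavior: an uppercase base-16 str
    comes back, assumed here to be zero-padded to 16 hex digits."""
    return f"{value:016X}"

# The same numeric value round-trips as two different JSON shapes:
print(uint32_to_json(255))  # 255 (int)
print(uint64_to_json(255))  # "00000000000000FF" (str)
```

Callers therefore need type-aware handling on the consuming side: a `UINT64` field must be parsed with base 16 before it can be compared numerically with a `UINT32` field.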
Differences between xrpl.js and xrpl-py in handling the `UINT64` type
Comparing xrpl-py's `UInt64.from_value()` method with this issue, the JavaScript SDK accepts only hexadecimal strings, whereas the Python SDK accepts both base-10 `int`s and base-16 `str` inputs.
Examples of these inconsistencies can be observed in this branch
This representation format ties back to the internal rippled types. Although it is a thorn in the user experience, fixing it would take enormous effort across multiple codebases and would break backwards compatibility.
I'll close this issue because this change is not a priority right now.