
DateTime.MaxValue Does not Roundtrip #74

Closed
phil000 opened this issue Apr 16, 2015 · 8 comments
Labels
enhancement: request for a feature enhancement

Comments


phil000 commented Apr 16, 2015

Hello,
I've raised this issue before but it is still broken in 5.11.

A class with DateTime as a property set to DateTime.MaxValue does not round trip correctly. My test:

// ===============ARRANGE====================================================
MyClass value = new MyClass() { DateTimeProperty = DateTime.SpecifyKind(DateTime.MaxValue, DateTimeKind.Utc) };

// ===============ACT========================================================
MyClass result = CloneWars.Clone(value);

// ===============ASSERT=====================================================
Assert.IsTrue(value != result);
Assert.IsTrue(value.DateTimeProperty.Ticks == DateTime.MaxValue.Ticks);

//This fails as result.DateTimeProperty loses precision:
//value.DateTimeProperty.Ticks = 3155378975999999999
//result.DateTimeProperty.Ticks = 3155378975999990000
Assert.IsTrue(result.DateTimeProperty.Ticks == DateTime.MaxValue.Ticks);
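The lost digits line up exactly with truncation to whole milliseconds (1 ms = 10,000 ticks). A quick sketch of the arithmetic (illustration only, not part of the original report):

```csharp
using System;

class TickTruncationDemo
{
    static void Main()
    {
        long maxTicks = DateTime.MaxValue.Ticks;          // 3155378975999999999

        // Converting to whole milliseconds and back discards the
        // sub-millisecond part of the tick count.
        long truncated = maxTicks / TimeSpan.TicksPerMillisecond
                       * TimeSpan.TicksPerMillisecond;    // 3155378975999990000

        Console.WriteLine(maxTicks);
        Console.WriteLine(truncated);
    }
}
```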

@yfakariya (Member)

As I described, it is by design, because other language bindings use millisecond epoch time by default.

Although it is not hard to implement a custom serializer that just uses the raw Ticks of DateTime, I may be able to implement a built-in one and expose a configuration option to switch the default serializer to it. Do you really need 100-nanosecond precision?
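A custom serializer along these lines might look like the following. This is a minimal sketch, assuming MsgPack.Cli's MessagePackSerializer&lt;T&gt; base class with PackToCore/UnpackFromCore overrides; the class name and exact API shape are assumptions, not code from this thread:

```csharp
using System;
using MsgPack;
using MsgPack.Serialization;

// Hypothetical custom serializer that stores a DateTime as the Int64
// produced by ToBinary (which encodes both Ticks and Kind), so no
// precision is lost in the round trip.
public class RawTicksDateTimeSerializer : MessagePackSerializer<DateTime>
{
    public RawTicksDateTimeSerializer(SerializationContext ownerContext)
        : base(ownerContext) { }

    protected override void PackToCore(Packer packer, DateTime objectTree)
    {
        packer.Pack(objectTree.ToBinary());
    }

    protected override DateTime UnpackFromCore(Unpacker unpacker)
    {
        return DateTime.FromBinary(unpacker.LastReadData.AsInt64());
    }
}
```

It would then be registered on the serialization context, e.g. via `context.Serializers.Register(...)` (again an assumption; check the MsgPack.Cli documentation for your version).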


phil000 commented Apr 19, 2015

Maybe I'm missing something, but if you carry more (unneeded, unused) precision, how will it affect anything in those environments? Losing precision, on the other hand, is a sure way to introduce subtle, hard-to-discover, hard-to-replicate bugs into applications.
Do we need nanosecond precision? Probably not, but we do have a general expectation that we will not subtly lose precision in data round-tripped through our cache (which uses MsgPack). There are serialization formats where this happens, we know that, but it just seems pretty unnecessary.

As an example, if I put 1.23456789 into msgpack and you give back 1.234567, is it OK to lop off 0.00000089? Will our code still work? It depends on what we're doing, why we're doing it, and the robustness of our code. The safe route, obviously, is not to do this.

As a rule I'd personally:

  1. Avoid loss of any precision from all built-in .NET types as far as humanly possible
  2. Avoid changing the representation of any .NET types

For instance, we put a 'local' DateTime at 4pm into the cache and get back a DateTime at 12am in UTC. Or we put in a DateTimeOffset at 4pm +6:00 and get back something in UTC. Ouch. Maybe it's our poor date handling, sure, but with any other serialization format our code works as we expect it to.
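The representation change described above can be demonstrated with plain .NET types: normalizing a DateTimeOffset to UTC preserves the instant but discards the original offset, so the original representation cannot be recovered. A small illustration (the date is hypothetical):

```csharp
using System;

class RepresentationDemo
{
    static void Main()
    {
        // A wall-clock time of 4 pm at UTC+6.
        var dto = new DateTimeOffset(2015, 4, 19, 16, 0, 0, TimeSpan.FromHours(6));

        // Normalizing to UTC keeps the instant but drops the +06:00 offset.
        var utc = dto.ToUniversalTime();

        Console.WriteLine(dto.Offset);  // 06:00:00
        Console.WriteLine(utc.Offset);  // 00:00:00
        Console.WriteLine(dto == utc);  // True: same instant, different representation
    }
}
```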


phil000 commented Apr 19, 2015

PS: Thank you for the excellent MsgPack. Apart from the unexpected loss of precision and the unexpected change in (date, date time offset) representation, we love it for its speed and size.

@yfakariya (Member)

Thank you. I understand you are saying that losing precision is subtle and should be avoided in the first place. I will provide an option to control the default DateTime and DateTimeOffset handling (it will serialize them 'as is'), but I won't change the default behavior, for backward compatibility's sake. What do you think?

yfakariya added the enhancement label on Apr 20, 2015

phil000 commented Apr 20, 2015

I don't mind how we achieve it. You could either add an option like you suggest, or you could start versioning the format and make it the default for v2.
How do you feel about the various date types changing their representation? Is it intended that everything is normalized to UTC and returned that way? We see that it currently is, but we have been bitten by that a lot so far.

yfakariya added a commit that referenced this issue May 24, 2015
This commit includes:
* Add new serializers which use the DateTime.ToBinary/FromBinary methods.
* Unix-epoch/Native mode switcher and related custom attributes.
* SerializationProvider to support the above.
* Unit tests.

Backward compatibility features will be added via SerializationContext utilities.
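DateTime.ToBinary packs both Ticks and Kind into a single Int64, and FromBinary restores them exactly, which is why serializers built on this pair can round-trip DateTime.MaxValue without loss. A quick sketch (illustration only):

```csharp
using System;

class ToBinaryRoundTripDemo
{
    static void Main()
    {
        var value = DateTime.SpecifyKind(DateTime.MaxValue, DateTimeKind.Utc);

        // ToBinary encodes Ticks and Kind into a single Int64;
        // FromBinary reverses the encoding exactly.
        long encoded = value.ToBinary();
        var restored = DateTime.FromBinary(encoded);

        Console.WriteLine(restored.Ticks == value.Ticks);  // True
        Console.WriteLine(restored.Kind);                  // Utc
    }
}
```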
@yfakariya (Member)

I fixed this issue in 0.6.0-beta2. Please confirm/verify whether it reflects your intention.

  • The default now uses 100-nanosecond, Kind-aware serialization.
  • There is an option to switch back to the previous Unix-epoch-based serialization.

Thanks!
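For anyone who needs the previous behavior, the switch back might look like the following. The property and enum names here are assumptions based on MsgPack.Cli's serialization options and are not quoted from this thread; check the documentation for your exact version:

```csharp
using MsgPack.Serialization;

// Hypothetical configuration: revert to the previous Unix-epoch-based
// DateTime handling instead of the Ticks/Kind-preserving default.
var context = new SerializationContext
{
    DefaultDateTimeConversionMethod = DateTimeConversionMethod.UnixEpoc
};

var serializer = MessagePackSerializer.Get<MyClass>(context);
```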


phil000 commented Jul 28, 2015

I've had a look at this, and the test I provided now passes using MsgPack.Cli 0.6.0-Beta5, without us needing to change anything. I see awesome precision is the default mode.

@yfakariya (Member)

Closed. Please reopen, or open a new issue, for any additional problems or opinions.
