Attributes with defaultValue functions not persisting #9315

Closed
christophersansone opened this issue Apr 2, 2024 · 8 comments · Fixed by #9355

Comments

christophersansone commented Apr 2, 2024

High Level Overview

When an attribute is defined with defaultValue as a function, those specific default values are not serialized. Instead, the defaultValue function is re-executed upon serialization, resulting in serialized data that may not match the actual model data.

Reproduction

https://github.com/christophersansone/ember-data-attr-default-value

Once the application is running...

  • The Attribute Value is the value of the attribute that was generated by the defaultValue method.
  • The Serialized Value is the value of the attribute as it is output by model.serialize().

Expected Result: The two values are equal.
Actual Result: The two values differ.

Bonus:

  • Clicking Re-serialize will call model.serialize() again. The Serialized Value is different each time.
  • You can manually set the attribute by entering a value and clicking Set. Once that occurs, the Attribute Value then equals the specified value, and subsequent serializations output the expected value.

What's Happening?

Given an attribute definition with a dynamic default value, such as:

import Model, { attr } from '@ember-data/model';

export default class Post extends Model {
  @attr('number', { defaultValue: () => Math.round(Math.random() * 10000) })
  likes;
}

When the model is instantiated, the attribute is correctly instantiated:

let post = this.store.createRecord('post');
post.likes; // 1234

But when serializing it, it outputs a new default value every time:

post.likes; // 1234
post.serialize().data.attributes.likes; // 4958
post.serialize().data.attributes.likes; // 2423
post.serialize().data.attributes.likes; // 8463
post.likes; // still 1234

Once the value is manually set, the serialized value stabilizes:

post.set('likes', 9999);
post.serialize().data.attributes.likes; // 9999
post.serialize().data.attributes.likes; // 9999
post.serialize().data.attributes.likes; // 9999

Why It Matters

This example is a minimal, fairly contrived reproduction of the issue, but it has real-world implications.

If the default value is a primitive string, number, boolean, etc., it is typically defined as a static value, e.g.:

@attr('boolean', { defaultValue: false }) isPublished;

When the default value is a complex value such as an object or array, it must be defined as a function so that each model gets its own uniquely instantiated value. In fact, there is a nice little assertion to ensure this is the case.
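
This requirement can be seen outside EmberData entirely. Here is a plain-JS sketch (no EmberData involved) of why object defaults must be built by a function rather than shared as a single literal:

```javascript
// A shared object default is aliased across every record that uses it:
const sharedDefault = [];
const recordA = { items: sharedDefault };
const recordB = { items: sharedDefault };
recordA.items.push('only meant for A');
console.log(recordB.items.length); // 1 — the mutation leaked into B

// A function default produces a fresh value per record:
const makeDefault = () => [];
const recordC = { items: makeDefault() };
const recordD = { items: makeDefault() };
recordC.items.push('only meant for C');
console.log(recordD.items.length); // 0 — isolated
```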

In my usage, I have a custom transform that serializes an array of objects. I instantiate the attribute as an empty array up front to avoid needing to check and instantiate it downstream.

@attr('item-array', { defaultValue: () => [] }) items;

Because of this issue, if I add items to this default array, the items will be present on the attribute, but it will always serialize as an empty array (because it re-executes defaultValue()).
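
Stripped of Ember specifics, the failure mode reduces to the following (plain JS; cacheGetAttr is a stand-in name for what the cache effectively does on each read, not a real EmberData function):

```javascript
const schema = { items: { defaultValue: () => [] } };

// Stand-in for the cache: it keeps no copy of the default, so every
// read of an unset attribute re-invokes defaultValue().
function cacheGetAttr(attr) {
  return schema[attr].defaultValue();
}

const items = cacheGetAttr('items');        // the record instance memoizes this
items.push({ title: 'hello' });             // mutate the default in place
console.log(items.length);                  // 1 — visible on the record
console.log(cacheGetAttr('items').length);  // 0 — serializing re-runs defaultValue
```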

Research

After doing some digging, this is what I was able to put together.

I added a debugger statement in one of my defaultValue methods to confirm that it is called again upon serialization, and to view the call stack at that point.

The culprit appears to be the JSON:API cache. It is hitting this line during serialization, when the expectation is that it should use the value that was already initialized on the attribute.

From what I can tell, it looks like localAttrs should contain all the values that have been modified from the original. In this case, it seems like it should contain these instantiated defaultValue values, as if they were set manually, but it does not.

Versions

├─┬ @ember/[email protected]
│ └── [email protected] deduped
├─┬ [email protected]
│ └── [email protected] deduped
├─┬ [email protected]
│ └─┬ @ember-data/[email protected]
│   └─┬ [email protected]
│     └── [email protected] deduped
├─┬ [email protected]
│ └── [email protected] deduped
├─┬ [email protected]
│ └── [email protected] deduped
├─┬ [email protected]
│ └── [email protected] deduped
├─┬ [email protected]
│ └── [email protected] deduped
└── [email protected]

├─┬ [email protected]
│ └── [email protected] deduped
└── [email protected]

└── [email protected]

First noticed on Ember Data 4.12.3


runspired commented Apr 4, 2024

Roughly speaking these are the system expectations:

  • defaultValues are not considered dirty state (e.g. are not localAttrs)
  • dirty comparisons are made by referential equality checks
  • defaultValues can be included in payloads sent to the server
  • the cache contains only serialized state

Either of the first two expectations trips up the idea of a defaultValue being used to populate state you want to later send back to the API. Due to the first, it won't take part in any patch that gets generated; and due to the second, it won't take part in a patch even if you've mutated it, so long as the reference is the same.
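
The second rule can be sketched in plain JS (changedAttrs below is an invented helper, not EmberData's implementation): an in-place mutation never fails a referential equality check.

```javascript
// Invented helper: report attrs whose local reference differs from remote.
function changedAttrs(remote, local) {
  const changed = {};
  for (const key of Object.keys(local)) {
    if (remote[key] !== local[key]) changed[key] = [remote[key], local[key]];
  }
  return changed;
}

const items = [];
const remote = { items };   // both sides hold the same array reference
const local = { items };
items.push('new entry');    // mutate in place
console.log(Object.keys(changedAttrs(remote, local)).length); // 0 — still "clean"
```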

These constraints go back really, really far in this library, so you are probably wondering: what changed?

We no longer pull the value from the record instance (a mistake of the earlier architecture was that it did). This means we ask the cache for the attr value and the cache does not (and imo should not) retain these values since they are intended to be transient.

Since classic computed properties in ember memoize their value, when we pulled the value from the record instance it would appear stable even though it was not.

So what to do about this? Generally, if the intent is to use defaultValue to initialize missing state during createRecord, you probably should not. Currently, it's better to pass those values into createRecord, have your API return the defaults, or add defaults during serialization.

Which gets us into a second problem: where does defaultValue live?

This was a second mistake of the early architecture: instead of defaultValue being a concern of a transform, it is a concern of the record. Meanwhile, transforms carry a similar early mistake of their own: instead of being a concern of a field, they are a concern of a serializer.

Both of these mistakes are rectified by schema-record:

  • transforms for schema-record are a field concern instead of a serializer concern
  • defaultValue is a transform concern, not a record concern, and produces a serialized value instead of a stateful value

export type Transform<T extends Value = string, PT = unknown> = {
  serialize(value: PT, options: Record<string, unknown> | null, record: SchemaRecord): T;
  hydrate(value: T | undefined, options: Record<string, unknown> | null, record: SchemaRecord): PT;
  defaultValue?(options: Record<string, unknown> | null, identifier: StableRecordIdentifier): T;
};

This change allows us to provide the record instance to the transform and to defaultValue correctly where we couldn't really do so before, and ensures that defaultValues are serializable (currently they usually aren't).
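
As an illustration of that shape, here is a hypothetical transform written against the quoted interface. The name, the JSON-string encoding, and the null arguments are assumptions for illustration, not schema-record's real API:

```javascript
const ItemArrayTransform = {
  // live array -> serialized value that is safe for the cache to hold
  serialize(value, options, record) {
    return JSON.stringify(value);
  },
  // serialized value -> live array handed to the record
  hydrate(value, options, record) {
    return value ? JSON.parse(value) : [];
  },
  // note: returns a *serialized* default, not a stateful one
  defaultValue(options, identifier) {
    return '[]';
  },
};

const live = ItemArrayTransform.hydrate(
  ItemArrayTransform.defaultValue(null, null), null, null
);
live.push({ title: 'first' });
console.log(ItemArrayTransform.serialize(live, null, null)); // '[{"title":"first"}]'
```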

On its own, though, this still wouldn't fix your issue, because defaultValues still won't be considered dirty. However, in schema-record, mutating the value returned by your defaultValue function would result in the value becoming part of the dirty state of the cache, since schema-record is capable of deep-tracking.

In the meantime, I'm open to a hacky fix to stabilize the value. It's not totally clear what the ideal form of that hack would be, though: ultimately we probably don't want to keep a cache of populated defaultValues, and it would be a mistake for defaultValues to cause records to become dirty, since there's a pretty high probability that every record ever loaded from the API would then immediately be in a dirty state.

@christophersansone

@runspired Thanks so much for the thorough explanation and willingness to consider a solution here, even if on the hacky side of things.

The easy solution in user land is to just set the values during store.createRecord. That's what I did to get unblocked, and it gets the job done.

Considering an internal solution: based on the system expectations you outlined, this issue manifests when defaultValue is a function that returns inconsistent output. If defaultValue is defined as a primitive, it works as is: whether you read the value from the model instance or from defaultValue over and over, you get the same output. If a fix were isolated to defaultValues that are functions, it would confine the change to a much smaller problem space, alleviating the concern about a high probability that every record ever loaded from the API is immediately in a dirty state.

That said, to me it seems reasonable to assert that if defaultValue is defined as a function, it is because it needs to output a unique value... and because that value is unique, it should be set on the record instance and in the cache, just as if it were set manually... which therefore results in a dirty state.

If that were the case, then the fix would be to simply store these default value function results within localAttrs.

If it were really a concern that we are considering the record dirty as a result of this, then I guess I would consider defining a separate defaultAttrs variable to store these values. This way, the value can be retrieved as needed without affecting localAttrs and therefore the dirty state. It better adheres to the system expectations you outlined above (i.e. "defaultValues are not considered dirty state"), but it seems like more effort, and it feels like an exception can be made in this case, given that this functionality is already somewhat deprecated.

FWIW, I would say that in my experience, the general rule is that default value functions only run on createRecord. Once the data is persisted, the expectation is that the server returns the persisted value, even if it is still the default. In this case, the record is dirty upon creation but pristine once fetched. Occasionally I may add a new attribute with a default value function to existing records without pre-populating that value on the persisted records, but that is a much smaller use case than run-of-the-mill record creation. So even if a default value function causes a fetched record to be dirty, it seems like it would be a rare occurrence for those who use this pattern, and I don't know that I would notice.

@runspired

So even if a default value function causes a fetched record to be dirty, it seems like it would be a rare occurrence for those that use this pattern, and I don't know that I would notice.

You might not, but we accidentally made it dirty state once before and tons of apps noticed 🙈

It's unfortunately common for folks to use it either for fields the API doesn't always send, doesn't send yet, or to trim down on what gets sent (intentionally leaving off fields from payloads that would match default values)

@christophersansone

Ha, yeah you seemed more concerned than I did about tracking them as dirty... now I know why! 😄

Would a separate defaultAttrs key be worthwhile, allowing the default values to be accessible by the cache after the fact, while not storing them as if they were dirty? Could we simply augment getAttr as follows?

getAttr(...) {
  ...
  else if (cached.defaultAttrs && attr in cached.defaultAttrs) {
    return cached.defaultAttrs[attr];
  } else {
    ...
    const defaultValue = getDefaultValue(attrSchema, identifier, this._capabilities._store);
    if (typeof attrSchema?.options?.defaultValue === 'function') {
      cached.defaultAttrs = cached.defaultAttrs || {};
      cached.defaultAttrs[attr] = defaultValue;
    }
    return defaultValue;
  }
}
It actually seems more elegant than I would have expected. It's nice and isolated, only instantiates defaultAttrs when necessary, and only stores the value when the default value is a function. I imagine it would also need to be cleared on rollback and unload, though.
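
The proposed behavior can be simulated end to end outside EmberData. MiniCache and its schema shape below are invented for illustration (this is not the real cache); the sketch shows a function defaultValue being computed once, memoized in defaultAttrs, and discarded when a value is set:

```javascript
class MiniCache {
  constructor(schema) {
    this.schema = schema;   // invented shape: { attrName: { defaultValue } }
    this.localAttrs = {};
    this.defaultAttrs = {};
  }
  getAttr(attr) {
    if (attr in this.localAttrs) return this.localAttrs[attr];
    if (attr in this.defaultAttrs) return this.defaultAttrs[attr];
    const dv = this.schema[attr] && this.schema[attr].defaultValue;
    const value = typeof dv === 'function' ? dv() : dv;
    if (typeof dv === 'function') this.defaultAttrs[attr] = value; // memoize once
    return value;
  }
  setAttr(attr, value) {
    delete this.defaultAttrs[attr]; // never reuse a possibly-mutated default
    this.localAttrs[attr] = value;
  }
}

const cache = new MiniCache({
  likes: { defaultValue: () => Math.round(Math.random() * 10000) },
});
const first = cache.getAttr('likes');
console.log(cache.getAttr('likes') === first); // true — stable across reads
cache.setAttr('likes', 9999);
console.log(cache.getAttr('likes')); // 9999 — and the memoized default is gone
```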

@runspired

@christophersansone I think thats allowable and we can revisit that and remove it once schema-record allows us to avoid the problem in the first place

@runspired

If we do this, we should probably delete the attr from defaultAttrs if a remoteAttr or localAttr is ever set

@christophersansone

Sounds good. 👍 Want me to take a crack at a PR? I can give it a whirl but may take a day or two to make it happen. I'll make sure current tests pass... but should I write additional tests for this?

We can delete the attr for the sake of cleanliness... but I don't think the value would ever be read if attr in remoteAttrs or attr in localAttrs.

@runspired

@christophersansone go for it! Additional tests would be great. The DX of working in the monorepo is pretty meh, though; if you can't figure it out from the various package.json files etc., feel free to set up some time to pair.

TL;DR: if you make changes to packages, run pnpm install and restart the test server.

For this, tests will go in tests/main. Unsure if there's a great module in there for this already, but creating a new module for defaultValue tests seems ok.

We can delete the attr for the sake of cleanliness... but I don't think the value would ever be read if attr in remoteAttrs or attr in localAttrs.

Because the defaultValue might get mutated (just like you are doing), you want to be sure that if you later get an update from the server or do a rollback, you don't reuse the same value.
