.optional().default() doesn't make property optional in TypeScript type inference #2491

Closed
kevinwolfcr opened this issue Jun 7, 2023 · 7 comments

Comments

@kevinwolfcr

Hello,

I've encountered an issue when trying to use .optional().default() on a Zod schema. Even when a property is marked as .optional(), the inferred TypeScript type requires the property to be present.

Here's a simple example to illustrate the problem:

import { z } from 'zod';

const schema = z.object({
  name: z.string(),
  rootDir: z.string().optional().default("default/path"),
});

type SchemaType = z.infer<typeof schema>;

function processConfig(config: SchemaType) {
  // Function implementation...
}

const config = {
  name: "myApp",
};

processConfig(config); // TypeScript error: 'rootDir' is missing in type '{ name: string; }' 

In this example, TypeScript expects both 'name' and 'rootDir' to be present in the 'config' object, even though 'rootDir' is marked as optional with a default value in the Zod schema.

My expectation would be that, since 'rootDir' is marked as .optional(), it should not be required in the inferred TypeScript type. I would expect this code to compile without errors, and for Zod to apply the default value of 'rootDir' at runtime when it's not provided.
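
For reference, the runtime side does apply the default here; a minimal sketch using the schema and config above:

console.log(schema.parse(config)); // { name: "myApp", rootDir: "default/path" }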

Is this the intended behavior or am I misunderstanding the usage of .optional().default()?

Thanks for your help!

@vtgn

vtgn commented Jun 7, 2023

Hello,

You're right: the problem is caused by the .default method, which does not generate an inferred type including the | undefined case, whether or not the .optional method is applied.

The correct behaviour would be for the .default method to generate an inferred type that includes | undefined.

Here is a simple case to reproduce the problem:

const testSchemaObject = z.object({
  name: z.string().default("default value")
});

/* This inferred type is:
type TestSchema = {
    name: string;
}
which is incorrect and should be
type TestSchema = {
    name: string | undefined;
}
*/
export type TestSchema = z.infer<typeof testSchemaObject>;

console.log(testSchemaObject.parse({})); //=> no problem, it displays "{ name: "default value" }"
console.log(testSchemaObject.parse({ name: undefined })); //=> no problem, it displays "{ name: "default value" }"

const ts: TestSchema = {}; // => TypeScript error: the "name" field is mandatory

So for now, you can't use the inferred type to type the input objects you want to parse with your schema. :/

Regards.

@scotttrinh
Collaborator

This is working as intended, but perhaps worth calling out in the docs. default causes the value to always be present: if it isn't provided, the default value is used. infer gets the type of the output value; in fact, it is an alias of output. If you need the type of a valid input, there is an input type helper as well.
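
For illustration, a minimal sketch of the two helpers applied to the schema from the original post:

import { z } from 'zod';

const schema = z.object({
  name: z.string(),
  rootDir: z.string().optional().default("default/path"),
});

// Output type (what parse() returns): rootDir is always present.
type SchemaOutput = z.output<typeof schema>; // { name: string; rootDir: string }

// Input type (what parse() accepts): rootDir may be omitted.
type SchemaInput = z.input<typeof schema>; // { name: string; rootDir?: string | undefined }

const input: SchemaInput = { name: "myApp" };     // compiles: rootDir is optional in the input type
const output: SchemaOutput = schema.parse(input); // output.rootDir === "default/path"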

@vtgn

vtgn commented Jun 7, 2023

This is working as intended, but perhaps worth calling out in the docs. default causes the value to always be present: if it isn't provided, the default value is used. infer gets the type of the output value; in fact, it is an alias of output. If you need the type of a valid input, there is an input type helper as well.

Oh ok. I've always used the inferred type to type my inputs and have never had any problem, because I've never used the .default method.

So creating two types like this resolves the problem:

/* This input type is:
type TestInputSchema = {
    name?: string | undefined;
}
*/
export type TestInputSchema = z.input<typeof testSchemaObject>;

/* This output type is:
type TestOutputSchema = {
    name: string;
}
*/
export type TestOutputSchema = z.infer<typeof testSchemaObject>;
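
A minimal usage sketch of these two types, assuming the same testSchemaObject as above:

const input: TestInputSchema = {}; // compiles: name may be omitted in the input type
const output: TestOutputSchema = testSchemaObject.parse(input); // { name: "default value" }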

It would be clearer if the documentation explained z.input and z.output directly for all cases. z.infer should only be mentioned as an alias for z.output and used only when the developer is sure the inferred input and output types are the same... or just deprecate z.infer to simplify.

Thanks for the information.

Regards.

@kevinwolfcr
Author

This is working as intended, but perhaps worth calling out in the docs. default causes the value to always be present: if it isn't provided, the default value is used. infer gets the type of the output value; in fact, it is an alias of output. If you need the type of a valid input, there is an input type helper as well.

Thanks for calling that out. I didn't know z.input existed before this. So this is working now:

const schema = z.object({
  name: z.string(),
  rootDir: z.string().default("default/path"),
})

function processConfig(config: z.input<typeof schema>) {
  // Function implementation...
}

processConfig({ name: "myApp" })
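
A possible body for processConfig (the body below is an assumption, not part of the original snippet): parsing the input applies the default, so the rest of the function can work with the output type.

function processConfig(config: z.input<typeof schema>) {
  // schema.parse returns z.output<typeof schema>, with rootDir filled in
  const resolved = schema.parse(config)
  console.log(resolved.rootDir) // "default/path" when the caller did not provide it
}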

@huksley

huksley commented Sep 5, 2024

Could someone clarify how to get a proper inferred type (and a schema typed with it) in this situation?
I have a property which has a default (I believe this is quite common), and the type inference fails with: Error: Type 'number | undefined' is not assignable to type 'number'

const userSchema = z.object({ page: z.number().default(1) });
export type User = z.infer<typeof userSchema>;
// Type 'number | undefined' is not assignable to type 'number'.
const t: ZodSchema<User> = userSchema;

https://gist.github.com/huksley/7d8c800d22cf5fa423e277ddc745a92f

@vtgn

vtgn commented Sep 5, 2024

Could someone clarify how to get a proper inferred type (and a schema typed with it) in this situation? I have a property which has a default (I believe this is quite common), and the type inference fails with: Error: Type 'number | undefined' is not assignable to type 'number'

const userSchema = z.object({ page: z.number().default(1) });
export type User = z.infer<typeof userSchema>;
// Type 'number | undefined' is not assignable to type 'number'.
const t: ZodSchema<User> = userSchema;

https://gist.github.com/huksley/7d8c800d22cf5fa423e277ddc745a92f

Hi!
scotttrinh gave the solution above; please read the comments carefully next time: use z.input instead of z.infer.
There are two different types that can be built from a schema, the input one and the output one (infer is an alias for output):

  • the input type is for typing the objects before you parse them with the schema.
  • the output one is for typing the objects returned by parsing with the schema. This type is generally the same as the input one, except when the schema modifies the input object during parsing, which is particularly the case when you use a default() statement, because it means a value will always be set once parsing finishes.

To use the types correctly, always use them like this:

const mySchema = ...;

type MySchemaInput = z.input<typeof mySchema>;
type MySchemaOutput = z.output<typeof mySchema>;

const myInputObject: MySchemaInput = ...;

const myOutputObject: MySchemaOutput = mySchema.parse(myInputObject);
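
Applied to the userSchema from the question, a minimal sketch:

const userSchema = z.object({ page: z.number().default(1) });

const userInput: z.input<typeof userSchema> = {};                      // compiles: page may be omitted
const user: z.output<typeof userSchema> = userSchema.parse(userInput); // { page: 1 }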

@huksley

huksley commented Sep 9, 2024

@vtgn Thank you! I iterated on your and @scotttrinh's detailed explanations and went with the following: I don't expose separate "input" and "output" types, as that generates too many types (if you have many Zod schemas); instead, I define a schema type and derive types from it as needed.

export type UserSchema = ZodSchema<
  z.output<typeof userSchema>,
  ZodTypeDef,
  z.input<typeof userSchema>
>;
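
With all three type parameters supplied like this, the assignment from the earlier comment type-checks; a minimal sketch assuming the same userSchema as before:

const t: UserSchema = userSchema; // OK: the schema's input and output types now both match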
