
[WIP] Pydantic Plugin V2 #2577

Draft · wants to merge 48 commits into base: master

Conversation


@Future-Outlier Future-Outlier commented Jul 10, 2024

Tracking issue

flyteorg/flyte#5033

Notes

  1. We can customize serialization behavior here:
    https://docs.pydantic.dev/latest/concepts/serialization/#custom-serializers
  2. We can see how deserialization from a JSON string to Pydantic objects works here:
    https://stackoverflow.com/questions/67621046/initializing-a-pydantic-dataclass-from-json
  3. We can migrate our code from Pydantic V1 to V2 with this tool:
    https://github.com/pydantic/bump-pydantic
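As a quick illustration of note 1, a field serializer/validator pair can round-trip a non-Pydantic wrapper type through JSON. This is a minimal sketch; `PathLike` is a hypothetical stand-in for a Flyte type, not flytekit's actual API:

```python
from pydantic import BaseModel, field_serializer, field_validator

class PathLike:
    """Hypothetical stand-in for a Flyte-style type wrapping a path string."""
    def __init__(self, path: str):
        self.path = path

class Model(BaseModel):
    model_config = {"arbitrary_types_allowed": True}
    ff: PathLike

    @field_serializer("ff")
    def _serialize_ff(self, v: PathLike) -> dict:
        # Custom serializer: emit a JSON object instead of failing on the raw class
        return {"path": v.path}

    @field_validator("ff", mode="before")
    @classmethod
    def _validate_ff(cls, v):
        # Custom "before" validator: rebuild PathLike from the parsed JSON dict
        return PathLike(v["path"]) if isinstance(v, dict) else v

m = Model(ff=PathLike("s3://my-s3-bucket/a/example.txt"))
s = m.model_dump_json()
m2 = Model.model_validate_json(s)
```

The same hooks are what a Flyte type transformer plugin could attach to `FlyteFile`/`FlyteDirectory` fields so that `model_validate_json` rebuilds real objects instead of plain dicts.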

Deserialization

from pydantic import BaseModel
from flytekit.types.file import FlyteFile
from flytekit.types.structured import StructuredDataset
from typing import Optional

class CustomType(BaseModel):
    value: int
    field_name: str

class MyModel(BaseModel):
    my_field: CustomType
    ff: Optional[FlyteFile] = None
    sd: Optional[StructuredDataset] = None

jsonstr = '{"my_field":{"value":1,"field_name":"custom name"}, "ff":{"path": "s3://my-s3-bucket/a/example.txt"}, "sd": {"uri": "sd-uri", "file_format": "csv"}}'
model = MyModel.model_validate_json(jsonstr)
# my_field=CustomType<1 'custom name'> ff=s3://my-s3-bucket/a/example.txt sd=StructuredDataset(uri='sd-uri', file_format='csv')

Why are the changes needed?

What changes were proposed in this pull request?

How was this patch tested?

Example with local and remote execution

from dataclasses import dataclass
import os
import typing
from typing import List, Optional

import pandas as pd
from pydantic import BaseModel

from flytekit import task, ImageSpec, workflow, kwtypes
from flytekit.types.directory import FlyteDirectory
from flytekit.types.file import FlyteFile
from flytekit.types.schema import FlyteSchema
from flytekit.types.structured import StructuredDataset

key = "c9c51284bbf90270139cbcca95bbb98a3c91b9fe"
flytekit_dev_version = f"https://github.com/flyteorg/flytekit.git@{key}"
pydantic_dev_version = f"git+https://github.com/flyteorg/flytekit.git@{key}#subdirectory=plugins/flytekit-pydantic_v2"

image = ImageSpec(
    registry="localhost:30000",
    # registry="futureoutlier",
    apt_packages=["git"],
    packages=[
        pydantic_dev_version,
        f"git+{flytekit_dev_version}",
        "pandas",
    ],
    builder="default",
    # builder="envd",
)

@dataclass
class DC:
    a: int
    b: str
    c: Optional[FlyteFile] = None

TestSchema = FlyteSchema[kwtypes(some_str=str)]

CsvFile = FlyteFile[typing.TypeVar("csv")]
SvgDir = FlyteDirectory[typing.TypeVar("svg")]

class CustomType(BaseModel):
    """Custom type that stores the field it was used in."""

    value: int
    field_name: str
    ff: Optional[FlyteFile] = None
    fd: Optional[FlyteDirectory] = None


    def __repr__(self):
        return f"CustomType<{self.value} {self.field_name!r}>"

# Note: we use `__get_pydantic_core_schema__` to construct the schema
class MyModel(BaseModel):
    ct: CustomType
    ff: Optional[FlyteFile] = None
    ff_csv: Optional[CsvFile] = None
    fd: Optional[SvgDir] = None
    fsc: Optional[TestSchema] = None
    sd: Optional[StructuredDataset] = None
    dc: Optional[DC] = None
    list_int: Optional[List[int]] = []

@task(container_image=image)
def t1() -> MyModel:
    # Create a local directory
    dir_path = "./build"
    os.makedirs(dir_path, exist_ok=True)
    file_path = os.path.join(dir_path, "flytedir_example.txt")
    with open(file_path, "w") as f:
        f.write("FlyteDirectory content")

    # write a txt file
    file_path = "./build/local_example.txt"
    os.makedirs(os.path.dirname(file_path), exist_ok=True)
    with open(file_path, "w") as f:
        f.write("Default content")
    # write a csv file
    data = {
        'Column1': ['Row1-Col1', 'Row2-Col1', 'Row3-Col1'],
        'Column2': ['Row1-Col2', 'Row2-Col2', 'Row3-Col2'],
        'Column3': ['Row1-Col3', 'Row2-Col3', 'Row3-Col3'],
    }

    # Create DataFrame
    df = pd.DataFrame(data)

    # Save DataFrame to CSV
    csv_path = './build/example.csv'
    df.to_csv(csv_path, index=False)

    # For FlyteSchema
    schema = TestSchema()
    df = pd.DataFrame(data={"some_str": ["a", "b", "c"]})
    opened_schema = schema.open()
    opened_schema.write(df)

    # StructuredDataset
    df = pd.DataFrame({"name": ["Tom", "Joseph"], "age": [20, 22]})


    m = MyModel(
        ct=CustomType(
            value=1,
            field_name="my_field",
            ff=FlyteFile(file_path),
            # fd=FlyteDirectory(path=flytekit.current_context().working_directory)
            fd=FlyteDirectory(dir_path),
        ),
        ff=FlyteFile("s3://my-s3-bucket/a/example.txt"),
        ff_csv=CsvFile(csv_path),
        fd=SvgDir("s3://my-s3-bucket/a/"),
        fsc=schema,
        sd=StructuredDataset(dataframe=df),
        # sd=StructuredDataset(uri="s3://my-s3-bucket/a/example.txt", file_format="txt"),
        dc=DC(a=1, b="b"),
        # dc=DC(a=1, b="b", c=FlyteFile("s3://my-s3-bucket/a/example.txt")),
        list_int=[1, 2, 3],
    )
    return m

@task(container_image=image)
def t2(m: MyModel) -> MyModel:
    # print(type(m.ct.ff))
    print("@@@ running t2")
    print(m.ct.ff)
    with open(m.ct.ff, "r") as f:
        print(f"Local FlyteFile {m.ct.ff.path}: {f.read()}")
    with open(m.ff, "r") as f:
        print(f"Remote FlyteFile {m.ff.path}: {f.read()}")
    print(pd.read_csv(m.ff_csv))

    print(f"Local FlyteDirectory {m.ct.fd}: {os.listdir(m.ct.fd)}")
    print(f"Remote FlyteDirectory {m.fd}: {os.listdir(m.fd)}")

    print(f"Local FlyteSchema: {m.fsc.open().all()}")

    print(f"Local StructuredDataset: {m.sd.open(pd.DataFrame).all()}")
    print(f"Local list[int]: {m.list_int}")
    return m

@workflow
def wf() -> MyModel:
    m = t1()
    return t2(m=m)

if __name__ == "__main__":
    print(wf())
    # print(t1())

Setup process

Screenshots

Check all the applicable boxes

  • I updated the documentation accordingly.
  • All new and existing tests passed.
  • All commits are signed-off.

Related PRs

Docs link

@Future-Outlier Future-Outlier changed the title Pydantic Plugin V2 [WIP] Pydantic Plugin V2 Jul 10, 2024
@Future-Outlier Future-Outlier marked this pull request as draft July 10, 2024 23:03

Future-Outlier commented Jul 15, 2024

pydantic v1

  Scenario                                Local  Remote
  Dataclass in BaseModel                  O      O
  FlyteType in BaseModel                  O      O
  FlyteType in BaseModel with BaseModel   O      O
  FlyteType in Dataclass with BaseModel   X      X

@Future-Outlier

Note: LangChain writes code like this to import Pydantic.

try:
    from pydantic.v1 import *  # noqa: F403
except ImportError:
    from pydantic import *  # type: ignore # noqa: F403

https://github.com/langchain-ai/langchain/blob/master/libs/core/langchain_core/pydantic_v1/__init__.py#L14-L18

@Future-Outlier

I have 2 ways to serialize/deserialize Flyte types now:

  1. write a recursive function that serializes and deserializes every FlyteType in the class
  2. pass an encoder (or something similar) to tell Pydantic how to serialize FlyteTypes, and do the same when deserializing.
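A rough sketch of approach 1, using only the standard library. The single-key `{"path": ...}` marker convention here is made up for illustration; it is not flytekit's actual wire format:

```python
import json
from typing import Any

def resolve_flyte_like(obj: Any) -> Any:
    """Recursively walk parsed JSON and rebuild wrapper objects from marker dicts."""
    if isinstance(obj, dict):
        if set(obj) == {"path"}:
            # Stand-in for constructing FlyteFile(path) from the marker dict
            return ("FlyteFile", obj["path"])
        return {k: resolve_flyte_like(v) for k, v in obj.items()}
    if isinstance(obj, list):
        return [resolve_flyte_like(v) for v in obj]
    return obj

payload = json.loads('{"ff": {"path": "s3://b/a.txt"}, "nested": [{"path": "/tmp/x"}]}')
resolved = resolve_flyte_like(payload)
```

Approach 2 moves the same logic into Pydantic itself via serializer/validator hooks, which avoids walking the tree by hand but ties the model classes to flytekit.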

@Future-Outlier

The old method of passing an encoder to BaseModel's `json()` method is no longer supported; we have to use field serializers instead.

https://github.com/pydantic/pydantic/blob/main/pydantic/main.py#L1097-L1127
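Concretely, what V1 did with `class Config: json_encoders = {...}` (or the removed `encoder=` argument of `.json()`) becomes a `@field_serializer` in V2. A minimal sketch with a datetime field:

```python
from datetime import datetime
from pydantic import BaseModel, field_serializer

class Event(BaseModel):
    when: datetime

    # V2 replacement for V1's `json_encoders` / `.json(encoder=...)`:
    @field_serializer("when")
    def _serialize_when(self, v: datetime) -> str:
        return v.strftime("%Y-%m-%d")

e = Event(when=datetime(2024, 7, 10))
out = e.model_dump_json()
```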

@Future-Outlier

reference: pydantic/pydantic#951


codecov bot commented Jul 15, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 93.68%. Comparing base (097e9e8) to head (3c5169d).
Report is 3 commits behind head on master.

Additional details and impacted files
@@             Coverage Diff             @@
##           master    #2577       +/-   ##
===========================================
+ Coverage   76.22%   93.68%   +17.45%     
===========================================
  Files         187       42      -145     
  Lines       18938     2263    -16675     
  Branches     3706        0     -3706     
===========================================
- Hits        14435     2120    -12315     
+ Misses       3870      143     -3727     
+ Partials      633        0      -633     

☔ View full report in Codecov by Sentry.
📢 Have feedback on the report? Share it here.

Signed-off-by: Future-Outlier <[email protected]>
if not ctx.file_access.is_remote(uri):
    return expected_python_type(uri, remote_directory=False)
return expected_python_type(uri)
Member Author

We have to do this to upload dir to remote storage.

Member Author

When Pydantic creates a FlyteDirectory, it calls to_python_value the first time the FlyteDirectory is initialized, which sets the remote_directory attribute to False.
When to_literal is later called, remote_directory=False means the upload flag is False, so the directory will not be uploaded.

Member Author

There's one more alternative to achieve this:
add a parameter in user space, but I think that would make the code messier.

Signed-off-by: Future-Outlier <[email protected]>
@Future-Outlier

Future-Outlier commented Jul 17, 2024

Pydantic V2 Example

from dataclasses import dataclass
import typing
import pandas as pd


from flytekit import task, ImageSpec, workflow, kwtypes
from flytekit.types.file import FlyteFile
from flytekit.types.schema import FlyteSchema
from typing import Dict, List, Optional
from flytekit.types.structured import StructuredDataset

from flytekit.types.directory import FlyteDirectory
import os
from pydantic import BaseModel

key = "19b4f2f26d2e0e57e6820c603f52678bf7d0ca4f"
flytekit_dev_version = f"https://github.com/flyteorg/flytekit.git@{key}"
pydantic_dev_version = f"git+https://github.com/flyteorg/flytekit.git@{key}#subdirectory=plugins/flytekit-pydantic"

image = ImageSpec(
    registry="localhost:30000",
    # registry="futureoutlier",
    apt_packages=["git"],
    packages=[
        pydantic_dev_version,
        f"git+{flytekit_dev_version}",
        "pandas",
    ],
    builder="default",
    # builder="envd",
)

@dataclass
class DC:
    a: int
    b: str
    c: Optional[FlyteFile] = None

TestSchema = FlyteSchema[kwtypes(some_str=str)]

CsvFile = FlyteFile[typing.TypeVar("csv")]
SvgDir = FlyteDirectory[typing.TypeVar("svg")]

class CustomType(BaseModel):
    """Custom type that stores the field it was used in."""

    value: int
    field_name: str
    ff: Optional[FlyteFile] = None
    fd: Optional[FlyteDirectory] = None
    list_sd: Optional[List[StructuredDataset]] = None


    def __repr__(self):
        return f"CustomType<{self.value} {self.field_name!r}>"

# Note: we use `__get_pydantic_core_schema__` to construct the schema
class MyModel(BaseModel):
    ct: CustomType
    ff: Optional[FlyteFile] = None
    ff_csv: Optional[CsvFile] = None
    fd: Optional[SvgDir] = None
    fsc: Optional[TestSchema] = None
    sd: Optional[StructuredDataset] = None
    dc: Optional[DC] = None
    list_int: Optional[List[int]] = []
    dict_ff: Optional[Dict[str, FlyteFile]] = None

@task(container_image=image)
def t1() -> MyModel:
    # Create a local directory
    dir_path = "./build"
    os.makedirs(dir_path, exist_ok=True)
    file_path = os.path.join(dir_path, "flytedir_example.txt")
    with open(file_path, "w") as f:
        f.write("FlyteDirectory content")

    # write a txt file
    file_path = "./build/local_example.txt"
    os.makedirs(os.path.dirname(file_path), exist_ok=True)
    with open(file_path, "w") as f:
        f.write("Default content")
    # write a csv file
    data = {
        'Column1': ['Row1-Col1', 'Row2-Col1', 'Row3-Col1'],
        'Column2': ['Row1-Col2', 'Row2-Col2', 'Row3-Col2'],
        'Column3': ['Row1-Col3', 'Row2-Col3', 'Row3-Col3'],
    }

    # Create DataFrame
    df = pd.DataFrame(data)

    # Save DataFrame to CSV
    csv_path = './build/example.csv'
    df.to_csv(csv_path, index=False)

    # For FlyteSchema
    schema = TestSchema()
    df = pd.DataFrame(data={"some_str": ["a", "b", "c"]})
    opened_schema = schema.open()
    opened_schema.write(df)

    # StructuredDataset
    df = pd.DataFrame({"name": ["Tom", "Joseph"], "age": [20, 22]})


    m = MyModel(
        ct=CustomType(
            value=1,
            field_name="my_field",
            ff=FlyteFile(file_path),
            # fd=FlyteDirectory(path=flytekit.current_context().working_directory)
            fd=FlyteDirectory(dir_path),
            list_sd=[StructuredDataset(dataframe=df), StructuredDataset(dataframe=df)]
        ),
        ff=FlyteFile("s3://my-s3-bucket/a/example.txt"),
        ff_csv=CsvFile(csv_path),
        fd=SvgDir("s3://my-s3-bucket/a/"),
        fsc=schema,
        sd=StructuredDataset(dataframe=df),
        dc=DC(a=1, b="b"),
        list_int=[1, 2, 3],
        dict_ff={"a": FlyteFile("s3://my-s3-bucket/a/example.txt"), "b": FlyteFile(file_path)}
    )
    return m

@task(container_image=image)
def t2(m: MyModel) -> MyModel:
    # print(type(m.ct.ff))
    print("@@@ running t2")
    print(m.ct.ff)
    with open(m.ct.ff, "r") as f:
        print(f"Local FlyteFile {m.ct.ff.path}: {f.read()}")
    with open(m.ff, "r") as f:
        print(f"Remote FlyteFile {m.ff.path}: {f.read()}")
    print(pd.read_csv(m.ff_csv))

    with open(m.dict_ff["a"], "r") as f:
        print(f"Dict Local FlyteFile {m.ct.ff.path}: {f.read()}")


    print(f"Local FlyteDirectory {m.ct.fd}: {os.listdir(m.ct.fd)}")
    print(f"Remote FlyteDirectory {m.fd}: {os.listdir(m.fd)}")

    print(f"Local FlyteSchema:\n {m.fsc.open().all()}")

    print(f"Local StructuredDataset: \n{m.sd.open(pd.DataFrame).all()}")
    print(f"Local StructuredDataset: \n{m.ct.list_sd[0].open(pd.DataFrame).all()}")

    print(f"Local list[int]: {m.list_int}")

    return m

@task(container_image=image)
def t3(m: MyModel) -> MyModel:
    print("@@@ running t3")
    print(m.ct.ff)
    with open(m.ct.ff, "r") as f:
        print(f"Local FlyteFile {m.ct.ff.path}: {f.read()}")
    with open(m.ff, "r") as f:
        print(f"Remote FlyteFile {m.ff.path}: {f.read()}")
    print(pd.read_csv(m.ff_csv))

    with open(m.dict_ff["a"], "r") as f:
        print(f"Dict Local FlyteFile {m.ct.ff.path}: {f.read()}")


    print(f"Local FlyteDirectory {m.ct.fd}: {os.listdir(m.ct.fd)}")
    print(f"Remote FlyteDirectory {m.fd}: {os.listdir(m.fd)}")

    print(f"Local FlyteSchema:\n {m.fsc.open().all()}")

    print(f"Local StructuredDataset: \n{m.sd.open(pd.DataFrame).all()}")
    print(f"Local StructuredDataset: \n{m.ct.list_sd[0].open(pd.DataFrame).all()}")

    print(f"Local list[int]: {m.list_int}")

    return m

@workflow
def wf() -> typing.Tuple[MyModel, MyModel]:
    m = t1()
    return t2(m=m), t3(m=m)

if __name__ == "__main__":
    wf()

Pydantic V1 Example

  1. uncomment this line to override FlyteType's validator method
    https://github.com/flyteorg/flytekit/pull/2577/files#diff-f1dab760361040b570e0b633cf1706a0e8039e7a7b2ab3d9aa68533b03af27c7R4

  2. run the example below

from dataclasses import dataclass
import typing
import pandas as pd


from flytekit import task, ImageSpec, workflow, kwtypes
from flytekit.types.file import FlyteFile
from flytekit.types.schema import FlyteSchema
from typing import Dict, List, Optional
from flytekit.types.structured import StructuredDataset

from flytekit.types.directory import FlyteDirectory
import os
from pydantic.v1 import BaseModel

key = "19b4f2f26d2e0e57e6820c603f52678bf7d0ca4f"
flytekit_dev_version = f"https://github.com/flyteorg/flytekit.git@{key}"
pydantic_dev_version = f"git+https://github.com/flyteorg/flytekit.git@{key}#subdirectory=plugins/flytekit-pydantic"

image = ImageSpec(
    registry="localhost:30000",
    # registry="futureoutlier",
    apt_packages=["git"],
    packages=[
        pydantic_dev_version,
        f"git+{flytekit_dev_version}",
        "pandas",
    ],
    builder="default",
    # builder="envd",
)

@dataclass
class DC:
    a: int
    b: str
    c: Optional[FlyteFile] = None

TestSchema = FlyteSchema[kwtypes(some_str=str)]

CsvFile = FlyteFile[typing.TypeVar("csv")]
SvgDir = FlyteDirectory[typing.TypeVar("svg")]

class CustomType(BaseModel):
    """Custom type that stores the field it was used in."""

    value: int
    field_name: str
    ff: Optional[FlyteFile] = None
    fd: Optional[FlyteDirectory] = None
    list_sd: Optional[List[StructuredDataset]] = None


    def __repr__(self):
        return f"CustomType<{self.value} {self.field_name!r}>"

# Note: we use `__get_pydantic_core_schema__` to construct the schema
class MyModel(BaseModel):
    ct: CustomType
    ff: Optional[FlyteFile] = None
    ff_csv: Optional[CsvFile] = None
    fd: Optional[SvgDir] = None
    fsc: Optional[TestSchema] = None
    sd: Optional[StructuredDataset] = None
    dc: Optional[DC] = None
    list_int: Optional[List[int]] = []
    dict_ff: Optional[Dict[str, FlyteFile]] = None

@task(container_image=image)
def t1() -> MyModel:
    # Create a local directory
    dir_path = "./build"
    os.makedirs(dir_path, exist_ok=True)
    file_path = os.path.join(dir_path, "flytedir_example.txt")
    with open(file_path, "w") as f:
        f.write("FlyteDirectory content")

    # write a txt file
    file_path = "./build/local_example.txt"
    os.makedirs(os.path.dirname(file_path), exist_ok=True)
    with open(file_path, "w") as f:
        f.write("Default content")
    # write a csv file
    data = {
        'Column1': ['Row1-Col1', 'Row2-Col1', 'Row3-Col1'],
        'Column2': ['Row1-Col2', 'Row2-Col2', 'Row3-Col2'],
        'Column3': ['Row1-Col3', 'Row2-Col3', 'Row3-Col3'],
    }

    # Create DataFrame
    df = pd.DataFrame(data)

    # Save DataFrame to CSV
    csv_path = './build/example.csv'
    df.to_csv(csv_path, index=False)

    # For FlyteSchema
    schema = TestSchema()
    df = pd.DataFrame(data={"some_str": ["a", "b", "c"]})
    opened_schema = schema.open()
    opened_schema.write(df)

    # StructuredDataset
    df = pd.DataFrame({"name": ["Tom", "Joseph"], "age": [20, 22]})


    m = MyModel(
        ct=CustomType(
            value=1,
            field_name="my_field",
            ff=FlyteFile(file_path),
            # fd=FlyteDirectory(path=flytekit.current_context().working_directory)
            fd=FlyteDirectory(dir_path),
            list_sd=[StructuredDataset(dataframe=df), StructuredDataset(dataframe=df)]
        ),
        ff=FlyteFile("s3://my-s3-bucket/a/example.txt"),
        ff_csv=CsvFile(csv_path),
        fd=SvgDir("s3://my-s3-bucket/a/"),
        fsc=schema,
        sd=StructuredDataset(dataframe=df),
        dc=DC(a=1, b="b"),
        list_int=[1, 2, 3],
        dict_ff={"a": FlyteFile("s3://my-s3-bucket/a/example.txt"), "b": FlyteFile(file_path)}
    )
    return m

@task(container_image=image)
def t2(m: MyModel) -> MyModel:
    # print(type(m.ct.ff))
    print("@@@ running t2")
    print(m.ct.ff)
    with open(m.ct.ff, "r") as f:
        print(f"Local FlyteFile {m.ct.ff.path}: {f.read()}")
    with open(m.ff, "r") as f:
        print(f"Remote FlyteFile {m.ff.path}: {f.read()}")
    print(pd.read_csv(m.ff_csv))

    with open(m.dict_ff["a"], "r") as f:
        print(f"Dict Local FlyteFile {m.ct.ff.path}: {f.read()}")


    print(f"Local FlyteDirectory {m.ct.fd}: {os.listdir(m.ct.fd)}")
    print(f"Remote FlyteDirectory {m.fd}: {os.listdir(m.fd)}")

    print(f"Local FlyteSchema:\n {m.fsc.open().all()}")

    print(f"Local StructuredDataset: \n{m.sd.open(pd.DataFrame).all()}")
    print(f"Local StructuredDataset: \n{m.ct.list_sd[0].open(pd.DataFrame).all()}")

    print(f"Local list[int]: {m.list_int}")

    return m

@task(container_image=image)
def t3(m: MyModel) -> MyModel:
    print("@@@ running t3")
    print(m.ct.ff)
    with open(m.ct.ff, "r") as f:
        print(f"Local FlyteFile {m.ct.ff.path}: {f.read()}")
    with open(m.ff, "r") as f:
        print(f"Remote FlyteFile {m.ff.path}: {f.read()}")
    print(pd.read_csv(m.ff_csv))

    with open(m.dict_ff["a"], "r") as f:
        print(f"Dict Local FlyteFile {m.ct.ff.path}: {f.read()}")


    print(f"Local FlyteDirectory {m.ct.fd}: {os.listdir(m.ct.fd)}")
    print(f"Remote FlyteDirectory {m.fd}: {os.listdir(m.fd)}")

    print(f"Local FlyteSchema:\n {m.fsc.open().all()}")

    print(f"Local StructuredDataset: \n{m.sd.open(pd.DataFrame).all()}")
    print(f"Local StructuredDataset: \n{m.ct.list_sd[0].open(pd.DataFrame).all()}")

    print(f"Local list[int]: {m.list_int}")

    return m

@workflow
def wf() -> typing.Tuple[MyModel, MyModel]:
    m = t1()
    return t2(m=m), t3(m=m)

if __name__ == "__main__":
    wf()

@Future-Outlier

Waiting for the Pydantic community to reply on how to support both Pydantic v1 and v2.
pydantic/pydantic#9919

Future-Outlier and others added 26 commits July 22, 2024 13:36
Signed-off-by: Future-Outlier <[email protected]>
* Add an exception when a filter's value isn't a list

* Make the exception more specific

Signed-off-by: Nelson Chen <[email protected]>

* add a unit test for value_in

Signed-off-by: Nelson Chen <[email protected]>

* lint

Signed-off-by: Kevin Su <[email protected]>

---------

Signed-off-by: Nelson Chen <[email protected]>
Signed-off-by: Kevin Su <[email protected]>
Co-authored-by: Kevin Su <[email protected]>
Signed-off-by: wayner0628 <[email protected]>
Signed-off-by: Kevin Su <[email protected]>
Co-authored-by: Kevin Su <[email protected]>
* Show the difference of types in dataclass when transforming error

Signed-off-by: Future-Outlier <[email protected]>

* add tests for dataclass

Signed-off-by: Future-Outlier <[email protected]>

* fix tests

Signed-off-by: Future-Outlier <[email protected]>

---------

Signed-off-by: Future-Outlier <[email protected]>