[ICCV'23] Size Does Matter: Size-aware Virtual Try-on via Clothing-oriented Transformation Try-on Network

Chieh-Yun Chen (1,2), Yi-Chung Chen (1,3), Hong-Han Shuai (2), Wen-Huang Cheng (3)
(1) Stylins.ai  (2) National Yang Ming Chiao Tung University  (3) National Taiwan University

Official PyTorch implementation [Paper][Supplement]


Abstract: Virtual try-on aims at synthesizing realistic results by trying target clothes on humans. Most previous works relied on Thin Plate Spline (TPS) warping or the prediction of appearance flows to warp clothes to fit human body shapes. However, neither approach can handle complex warping, leading to over-distortion or misalignment. Furthermore, adjusting clothing sizes for try-on is a critical yet unaddressed challenge. To tackle these issues, we propose a Clothing-Oriented Transformation Try-On Network (COTTON). COTTON leverages clothing structure, i.e., landmarks and segmentation, to design a novel landmark-guided transformation that precisely deforms clothes and allows for size adjustment during try-on. Additionally, to remove the clothing region from the human image without losing significant human characteristics, we propose a clothing elimination policy based on both the transformed clothes and the human segmentation. This method enables users to try on clothes tucked in or untucked while retaining more human characteristics. Both qualitative and quantitative results show that COTTON outperforms state-of-the-art high-resolution virtual try-on approaches.
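The following is a minimal, illustrative sketch of the general idea behind a landmark-guided clothing transformation with size adjustment: fit a transformation from landmarks on the in-shop garment to landmarks predicted on the target body (optionally rescaled to change the apparent size), then resample the garment image. This is not the official COTTON code; the transformation in this repository is learned and far more expressive than the simple least-squares affine fit used here, and all function names, tensor shapes, and landmark conventions below are assumptions for illustration only.

```python
import torch
import torch.nn.functional as F


def fit_affine(src_pts: torch.Tensor, dst_pts: torch.Tensor) -> torch.Tensor:
    """Least-squares 2x3 affine matrix A such that A @ [x, y, 1]^T maps src -> dst.

    src_pts, dst_pts: (K, 2) landmark coordinates in normalized [-1, 1] space.
    """
    k = src_pts.shape[0]
    ones = torch.ones(k, 1, dtype=src_pts.dtype, device=src_pts.device)
    src_h = torch.cat([src_pts, ones], dim=1)            # (K, 3) homogeneous coords
    # Solve src_h @ X = dst_pts in the least-squares sense; A = X^T.
    sol = torch.linalg.lstsq(src_h, dst_pts).solution    # (3, 2)
    return sol.T                                         # (2, 3)


def warp_clothing(cloth: torch.Tensor,
                  cloth_landmarks: torch.Tensor,
                  body_landmarks: torch.Tensor,
                  size_scale: float = 1.0) -> torch.Tensor:
    """Warp a garment image so its landmarks align with body landmarks.

    cloth:           (1, 3, H, W) in-shop clothing image.
    cloth_landmarks: (K, 2) landmarks on the flat garment, in [-1, 1].
    body_landmarks:  (K, 2) corresponding landmarks predicted on the person.
    size_scale:      >1 renders the garment "larger", <1 "smaller" (a toy
                     stand-in for size-aware try-on).
    """
    # Scale the target landmarks about their centroid to mimic a size change.
    center = body_landmarks.mean(dim=0, keepdim=True)
    target = center + size_scale * (body_landmarks - center)

    # grid_sample pulls pixels from the source image, so the grid must map
    # output (body) coordinates back to input (cloth) coordinates.
    theta = fit_affine(target, cloth_landmarks).unsqueeze(0)   # (1, 2, 3)
    grid = F.affine_grid(theta, list(cloth.shape), align_corners=False)
    return F.grid_sample(cloth, grid, align_corners=False)
```

For example, `warp_clothing(cloth, lm_cloth, lm_body, size_scale=1.2)` would place the garment roughly one size up before it is composited onto the person; the real pipeline replaces the affine fit with the learned clothing-oriented transformation described in the paper.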

Implementation

Please see ./code for more implementation details.

Multi-garment try-on results


Multi-size try-on results


Visual comparison with state-of-the-art virtual try-on methods

  • Preserving human characteristics, e.g., tattoos

    Thanks to the proposed Clothing Elimination Policy, COTTON preserves human characteristics such as tattoos; a toy sketch of such a policy is given after this list.

  • Preserving clothing characteristics, e.g., the neckline

    Our proposed Clothing Segmentation Network properly segments the clothing region around the neckline that is occluded when the garment is worn. This helps COTTON produce the correct neckline type in its try-on results, whereas the baselines all introduce undesired noise around the neckline in their final syntheses.
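
Below is a rough, non-official sketch of how a clothing elimination policy of this kind can be realized: the region to be re-synthesized is the union of the clothing classes in the human parsing map and the footprint of the transformed garment, so everything outside that region (skin, tattoos, a tucked-in or untucked bottom) is carried through untouched. The label indices, tensor shapes, and fill value are assumptions, not the repository's API.

```python
import torch

UPPER_CLOTH_LABELS = (5, 6, 7)  # hypothetical human-parsing labels for upper clothes


def eliminate_clothing(person: torch.Tensor,
                       parsing: torch.Tensor,
                       warped_cloth_mask: torch.Tensor) -> torch.Tensor:
    """Blank out the clothing region of a person image before try-on synthesis.

    person:            (1, 3, H, W) person image in [0, 1].
    parsing:           (1, H, W) integer human-parsing map.
    warped_cloth_mask: (1, 1, H, W) binary mask of the transformed garment.
    """
    # Pixels labeled as clothing in the human parsing map...
    cloth_in_parsing = torch.zeros_like(parsing, dtype=torch.bool)
    for label in UPPER_CLOTH_LABELS:
        cloth_in_parsing |= parsing == label
    cloth_in_parsing = cloth_in_parsing.unsqueeze(1)        # (1, 1, H, W)

    # ...unioned with the footprint of the warped target garment.
    eliminate = cloth_in_parsing | (warped_cloth_mask > 0.5)

    # Grey out only the eliminated region; all other characteristics survive.
    return person * (~eliminate) + 0.5 * eliminate
```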

Citation

@InProceedings{Chen_2023_ICCV,
    author    = {Chen, Chieh-Yun and Chen, Yi-Chung and Shuai, Hong-Han and Cheng, Wen-Huang},
    title     = {Size Does Matter: Size-aware Virtual Try-on via Clothing-oriented Transformation Try-on Network},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
    month     = {October},
    year      = {2023},
    pages     = {7513-7522}
}
