Is your feature request related to a problem? Please describe.
Model stacking is widely used in Kaggle competitions as an effective technique to boost model performance. However, PyTorch Tabular currently does not support stacking out of the box. As a Kaggler, I believe adding this feature would enhance the library's usability, making stacking more accessible and streamlined for users.
Describe the solution you'd like
I propose creating a Config class and a Model class specifically for the stacking model. Through the Config class, users will have the flexibility to customize the stacking process, including specifying which models to stack and their configurations. I am actively working on a pull request to implement this feature and plan to push the changes soon.
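As a rough illustration of what such a Config class could look like, here is a minimal sketch. The names (`StackingModelConfig`, `model_configs`, `head_dim`) are hypothetical placeholders, not the library's actual API:

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch of the proposed config class; field names are
# illustrative assumptions, not PyTorch Tabular's real interface.
@dataclass
class StackingModelConfig:
    # Configs (or names) of the backbone models to stack,
    # e.g. CategoryEmbedding, FTTransformer, TabTransformer
    model_configs: List[object] = field(default_factory=list)
    # Output dimension of the linear head applied to the
    # concatenated backbone outputs
    head_dim: int = 64

# Example: stack two backbones, customized entirely through the config
cfg = StackingModelConfig(model_configs=["CategoryEmbedding", "FTTransformer"])
```

The point of the config-driven design is that users choose which models to stack declaratively, without writing stacking code themselves.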
Describe alternatives you've considered
A custom implementation of stacking is possible; however, it requires time and effort to design and implement. This is why I believe adding this feature directly to the library is a better and more efficient solution.
Additional Context
To aid visual understanding, I have included a diagram illustrating the proposed stacking mechanism. It shows how the outputs of different model backbones, such as CategoryEmbedding, FTTransformer, and TabTransformer, can be concatenated and passed to a linear layer for the final predictions.
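The mechanism in the diagram can be sketched in a few lines of PyTorch. This is illustrative only, assuming each backbone exposes its output width via `out_features`; the plain `nn.Linear` modules stand in for real backbones like CategoryEmbedding or FTTransformer:

```python
import torch
import torch.nn as nn

class StackingHead(nn.Module):
    """Sketch of the proposed stacking: concatenate backbone outputs,
    then apply a single linear layer for the final prediction."""

    def __init__(self, backbones, out_dim):
        super().__init__()
        self.backbones = nn.ModuleList(backbones)
        # The linear head takes the concatenation of all backbone outputs
        concat_dim = sum(b.out_features for b in backbones)
        self.head = nn.Linear(concat_dim, out_dim)

    def forward(self, x):
        # Run every backbone on the same input and concatenate on the
        # feature dimension, as in the diagram
        feats = torch.cat([b(x) for b in self.backbones], dim=-1)
        return self.head(feats)

# Stand-ins for real tabular backbones (hypothetical shapes)
backbones = [nn.Linear(10, 16), nn.Linear(10, 32)]
model = StackingHead(backbones, out_dim=1)
out = model(torch.randn(4, 10))
```

Here `out` has shape `(4, 1)`: one prediction per row, produced from the 16 + 32 = 48 concatenated backbone features.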
taimo3810 changed the title from "Add Model Stacking Support to PyTorch Tabular" to "Add Built-in Support for Model Stacking in PyTorch Tabular" on Dec 7, 2024.