[TOPI/TEST] Add Testcase folder for TOPI #225
Merged
Conversation
@tqchen Got it
Would be nice though to cover the testcase by having a function that takes in the workload configuration, and to test a few workloads (different input/filter sizes).
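The parameterized-workload idea in this comment can be sketched as follows. This is an illustrative sketch only, not the actual TOPI test: all function names are hypothetical, and a naive pure-Python depthwise conv2d stands in for the TVM compute and schedule so the pattern is self-contained.

```python
# Hypothetical sketch of a workload-parameterized correctness check.
# A real TOPI test would compile the op with TVM and compare against
# a reference; here a naive pure-Python depthwise conv2d (stride 1,
# no padding) plays both roles to keep the example runnable.

def depthwise_conv2d_ref(data, kernel):
    """Naive depthwise conv2d reference.

    data:   nested list [channels][h][w]
    kernel: nested list [channels][kh][kw]
    """
    channels = len(data)
    h, w = len(data[0]), len(data[0][0])
    kh, kw = len(kernel[0]), len(kernel[0][0])
    oh, ow = h - kh + 1, w - kw + 1
    out = [[[0.0] * ow for _ in range(oh)] for _ in range(channels)]
    for c in range(channels):
        for i in range(oh):
            for j in range(ow):
                acc = 0.0
                for di in range(kh):
                    for dj in range(kw):
                        acc += data[c][i + di][j + dj] * kernel[c][di][dj]
                out[c][i][j] = acc
    return out

def check_depthwise_workload(channels, size, ksize):
    """Build one deterministic workload and verify the output."""
    data = [[[float(c + i + j) for j in range(size)] for i in range(size)]
            for c in range(channels)]
    kernel = [[[1.0] * ksize for _ in range(ksize)] for _ in range(channels)]
    out = depthwise_conv2d_ref(data, kernel)
    # With an all-ones kernel, each output element is the window sum.
    expected00 = sum(data[0][i][j] for i in range(ksize) for j in range(ksize))
    assert abs(out[0][0][0] - expected00) < 1e-6
    return out

# Exercise several workloads with different filter sizes, as suggested.
for ksize in (1, 3, 5):
    check_depthwise_workload(channels=2, size=8, ksize=ksize)
```

The point of the pattern is that one checker function takes the workload configuration as arguments, so adding a new filter size is a one-line change to the loop at the bottom.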
tqchen pushed a commit to tqchen/tvm that referenced this pull request on May 26, 2018
tqchen pushed a commit that referenced this pull request on May 29, 2018
tqchen pushed a commit to tqchen/tvm that referenced this pull request on Jul 6, 2018
sergei-mironov pushed a commit to sergei-mironov/tvm that referenced this pull request on Aug 8, 2018
vinx13 pushed a commit to vinx13/tvm that referenced this pull request on Mar 9, 2022
tqchen pushed a commit to tqchen/tvm that referenced this pull request on May 25, 2023
FuncCopier now provides a var_map member recording the relax var mapping from the old function to the new function. FuncCopier calls SymbolicVarRenewMutator, which also renews relax Vars; previously this second mapping was not taken into account, so FuncCopier exposed a wrong var_map. That caused bugs in the Gradient pass, which uses this member. This PR merges the var mappings from FuncCopier and SymbolicVarRenewMutator.
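The fix described in this commit message amounts to composing two mappings rather than exposing only the first. A minimal sketch, with plain strings standing in for relax Vars and hypothetical names for both maps:

```python
# Hypothetical sketch: FuncCopier yields a map old -> copied, then
# SymbolicVarRenewMutator renews some copied vars (copied -> renewed).
# The exposed var_map must compose the two so each old var maps
# directly to its final renewed var.

def merge_var_maps(copier_map, renew_map):
    """Compose old->copied with copied->renewed; vars the mutator
    did not renew keep their copied identity."""
    return {old: renew_map.get(copied, copied)
            for old, copied in copier_map.items()}

copier_map = {"x": "x_copy", "y": "y_copy"}
renew_map = {"x_copy": "x_renewed"}  # only some vars are renewed
merged = merge_var_maps(copier_map, renew_map)
```

Without the composition step, a consumer such as the Gradient pass would look up "x" and get the stale "x_copy" instead of the var actually present in the new function.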
tqchen pushed a commit to tqchen/tvm that referenced this pull request on May 25, 2023
zxybazh pushed a commit to zxybazh/tvm that referenced this pull request on May 25, 2023
Lunderberg pushed a commit to Lunderberg/tvm that referenced this pull request on Jun 12, 2023
Hzfengsy pushed a commit to Hzfengsy/tvm that referenced this pull request on Jun 29, 2023
Ubospica added a commit to Ubospica/tvm-develop that referenced this pull request on Jul 4, 2023
MasterJH5574 pushed a commit to MasterJH5574/tvm that referenced this pull request on Aug 17, 2025
junrushao added a commit to junrushao/tvm that referenced this pull request on Nov 5, 2025
Upstream : https://github.com/apache/tvm-ffi.git
Branch   : main
New HEAD : 75c2a2bb7b8d367f33ade1a1b4f9f14212fc080f
Subject  : [ADDON] Improved github action for torch-c-dlpack-ext (apache#225)
Author   : Yaxing Cai <[email protected]>
Date     : 2025-11-04T20:45:03-08:00
Delta    : 1 commit(s) since a5241e5e5edf
Compare  : apache/tvm-ffi@a5241e5...75c2a2b

This commit updates the tvm-ffi submodule to the latest upstream HEAD.
junrushao added a commit to junrushao/tvm that referenced this pull request on Nov 5, 2025
@Huyuwei let us add a testcase for the depthwise-related APIs here. The difference from recipes is that we will only test correctness and Python 2/Python 3 compatibility under these conditions, but not speed (to save the CI server's time).
@sxjscience