
Block Sparsity not working for Squeezable Pointwise Convolution Layers #38

Closed
s36srini opened this issue Jul 22, 2019 · 2 comments

@s36srini

I get the following error: ValueError: Block Sparsity can only be used for layers which have 2-dimensional weights. Checking the source code, I found this comment:
# TODO(pulkitb): Check if squeeze operations should now be removed since we are only accepting 2-D weights.

How is pruning going to work with pointwise convolutional layers, whose weights have shape 1x1xiCxoC, if block sparsity is only supported for 2-D tensors?
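For illustration, here is a rough sketch (not the tfmot implementation) of the idea behind the question: a 1x1 pointwise kernel of shape (1, 1, iC, oC) can be squeezed to a 2-D (iC, oC) matrix, block-pruned there, and reshaped back. The shapes, 2x2 block size, and 50% sparsity target below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
kernel = rng.standard_normal((1, 1, 4, 8))  # 1 x 1 x iC x oC

# Squeeze the two singleton spatial dims -> a 2-D (iC, oC) matrix.
weights_2d = np.squeeze(kernel, axis=(0, 1))  # shape (4, 8)

# Block-sparsity-style pruning with 2x2 blocks: keep the half of the
# blocks with the largest mean absolute value, zero the rest.
bh, bw = 2, 2
blocks = weights_2d.reshape(4 // bh, bh, 8 // bw, bw)
block_norms = np.abs(blocks).mean(axis=(1, 3))      # (2, 4) block scores
threshold = np.quantile(block_norms, 0.5)
mask = (block_norms > threshold)[:, None, :, None]  # broadcast over blocks
pruned_2d = (blocks * mask).reshape(4, 8)

# Expand back to the 4-D layout the conv layer expects.
pruned_kernel = pruned_2d.reshape(1, 1, 4, 8)
```

Whether tfmot should perform this squeeze internally is exactly what the TODO above is asking.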

@alanchiao added the technique:pruning label (Regarding tfmot.sparsity.keras APIs and docs) on Feb 6, 2020, and the feature request label on Feb 27, 2020.
@teijeong
Contributor

Hi @liyunlu0618, can you update?

@liyunlu0618
Contributor

Duplicate of #634.
Will use that to track the feature request.
