This repository has been archived by the owner on Nov 17, 2023. It is now read-only.

Fix operators lying about their number of inputs #17049

Merged
merged 21 commits into apache:master on Jan 15, 2020

Conversation

ptrendx
Member

@ptrendx ptrendx commented Dec 11, 2019

Description

This PR fixes a number of backward operators that advertise the wrong number of inputs (either by passing a wrong value to set_num_inputs or by leaving it unset, which defaults to 1 input). It also introduces a check in MakeNode to catch this class of error.

Having a consistent number of inputs and outputs is important because tools like Monitor rely on them (currently some networks are impossible to inspect during the backward pass because of this), and they are needed for more advanced manipulation of the computational graph.

Checklist

Essentials

Please feel free to remove inapplicable items for your PR.

  • Changes are complete (i.e. I finished coding on this PR)

@szha
Member

szha commented Dec 17, 2019

how should we test for this systematically?

@ptrendx
Member Author

ptrendx commented Dec 18, 2019

I introduced a check for it in the MakeNode function, which should cover the vast majority of cases. There is, however, still a possibility of an escape: if you make a node without inputs and then add them manually in your FGradient, only an actual graph-validation pass would catch that (NNVM does not really provide any enforcement mechanism for this).

src/operator/nn/concat.cc — review comment (outdated, resolved)
@DickJC123
Contributor

Thanks for carrying through on this issue. I remember running into the problem and getting only as far as describing it in the incomplete PR #15834. I reviewed my implementation of the input check there, and I like that you have pushed the check down into MakeNode(). However, I notice that in my implementation I was protecting the routine MakeNonLossGradNode, which you don't similarly check because there MakeNode() is called with inputs=null. Could you review my commit above and add a check to MakeNonLossGradNode if you feel it enhances the protection?

Contributor

@DickJC123 DickJC123 left a comment


Thanks for adding the check to MakeNonLossGradNode(), which caught additional non-compliant operators. LGTM.

@DickJC123 DickJC123 merged commit bd67723 into apache:master Jan 15, 2020