Abs layer support #6475
Conversation
@@ -42,7 +42,8 @@ CNNLayer::Ptr ActivationLayerCreator::CreateLayer(pugi::xml_node& node, LayerPar
     {"elu", std::make_shared<LayerCreator<CNNLayer>>("ELU")},
     {"sigmoid", std::make_shared<LayerCreator<CNNLayer>>("Sigmoid")},
     {"tanh", std::make_shared<LayerCreator<CNNLayer>>("TanH")},
-    {"not", std::make_shared<LayerCreator<CNNLayer>>("LogicalNot")}
+    {"not", std::make_shared<LayerCreator<CNNLayer>>("LogicalNot")},
+    {"abs", std::make_shared<LayerCreator<CNNLayer>>("Abs")}
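The map entries above follow a plain string-to-factory registration pattern. A minimal self-contained sketch of that pattern (`SimpleLayer` and `createActivation` are illustrative stand-ins, not the actual Inference Engine types):

```cpp
#include <map>
#include <memory>
#include <string>

// Illustrative stand-in for the Inference Engine CNNLayer; only the type
// name matters for this sketch.
struct SimpleLayer {
    std::string type;
};

// Hypothetical sketch of the string-to-factory dispatch used by
// ActivationLayerCreator::CreateLayer: the XML "type" attribute selects
// which layer type to instantiate.
std::shared_ptr<SimpleLayer> createActivation(const std::string& kind) {
    static const std::map<std::string, std::string> kActivations = {
        {"elu", "ELU"},
        {"sigmoid", "Sigmoid"},
        {"tanh", "TanH"},
        {"not", "LogicalNot"},
        {"abs", "Abs"},  // the entry this PR adds
    };
    auto it = kActivations.find(kind);
    if (it == kActivations.end()) {
        return nullptr;
    }
    auto layer = std::make_shared<SimpleLayer>();
    layer->type = it->second;
    return layer;
}
```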
Why do you use v7 parser?
No reason, removed.
@@ -1025,7 +1025,8 @@ class INFERENCE_ENGINE_INTERNAL_CNNLAYER_CLASS(EltwiseLayer): public CNNLayer {
     Logical_OR,
     Logical_XOR,
     Logical_NOT,
-    Mean
+    Mean,
+    Abs,
Looks like you don't need to add the Abs layer to the Eltwise enum, because in addSpecificCreator you create a basic CNNLayer.
Guess I need it in the parser:

void FrontEnd::parseAbs(const Model& model, const ie::CNNLayerPtr& layer, const DataVector& inputs, const DataVector& outputs) const {
    LayerParams params = {layer->name, "Eltwise", layer->precision};
    auto res = std::make_shared<InferenceEngine::EltwiseLayer>(params);
    res->_operation = InferenceEngine::EltwiseLayer::Abs;
    // ... (remainder of the function not quoted in the comment)
}
Looks like after these changes we can get two behaviors:
- Converting ngraph to ICNNNetwork creates a CNNLayer for the Abs layer.
- The Myriad frontend creates an EltwiseLayer.
I removed the changes in addSpecificCreator. So it is now processed in the Myriad frontend.
My concern was that I don't want to have two behaviors. Looks like after your changes all plugins which work with the legacy representation will treat the Abs layer as a CNNLayer, and only VPU will interpret it as an EltwiseLayer. Or am I wrong? Could you align the VPU behavior?
You are right.
But trying to align the VPU behavior with the other plugins led to failures in the GNA plugin, and processing the Abs layer a bit differently on VPU solved the issue. I think this is the minimum amount of code needed to support the operation without requiring changes in other plugins.
But if it's critical, I'll take another look at what I can do about it.
So, is it critical?
It is not critical, because it is a legacy part and some plugins have already migrated to the new API.
But it is strange that changes in the VPU plugin affect GNA 😄
Okay, then I'll leave this part as it is since it works.
Do you have any other comments?
@@ -459,6 +459,8 @@ InferenceEngine::details::CNNLayerCreator::CNNLayerCreator(const std::shared_ptr
         res->params["operation"] = "max";
     } else if (node->description() == "Minimum") {
         res->params["operation"] = "min";
+    } else if (node->description() == "Abs") {
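The branch above maps an ngraph node description to the lowercase "operation" parameter that the legacy Eltwise validator reads later. A hedged, self-contained sketch of that mapping (`operationParamFor` is an illustrative name, not the real API):

```cpp
#include <string>

// Sketch of the else-if chain in CNNLayerCreator: an ngraph node
// description string is translated into the lowercase "operation"
// parameter stored on the created layer.
std::string operationParamFor(const std::string& description) {
    if (description == "Maximum") return "max";
    if (description == "Minimum") return "min";
    if (description == "Abs")     return "abs";  // the branch this PR adds
    return "";  // descriptions not handled here are dealt with elsewhere in the real code
}
```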
Which specific creator do you use? This creator, or the creator for the Abs operation?
I use the creator for the Abs operation, but it seems to break something in the GNA plugin. Trying to fix the failures.
Force-pushed from 364ff07 to c4af353
@ilyachur could you please look once again?
Force-pushed from 0f6f80d to 691971b
@@ -449,6 +449,8 @@ void EltwiseValidator::parseParams(CNNLayer* layer) {
         casted->_operation = EltwiseLayer::Pow;
     } else if (op == "mean") {
         casted->_operation = EltwiseLayer::Mean;
+    } else if (op == "abs") {
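This validator branch translates the "operation" string back into the Eltwise enum value. A simplified, self-contained sketch (`EltwiseOp` and `parseEltwiseOp` are illustrative stand-ins for the real EltwiseLayer::eOperation handling):

```cpp
#include <stdexcept>
#include <string>

// Simplified stand-in for EltwiseLayer::eOperation, for illustration only.
enum class EltwiseOp { Pow, Mean, Abs };

// Hedged sketch of EltwiseValidator::parseParams: the "operation" string
// stored in the layer params is mapped to the enum value.
EltwiseOp parseEltwiseOp(const std::string& op) {
    if (op == "pow")  return EltwiseOp::Pow;
    if (op == "mean") return EltwiseOp::Mean;
    if (op == "abs")  return EltwiseOp::Abs;  // the branch this PR adds
    throw std::invalid_argument("Unsupported eltwise operation: " + op);
}
```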
I didn't see any changes in convert_function_to_cnn_network. Looks like the conversion won't create an Eltwise operation for the Abs layer. We always create a CNNLayer for Abs in the legacy representation.
Sure, thanks. Removed abs from here.
Common part LGTM
@AntonDudchenko could you please take a look?
Changes from the OV side LGTM
Force-pushed from f491792 to 9404523
Force-pushed from 9404523 to b17f983
* [MyriadX] Abs op support
* [MyriadX] Abs op support
* [MyriadX] Abs op support
No description provided.