# Doc Migration (master) (openvinotoolkit#1377)
* Doc Migration from Gitlab (openvinotoolkit#1289)
* doc migration
* fix
* Update FakeQuantize_1.md
* Update performance_benchmarks.md
* Updates graphs for FPGA
* Update performance_benchmarks.md
* Change DL Workbench structure (openvinotoolkit#1)
* Changed DL Workbench structure
* Fixed tags
* fixes
* Update ie_docs.xml
* Update performance_benchmarks_faq.md
* Fixes in DL Workbench layout
* Fixes for CVS-31290
* [DL Workbench] Minor correction
* Fix for CVS-30955
* Added nGraph deprecation notice as requested by Zoe
* fix broken links in api doxy layouts
* CVS-31131 fixes
* Additional fixes
* Fixed POT TOC
* Update PAC_Configure.md: PAC DCP 1.2.1 install guide
* Update inference_engine_intro.md
* fix broken link
* Update opset.md
* fix
* added opset4 to layout
* added new opsets to layout, set labels for them
* Update VisionAcceleratorFPGA_Configure.md: updated from 2020.3 to 2020.4

Co-authored-by: domi2000 <[email protected]>
Showing 2 changed files with 114 additions and 0 deletions.
# Deprecated API for CPU kernels creation {#openvino_docs_IE_DG_Extensibility_DG_deprecated_Factory}

The following APIs for kernel development are deprecated:

* `InferenceEngine::IExtension::getPrimitiveTypes(char**& types, unsigned int& size, ResponseDesc* resp)` method
* `InferenceEngine::IExtension::getFactoryFor(ILayerImplFactory *&factory, const CNNLayer *cnnLayer, ResponseDesc *resp)` method
* `InferenceEngine::ILayerImplFactory` class

> **NOTE**: This guide demonstrates how to use the deprecated API for kernel creation. Keep in mind that this API will be removed soon.

1. Create your custom layer factory `CustomLayerFactory` class:
```cpp
// custom_layer.h
// CustomLayerFactory produces an example layer that raises each input value
// to the power of 2 and does not change the input dimensions.
class CustomLayerFactory {
};
```
2. Inherit it from the abstract `InferenceEngine::ILayerImplFactory` class:
```cpp
// custom_layer.h
class CustomLayerFactory: public InferenceEngine::ILayerImplFactory {
};
```

3. Create a constructor, a virtual destructor, and a data member to keep the layer info:
```cpp
// custom_layer.h
class CustomLayerFactory: public InferenceEngine::ILayerImplFactory {
public:
    explicit CustomLayerFactory(const CNNLayer *layer): cnnLayer(*layer) {}
    virtual ~CustomLayerFactory() = default;

private:
    CNNLayer cnnLayer;
};
```
4. Overload and implement the abstract methods `getShapes` and `getImplementations` of the `InferenceEngine::ILayerImplFactory` class:
```cpp
// custom_layer.h
class CustomLayerFactory: public InferenceEngine::ILayerImplFactory {
public:
    // ... constructor and destructor

    StatusCode getShapes(const std::vector<TensorDesc>& inShapes, std::vector<TensorDesc>& outShapes, ResponseDesc *resp) noexcept override {
        // cnnLayer is stored by value (see step 3), so no null check is needed here.
        if (inShapes.size() != 1) {
            std::string errorMsg = "Incorrect input shapes!";
            errorMsg.copy(resp->msg, sizeof(resp->msg) - 1);
            return GENERAL_ERROR;
        }
        // The layer does not change dimensions, so the output shape repeats the input shape.
        outShapes.clear();
        outShapes.emplace_back(inShapes[0]);
        return OK;
    }

    StatusCode getImplementations(std::vector<ILayerImpl::Ptr>& impls, ResponseDesc *resp) noexcept override {
        // You can pass cnnLayer to the implementation if necessary.
        impls.push_back(ILayerImpl::Ptr(new CustomLayerImpl()));
        return OK;
    }
};
```
5. Create your custom layer implementation `CustomLayerImpl` class following the [instructions](../CPU_Kernel.md).
6. Implement methods in the `Extension` class:
```cpp
// custom_extension.h
class CustomExtension : public InferenceEngine::IExtension {
public:
    // ... utility methods

    // Returns the list of supported kernels/layers
    StatusCode getPrimitiveTypes(char**& types, unsigned int& size, ResponseDesc* resp) noexcept override {
        std::string type_name = "CustomLayer";
        types = new char *[1];
        size = 1;
        types[0] = new char[type_name.size() + 1];
        std::copy(type_name.begin(), type_name.end(), types[0]);
        types[0][type_name.size()] = '\0';
        return OK;
    }

    // Main function: returns the factory for the requested layer type
    StatusCode getFactoryFor(ILayerImplFactory *&factory, const CNNLayer *cnnLayer, ResponseDesc *resp) noexcept override {
        if (cnnLayer->type != "CustomLayer") {
            std::string errorMsg = std::string("Factory for ") + cnnLayer->type + " wasn't found!";
            errorMsg.copy(resp->msg, sizeof(resp->msg) - 1);
            return NOT_FOUND;
        }
        factory = new CustomLayerFactory(cnnLayer);
        return OK;
    }
};
```

# Old ShapeInference Extensibility API {#openvino_docs_IE_DG_Extensibility_DG_deprecated_ShapeInfer}

The new approach to shape inference is to create a custom nGraph operation that contains a special method for shape inference.
The following classes and methods are deprecated:

* `InferenceEngine::IShapeInferExtension` class
* `InferenceEngine::IShapeInferExtension::getShapeInferTypes(char**&, unsigned int&, ResponseDesc*)` method
* `InferenceEngine::IShapeInferExtension::getShapeInferImpl(IShapeInferImpl::Ptr&, const char*, ResponseDesc*)` method

However, the old approach with the `InferenceEngine::IShapeInferExtension` interface still works for existing custom layers.
Custom shape inference functions are registered by calling `InferenceEngine::ICNNNetwork::AddExtension` with an implementation of `InferenceEngine::IShapeInferExtension`, which is a holder of custom shape inference implementations. The holder must implement two key methods:

* `InferenceEngine::IShapeInferExtension::getShapeInferImpl` - Returns the custom shape inference implementation for the given type.
* `InferenceEngine::IShapeInferExtension::getShapeInferTypes` - Provides all custom types.

The custom shape inference implementation is represented by the `InferenceEngine::IShapeInferImpl::inferShapes` method.

Built-in shape inference functions cannot be overwritten, so the custom type must be different from the supported ones.