From 2bc4430c9c9999640a34df4a606a89b74b12008f Mon Sep 17 00:00:00 2001
From: Vishaal Kapoor <40836875+vishaalkapoor@users.noreply.github.com>
Date: Fri, 30 Nov 2018 23:35:42 -0800
Subject: [PATCH] Clarify dependency on OpenCV in CNN Visualization tutorial. (#13495)

---
 docs/tutorials/vision/cnn_visualization.md | 15 ++++++++++-----
 1 file changed, 10 insertions(+), 5 deletions(-)

diff --git a/docs/tutorials/vision/cnn_visualization.md b/docs/tutorials/vision/cnn_visualization.md
index 63d2b13271ba..5ded6f1587e0 100644
--- a/docs/tutorials/vision/cnn_visualization.md
+++ b/docs/tutorials/vision/cnn_visualization.md
@@ -1,16 +1,21 @@
 # Visualizing Decisions of Convolutional Neural Networks
 
-Convolutional Neural Networks have made a lot of progress in Computer Vision. Their accuracy is as good as humans in some tasks. However it remains hard to explain the predictions of convolutional neural networks, as they lack the interpretability offered by other models, for example decision trees.
+Convolutional Neural Networks have made a lot of progress in Computer Vision. Their accuracy is as good as that of humans in some tasks. However, it remains difficult to explain the predictions of convolutional neural networks, as they lack the interpretability offered by other models such as decision trees.
 
-It is often helpful to be able to explain why a model made the prediction it made. For example when a model misclassifies an image, it is hard to say why without visualizing the network's decision.
+It is often helpful to be able to explain why a model made the prediction it made. For example, when a model misclassifies an image, it is hard to say why without visualizing the network's decision.
 
 Explaining the misclassification of volcano as spider
 
-Visualizations also help build confidence about the predictions of a model. For example, even if a model correctly predicts birds as birds, we would want to confirm that the model bases its decision on the features of bird and not on the features of some other object that might occur together with birds in the dataset (like leaves).
+Visualizations can also build confidence in the predictions of a model. For example, even if a model correctly predicts birds as birds, we would want to confirm that the model bases its decision on the features of the bird and not on the features of some other object that might occur together with birds in the dataset (like leaves).
 
-In this tutorial, we show how to visualize the predictions made by convolutional neural networks using [Gradient-weighted Class Activation Mapping](https://arxiv.org/abs/1610.02391). Unlike many other visualization methods, Grad-CAM can be used on a wide variety of CNN model families - CNNs with fully connected layers, CNNs used for structural outputs (e.g. captioning), CNNs used in tasks with multi-model input (e.g. VQA) or reinforcement learning without architectural changes or re-training.
+In this tutorial, we show how to visualize the predictions made by convolutional neural networks using [Gradient-weighted Class Activation Mapping](https://arxiv.org/abs/1610.02391). Unlike many other visualization methods, Grad-CAM can be used on a wide variety of CNN model families - CNNs with fully connected layers, CNNs used for structured outputs (e.g. captioning), and CNNs used in tasks with multi-modal input (e.g. VQA) or reinforcement learning - without architectural changes or re-training.
 
-In the rest of this notebook, we will explain how to visualize predictions made by [VGG-16](https://arxiv.org/abs/1409.1556). We begin by importing the required dependencies. `gradcam` module contains the implementation of visualization techniques used in this notebook.
+In the rest of this notebook, we will explain how to visualize predictions made by [VGG-16](https://arxiv.org/abs/1409.1556). We begin by importing the required dependencies.
+
+## Prerequisites
+* OpenCV is required by `gradcam` (below) and can be installed with `pip install opencv-python`.
+
+* The `gradcam` module contains the implementation of the visualization techniques used in this notebook. `gradcam` can be installed to a temporary directory by executing the following code block.
 
 ```python
 from __future__ import print_function