
ndarray.choose_element_0index not differentiable / bug in gradients after slicing? #7853

Closed
jaanli opened this issue Sep 11, 2017 · 2 comments · Fixed by #14273

Comments

@jaanli

jaanli commented Sep 11, 2017

Environment info

Operating System: OSX
MXNet version: 0.11.0
Python version and distribution: python3/anaconda

Minimum reproducible example

I want to differentiate through ndarray.choose_element_0index:

In [2]: from mxnet import autograd

In [3]: from mxnet import gluon

In [4]: from mxnet import nd

In [5]: params = gluon.ParameterDict()

In [6]: w = params.get('var', shape=(1, 10))

In [7]: params.initialize()

In [8]: with autograd.record():
   ...:     element = nd.choose_element_0index(w.data(), nd.array([0]))
   ...:     y = nd.square(element)
   ...:     y.backward()

Returns:

----> 4     y.backward()
      5

/usr/local/anaconda3/envs/yumi/lib/python3.6/site-packages/mxnet/ndarray.py in backward(self, out_grad, retain_graph, train_mode)
   1100             c_array(NDArrayHandle, ograd_handles),
   1101             ctypes.c_int(retain_graph),
-> 1102             ctypes.c_int(train_mode)))
   1103
   1104

/usr/local/anaconda3/envs/yumi/lib/python3.6/site-packages/mxnet/base.py in check_call(ret)
    127     """
    128     if ret != 0:
--> 129         raise MXNetError(py_str(_LIB.MXGetLastError()))
    130
    131 if sys.version_info[0] < 3:

MXNetError: [16:38:03] src/ndarray/autograd.cc:237: Check failed: !i.entry_.is_none() Cannot differentiate node because it is not in a computational graph. You need to set is_recording to true or use autograd.record() to save computational graphs for backward. If you want to differentiate the same graph twice, you need to pass retain_graph=True to backward.

Stack trace returned 5 entries:
[bt] (0) 0   libmxnet.so                         0x0000000107519ad8 _ZN4dmlc15LogMessageFatalD2Ev + 40
[bt] (1) 1   libmxnet.so                         0x0000000107cfaca4 _ZN5mxnet8autograd15AutogradRuntime15ComputeGradientERKNSt3__16vectorINS_7NDArrayENS2_9allocatorIS4_EEEES9_bb + 9236
[bt] (2) 2   libmxnet.so                         0x0000000107c1bc88 MXAutogradBackwardEx + 488
[bt] (3) 3   _ctypes.cpython-36m-darwin.so       0x0000000106d2d2b7 ffi_call_unix64 + 79
[bt] (4) 4   ???                                 0x00007fff5afbcd60 0x0 + 140734719839584

But this works and behaves as expected:

In [19]: with autograd.record():
    ...:     y = nd.square(w.data()[0][0])
    ...:     y.backward()
    ...:     print(w.grad())
    ...:

[[ 0.01366779  0.          0.          0.          0.          0.          0.
   0.          0.          0.        ]]
<NDArray 1x10 @cpu(0)>
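As a possible workaround until the operator gets a gradient, the same per-row selection can be expressed with nd.pick, which (in the MXNet versions I have checked) does register a backward pass. A minimal sketch, reusing the setup from above:

from mxnet import autograd, gluon, nd

params = gluon.ParameterDict()
w = params.get('var', shape=(1, 10))
params.initialize()

with autograd.record():
    # pick(data, index, axis=1) selects data[i, index[i]] for each row i,
    # i.e. the same element that choose_element_0index would return
    element = nd.pick(w.data(), nd.array([0]), axis=1)
    y = nd.square(element)
y.backward()

print(w.grad())  # expected: 2 * w[0, 0] at position (0, 0), zeros elsewhere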
@szha
Member

szha commented Dec 22, 2017

@apache/mxnet-committers: This issue has been inactive for the past 90 days. It has no label and needs triage.

For general "how-to" questions, our user forum (and Chinese version) is a good place to get help.

@vandanavk
Contributor

The error persists on MXNet v1.5 (built from source), tested on OSX:

---------------------------------------------------------------------------
MXNetError                                Traceback (most recent call last)
<ipython-input-13-b04ee09a6160> in <module>()
     12      element = nd.choose_element_0index(w.data(), nd.array([0]))
     13      y = nd.square(element)
---> 14      y.backward()

~/Documents/mxnet/incubator-mxnet/python/mxnet/ndarray/ndarray.py in backward(self, out_grad, retain_graph, train_mode)
   2213             ctypes.c_int(train_mode),
   2214             ctypes.c_void_p(0),
-> 2215             ctypes.c_void_p(0)))
   2216 
   2217     def tostype(self, stype):

~/Documents/mxnet/incubator-mxnet/python/mxnet/base.py in check_call(ret)
    250     """
    251     if ret != 0:
--> 252         raise MXNetError(py_str(_LIB.MXGetLastError()))
    253 
    254 

MXNetError: [15:24:46] src/pass/gradient.cc:192: Operator choose_element_0index is non-differentiable because it didn't register FGradient attribute.

Stack trace returned 10 entries:
[bt] (0) 0   libmxnet.so                         0x000000011232dbd6 dmlc::StackTrace() + 1238
[bt] (1) 1   libmxnet.so                         0x000000011232d5c5 dmlc::LogMessageFatal::~LogMessageFatal() + 53
[bt] (2) 2   libmxnet.so                         0x00000001162009ea nnvm::pass::(anonymous namespace)::Gradient(nnvm::Graph) + 13066
[bt] (3) 3   libmxnet.so                         0x00000001156de032 nnvm::Graph std::__1::__invoke_void_return_wrapper<nnvm::Graph>::__call<nnvm::Graph (*&)(nnvm::Graph), nnvm::Graph>(nnvm::Graph (*&&&)(nnvm::Graph), nnvm::Graph&&) + 162
[bt] (4) 4   libmxnet.so                         0x00000001156dded0 std::__1::__function::__func<nnvm::Graph (*)(nnvm::Graph), std::__1::allocator<nnvm::Graph (*)(nnvm::Graph)>, nnvm::Graph (nnvm::Graph)>::operator()(nnvm::Graph&&) + 64
[bt] (5) 5   libmxnet.so                         0x00000001161e5e18 nnvm::ApplyPasses(nnvm::Graph, std::__1::vector<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> >, std::__1::allocator<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > > > const&) + 1448
[bt] (6) 6   libmxnet.so                         0x000000011512548e nnvm::ApplyPass(nnvm::Graph, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&) + 590
[bt] (7) 7   libmxnet.so                         0x00000001152376fc nnvm::pass::Gradient(nnvm::Graph, std::__1::vector<nnvm::NodeEntry, std::__1::allocator<nnvm::NodeEntry> >, std::__1::vector<nnvm::NodeEntry, std::__1::allocator<nnvm::NodeEntry> >, std::__1::vector<nnvm::NodeEntry, std::__1::allocator<nnvm::NodeEntry> >, std::__1::function<nnvm::NodeEntry (std::__1::vector<nnvm::NodeEntry, std::__1::allocator<nnvm::NodeEntry> >&&)>, std::__1::function<int (nnvm::Node const&)>, std::__1::function<nnvm::NodeEntry (nnvm::NodeEntry const&, nnvm::NodeEntry const&)>, std::__1::vector<nnvm::Op const*, std::__1::allocator<nnvm::Op const*> >, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> >) + 8764
[bt] (8) 8   libmxnet.so                         0x000000011537a8c5 mxnet::Imperative::Backward(std::__1::vector<mxnet::NDArray*, std::__1::allocator<mxnet::NDArray*> > const&, std::__1::vector<mxnet::NDArray*, std::__1::allocator<mxnet::NDArray*> > const&, std::__1::vector<mxnet::NDArray*, std::__1::allocator<mxnet::NDArray*> > const&, bool, bool, bool) + 13445
[bt] (9) 9   libmxnet.so                         0x00000001150d434d MXAutogradBackwardEx + 3293
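This newer message pinpoints the cause: choose_element_0index never registered an FGradient attribute, so the gradient pass cannot build a backward node for it. Besides nd.pick, the selection can also be written entirely with primitives that do have gradients. A sketch using a one-hot mask (assuming nd.one_hot with a depth argument, as documented):

from mxnet import autograd, gluon, nd

params = gluon.ParameterDict()
w = params.get('var', shape=(1, 10))
params.initialize()

idx = nd.array([0])
with autograd.record():
    # build a (1, 10) one-hot mask from the index and reduce over the row;
    # this selects the same element using only differentiable ops
    mask = nd.one_hot(idx, depth=10)
    element = nd.sum(w.data() * mask, axis=1)
    y = nd.square(element)
y.backward()

print(w.grad())  # expected: 2 * w[0, 0] at position (0, 0), zeros elsewhere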
