Not working with multiple processes #40
When calling MobulaOP in a subprocess, it gets stuck.

Environment: latest MXNet nightly build and Python 3.6.5.

Example code modified from dynamic_import_op.py to replicate this error is shown in the thread below.

Comments
Thanks for your report!

Thanks! FYI, if you move …

moving …
I tried that, but it does not work. Example code:

```python
from concurrent import futures
import sys

import mxnet as mx
import mobula

# Import the custom operator dynamically
mobula.op.load('./AdditionOP')


def foo():
    AdditionOP = mobula.op.AdditionOP
    a = mx.nd.array([1, 2, 3])
    b = mx.nd.array([4, 5, 6])
    a.attach_grad()
    b.attach_grad()
    with mx.autograd.record():
        c = AdditionOP(a, b)
    dc = mx.nd.array([7, 8, 9])
    c.backward(dc)
    assert ((a + b).asnumpy() == c.asnumpy()).all()
    assert (a.grad.asnumpy() == dc.asnumpy()).all()
    assert (b.grad.asnumpy() == dc.asnumpy()).all()
    print('Okay :-)')
    print('a + b = c \n {} + {} = {}'.format(a.asnumpy(), b.asnumpy(), c.asnumpy()))


def main():
    ex = futures.ProcessPoolExecutor(1)
    r = ex.submit(foo)
    r.result()


if __name__ == "__main__":
    main()
```
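For anyone hitting this before upgrading: a common workaround for fork-related hangs — an assumption on my part, not a fix confirmed in this thread — is to start the worker with the 'spawn' method, so the child gets a fresh interpreter instead of a forked copy of a parent that may already hold MXNet engine threads. A minimal sketch, reusing `foo` from the example above:

```python
# Sketch of a possible workaround (an assumption, not from this thread):
# 'spawn' starts the worker in a fresh interpreter rather than forking the
# parent process, which may already have started MXNet's engine threads.
import multiprocessing as mp
from concurrent import futures


def main():
    ex = futures.ProcessPoolExecutor(1)
    r = ex.submit(foo)  # foo as defined in the example above
    r.result()


if __name__ == "__main__":
    mp.set_start_method('spawn')  # must run once, before the pool is created
    main()
```

Note that with 'spawn' the module is re-imported in the child, so the module-level `mobula.op.load('./AdditionOP')` runs again there, which is exactly what the dynamic load needs.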
@YutingZhang

```python
from concurrent import futures
import sys

sys.path.append('../../')  # Add the MobulaOP path before importing mobula

import mxnet as mx
from mobula.testing import assert_almost_equal


class AdditionOP(mx.operator.CustomOp):
    def __init__(self):
        super(AdditionOP, self).__init__()

    def forward(self, is_train, req, in_data, out_data, aux):
        out_data[0][:] = in_data[0] + in_data[1]

    def backward(self, req, out_grad, in_data, out_data, in_grad, aux):
        in_grad[0][:] = out_grad[0]
        in_grad[1][:] = out_grad[0]


@mx.operator.register("AdditionOP")
class AdditionOPProp(mx.operator.CustomOpProp):
    def __init__(self):
        super(AdditionOPProp, self).__init__()

    def list_arguments(self):
        return ['a', 'b']

    def list_outputs(self):
        return ['output']

    def infer_shape(self, in_shape):
        return in_shape, [in_shape[0]]

    def create_operator(self, ctx, shapes, dtypes):
        return AdditionOP()


def foo():
    a = mx.nd.array([1, 2, 3])
    b = mx.nd.array([4, 5, 6])
    a.attach_grad()
    b.attach_grad()
    print("REC")
    with mx.autograd.record():
        c = mx.nd.Custom(a, b, op_type='AdditionOP')
    dc = mx.nd.array([7, 8, 9])
    c.backward(dc)
    assert_almost_equal(a + b, c)
    assert_almost_equal(a.grad, dc)
    assert_almost_equal(b.grad, dc)
    print('Okay :-)')
    print('a + b = c \n {} + {} = {}'.format(a.asnumpy(), b.asnumpy(), c.asnumpy()))


def main():
    ex = futures.ProcessPoolExecutor(1)
    r = ex.submit(foo)
    r.result()


if __name__ == '__main__':
    main()
```
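Since this variant uses only `mx.nd.Custom` and no MobulaOP at all, a hang here points at MXNet's CustomOp machinery itself. One quick check — my suggestion, not from the thread — is to run the same function in-process:

```python
# Diagnostic sketch (an addition, not from the thread): run the same test
# in the parent process. If this passes while the ProcessPoolExecutor run
# above gets stuck, the problem is specific to the subprocess path.
if __name__ == '__main__':
    foo()
```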
So …

Yes.

@wkcn Sent you an email at your live.cn address :)
Mail received. Thank you! : ) |
Hi @YutingZhang, the two test cases you gave now pass with the latest MXNet and MobulaOP : )

@wkcn Thanks a lot! Did you work around the problem in MobulaOP, or is it due to MXNet's update to CustomOp (which you also contributed to)?

@YutingZhang It is due to MXNet's update; other contributors fixed it.

Closing this since the problem has been addressed. : )