DLPack Conversion API #1573

Merged 9 commits into apache:master on Aug 10, 2018
Conversation

@eqy (Contributor) commented on Aug 9, 2018

(do not merge yet)
cc @tqchen

@tqchen mentioned this pull request on Aug 9, 2018
@tqchen changed the title from "temp checkin of WIP dlpack to/from" to "[WIP] DLPack Conversion API" on Aug 9, 2018
Parameters
----------
dltensor : DLPack tensor

Member review comment: add description.


Parameters
----------
tvm_func: Built tvm function operating on arrays
Member review comment: follow the numpy docstring convention, e.g.

tvm_func: Function
     The original tvm function input
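
For reference, a numpy-style parameter section along the lines the reviewers are asking for might look like the sketch below; the function name from_dlpack is illustrative and may not match the name used in the PR's diff.

    def from_dlpack(dltensor):
        """Produce a TVM NDArray that shares memory with a DLPack tensor.

        Parameters
        ----------
        dltensor : DLPack tensor
            The input DLManagedTensor, wrapped in a PyCapsule.

        Returns
        -------
        arr : tvm.nd.NDArray
            The array view of the DLPack tensor.
        """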

TVM_DLL int TVMArrayToDLPack(TVMArrayHandle from,
                             DLManagedTensor** out);

/**/
Member review comment: add a comment block here.

    Wrapped tvm function that operates on PyTorch tensors
    """
    import torch
    import torch.utils.dlpack
Member review comment: add a check to see if tvm_func is callable and raise an error if it is not.

        return tvm_func(*args)
    return _wrapper

def to_pytorch(tvm_func):
Member review comment: name it to_pytorch_func to avoid confusion.
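
Putting the two review suggestions above together, a minimal sketch of the wrapper could look like the following. It assumes a tvm.nd.from_dlpack entry point for turning a DLPack capsule into a TVM NDArray; the actual helper names and layout in the PR's diff may differ.

    import torch
    import torch.utils.dlpack

    import tvm


    def to_pytorch_func(tvm_func):
        """Wrap a built tvm function so that it accepts PyTorch tensors."""
        if not callable(tvm_func):
            raise ValueError("Expected a callable tvm function, got %r" % (tvm_func,))

        def _wrapper(*args):
            # Convert each PyTorch tensor argument into a TVM NDArray by going
            # through DLPack; non-tensor arguments are passed through untouched.
            args = tuple(
                tvm.nd.from_dlpack(torch.utils.dlpack.to_dlpack(arg))
                if isinstance(arg, torch.Tensor) else arg
                for arg in args
            )
            return tvm_func(*args)

        return _wrapper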

@tqchen added the "status: need update" label on Aug 10, 2018
@tqchen changed the title from "[WIP] DLPack Conversion API" to "DLPack Conversion API" on Aug 10, 2018
@tqchen merged commit f52255b into apache:master on Aug 10, 2018
@tqchen added the "status: accepted" label and removed the "status: need update" label on Aug 10, 2018
@tqchen (Member) commented on Aug 10, 2018

Thanks @eqy! This is merged.

@wkcn (Member) commented on Aug 11, 2018

Hi!
I have found a problem with this PR.
In the function _dlpack_deleter in python/tvm/_ffi/ndarray.py, the line

    dltensor = ctypes.py_object(dltensor)

wraps dltensor, which at that point is a plain int (the raw pointer value), in a new py_object. On the next line,

    if ctypes.pythonapi.PyCapsule_IsValid(pycapsule, _c_str_dltensor):

PyCapsule_IsValid is therefore handed the address of an int object rather than of a PyCapsule, so it always returns False.

I think you should check whether the statement under the if ever runs.

@tqchen (Member) commented on Aug 11, 2018

@wkcn Thanks for pointing this out. That line should be changed to

    pycapsule = ctypes.cast(pycapsule, ctypes.py_object)

Can you send a PR?
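
For context, a minimal sketch of the corrected deleter with that cast is shown below. The _c_str_dltensor value and the restype assignments are assumptions of this sketch; the real code in python/tvm/_ffi/ndarray.py also hands the pointer back to the TVM runtime to free the DLManagedTensor.

    import ctypes

    # Capsule name used by the DLPack protocol (TVM defines its own
    # _c_str_dltensor constant; this value is an assumption of the sketch).
    _c_str_dltensor = ctypes.c_char_p(b"dltensor")

    ctypes.pythonapi.PyCapsule_IsValid.restype = ctypes.c_int
    ctypes.pythonapi.PyCapsule_GetPointer.restype = ctypes.c_void_p


    def _dlpack_deleter(pycapsule):
        # The destructor receives the capsule as a raw pointer value (an int).
        # ctypes.py_object(pycapsule) would box that int itself, so
        # PyCapsule_IsValid would always fail; ctypes.cast() instead
        # reinterprets the pointer as a reference to the capsule object.
        pycapsule = ctypes.cast(pycapsule, ctypes.py_object)
        if ctypes.pythonapi.PyCapsule_IsValid(pycapsule, _c_str_dltensor):
            ptr = ctypes.pythonapi.PyCapsule_GetPointer(pycapsule, _c_str_dltensor)
            # ... hand ptr back to the TVM runtime so the DLManagedTensor's
            # deleter can be invoked ...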

@tqchen (Member) commented on Aug 11, 2018

cc @eqy

@tqchen (Member) commented on Aug 11, 2018

#1589 should fix the problem

@wkcn (Member) commented on Aug 11, 2018

Yes. #1589 has fixed the problem :-)

FrozenGene pushed a commit to FrozenGene/tvm that referenced this pull request Dec 27, 2018