Exposing APIs from HPy extensions to be used by other HPy extensions #446
Comments
IMO, we definitely need some interception. I can add the following point:
It may be the case that it is fine to pass the
Sounds good to me. I'm just not so sure about this: `numpy_api_capsule->my_api_function_pointer = HPy_AsAPI(ctx, &my_api_function);`
Good point. Yes, we should probably do exactly the same thing as with
Packages from the top4000 with the string: asammdf. Do we know of any other package that exposes some C API? I looked at pandas; they don't have one. What is NumPy's take on its C API: should people ideally be using memory views and other generic means over NumPy's C API? If that were the case, we could also say that exposing one's own C API is something that should not be done and hence is not supported in HPy.
I would assume that, since there is the array API and NumPy implements it (https://numpy.org/doc/stable/reference/c-api/array.html), NumPy's take is not necessarily to use memory views. But I don't know.
Isn't that API on the Python level?
It would be nice if people used the dlpack interface, which provides a standard way to interact with array-like objects. But thinking about this more deeply, it seems that if the HPy port of NumPy must export some kind of C API, it would still have to be able to export exactly the CPython `PyArrayObject`. So if we are confined to using `PyArrayObject`, can we export that from an HPy port of NumPy without using legacy mode?
Note that all the dlpack interface requires is capsule support, which HPy has.
Cython also contains a system for exposing your types/functions as an API, via automatic capsule use. But it also has internal shared-code capabilities: if you import multiple Cython modules (transpiled with the same version), they'll share the implementation of the custom function type, and things like that.
The motivating example is the NumPy API that is exposed to other Python extensions so that they can work with arrays natively/directly, without a round-trip through Python code/abstractions.

How the NumPy API works at the moment:

- NumPy exposes a `PyCapsule` with a pointer to a struct filled with pointers to the implementation
- a 3rd-party extension imports the `PyCapsule` from NumPy, gets the raw C pointer from it, and uses it to call the NumPy API through the struct

The very same scheme can work with HPy, but it has one drawback: the 3rd-party extension gets some `HPyContext` and passes it to NumPy. The HPy runtime may want to give a different `HPyContext` instance to different packages (it can store module state in it, for example); with `HPyContext` flowing from one extension to another, this is no longer possible.

Are those restrictions problematic enough to seek a better solution?
One possibility is to provide some way to "wrap" function pointers with a trampoline that can "transform" the `HPyContext` into another one if necessary. Example in code:

The question is how to generate the trampoline. We could use macros for that, something like `HPy_APIDef(...)`. As a bonus, we could generate CPython API trampolines, so that the API would be usable from non-HPy packages (NumPy would have to expose another capsule with the CPython trampolines to be used by non-HPy packages).