
trio.run() should take **kwargs in addition to *args #470

Open
kennethreitz opened this issue Mar 16, 2018 · 73 comments

@kennethreitz

kennethreitz commented Mar 16, 2018

this would greatly enhance usability — especially for function declarations like:

async def request(method, url, *, pool=None, preload_content=False):

@njsmith
Member

njsmith commented Mar 17, 2018

Hmm, this has come up a few times, and it's a totally reasonable question – I think we need some sort of FAQ entry for this... maybe in the tutorial? Will think about it.

Anyway, here's the problem: a function like run has to take the thing to run, and it also has to take its own arguments to configure how it runs it. So we need some way to distinguish which arguments are for which, but all the possible solutions involve some kind of awkwardness.

BTW, in ordinary usage you wouldn't use trio.run to call request, that wouldn't make much sense – you normally call trio.run once at the very top of your program, so it's usually something like trio.run(main). (I guess you're looking at making synchronous shim wrappers or something?) But there are a bunch of functions in trio that have this same problem: nursery.start_soon, run_sync_in_worker_thread, etc. etc., so we do need a general solution.

What we do is recommend people do from functools import partial and then trio.run(partial(request, method, url, pool=my_pool)) or whatever. I'm not a huge fan, but all the alternatives seem worse. Some advantages of this are:

  • (a) it works uniformly for async and sync callables,
  • (b) it's ordinary standard Python, so once you learn it you can use it everywhere, with or without trio,
  • (c) partial objects support introspection, so e.g. if you start a task like this and then later use trio's debugging interfaces to print out the task tree, you'll see that trio has figured out that this task is running a function named request, not one named partial or <lambda> or something,
  • (d) it allows extra arguments to be added, which is kind of a niche thing but important for nursery.start (note: different from nursery.start_soon),
  • (e) the syntax really isn't that bad, and in particular supports all of Python's normal funcall syntax, like partial(request, *args, extra_arg, something=whatever, **my_kwargs). The left edge looks a little funny, but otherwise it's using function call syntax to represent a function call.
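
As a concrete sketch of that recipe, applied to the request signature from the original post (the method, URL, and pool value here are placeholders):

from functools import partial

import trio

async def request(method, url, *, pool=None, preload_content=False):
    ...  # stand-in body, just to reuse the signature from the original post

# The keyword arguments are bound by partial, so trio.run only ever sees one callable:
trio.run(partial(request, "GET", "https://example.com", pool=None))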

Rejected alternatives include:

  • YOLO it up: run takes its arguments, any unrecognized arguments are passed on to the called function. But if we did this, then we could never add any new kwargs to run (or similar functions) without breaking backcompat, which isn't workable in a foundational library like trio that needs to (eventually) be super careful about backcompat. (A sketch of this trap appears after this list.)

  • Reserve underscored kwarg names as being for trio.run, and the rest get passed on to the function being called: this is kinda gross, and doesn't really solve the problem, because there's nothing that says an arbitrary function can't take kwargs that start with underscores. In particular, by implementing this we would be defining a dozen such functions! And some of them you might want to pass to each other in unusual-but-they-happen situations.

  • Use a special argument for arguments to run, like def run(fn, *args, **kwargs, run_args={}): this is super awkward to use and document (the signature no longer tells you what arguments you can pass in run_args), plus still has the issue of potential kwarg name collisions (what if fn has a run_args argument).

  • Use a special arguments for kwargs for the function, like def run(fn, *args, *, fn_kwargs={}, ...): this is uglier to use and read than partial (since now you have to write kwargs using dict syntax, while partial uses call syntax), and it's still an idiosyncratic special trick we'd be forcing people to learn. If we're going to make people learn an idiosyncratic trick anyway, partial is much more generally useful.

  • Allow raw coroutine objects, like def run(fn(*args, **kwargs), ...): it doesn't solve the problem for variants that take sync functions like run_sync_in_worker_thread, it doesn't work for more sophisticated nursery-like objects that need to control how they run the function passed to nursery.start_soon (think: Erlang-style supervisors that can restart services after a crash), and it would make it impossible for us to ever fix the super-annoying wart that new users always run into where they forget a mandatory await and Python doesn't give them a proper error.
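
To make the backcompat trap in the first rejected option concrete, here is a minimal sketch (yolo_run and the clock kwarg are illustrative only, not real trio APIs):

# Hypothetical "YOLO" version: any kwarg run() doesn't recognize is forwarded on.
def yolo_run(async_fn, *args, **forwarded_kwargs):
    print(f"forwarding {forwarded_kwargs!r} to {async_fn.__name__}")

async def main(*, clock=None):
    ...

yolo_run(main, clock="the user's clock")  # today: clock= reaches main()

# If a later release adds a real clock= parameter to run(), this exact call
# silently changes meaning (run() swallows the kwarg) -- so no new config
# kwargs can ever be added without breaking someone.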

If you've got a better idea definitely do let me know but... what we do now really is the best option available, AFAICT :-).

@cjerdonek
Contributor

cjerdonek commented Mar 17, 2018

a function like run has to take the thing to run, and it also has to take its own arguments to configure how it runs it.

Here's another option I don't see listed. What about not supporting passing configuration options directly to run() at all? Instead, if non-default behavior is desired, the caller can instantiate, say, a TrioRunner object with those options and use runner.run(). It looks like you're almost doing that here already anyways, but with a Runner class that has a slightly different responsibility, and perhaps not exposed as part of the public API.

There's something to be said for a clean function signature that doesn't mix arguments for async_fn with arguments for run. It also provides more options for configuring run() and may age better, too, because you won't have to add more and more kwargs over time. (One of them is already called restrict_keyboard_interrupt_to_checkpoints :), which might be better as an attribute.)
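
A rough sketch of what that could look like (TrioRunner here is hypothetical, not an existing trio class):

class TrioRunner:
    def __init__(self, clock=None, restrict_keyboard_interrupt_to_checkpoints=False):
        self.clock = clock
        self.restrict_keyboard_interrupt_to_checkpoints = restrict_keyboard_interrupt_to_checkpoints

    def run(self, async_fn, *args, **kwargs):
        ...  # run async_fn under this runner's configuration

# TrioRunner().run(main)                 # default behavior
# TrioRunner(clock=MyClock()).run(main)  # configuration lives on the runner, not in run()'s signature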

@njsmith
Member

njsmith commented Mar 18, 2018

What about not supporting passing configuration options directly to run() at all? Instead, if non-default behavior is desired, the caller can instantiate, say, a TrioRunner object with those options and use runner.run().

Interesting idea! And for trio.run alone, this would probably work. But, the same problem happens for:

  • trio.run
  • nursery.start_soon
  • nursery.start
  • trio.run_sync_in_worker_thread
  • trio.BlockingTrioPortal.run
  • trio.BlockingTrioPortal.run_sync
  • trio.hazmat.spawn_system_task
  • trio.hazmat.TrioToken.run_sync_soon

And then trio-asyncio has the same problem for its functions to call trio-from-asyncio and vice-versa, and I'm sure this will keep cropping up around the ecosystem.

Adding a config-object interface for all of these seems really cumbersome? And I'd rather have just one solution to this problem that gets used everywhere, so you only have to learn it once, rather than two different ones, where you'd have to learn both solutions and remember which solution is used where.

@cjerdonek
Contributor

cjerdonek commented Mar 18, 2018

Okay, good to know. There's yet another hybrid pattern / approach you could use for all of these. Namely, rather than calling, say, runner.run() and putting the config options in the runner object, you could have a single options argument that houses all of the config options and optionally pass that to run().

More generally, using this pattern means that you would have at most one reserved keyword argument name per function. The argument name could potentially be the same across all of the functions, though the class could vary. When you want to add more options to some function, you could add to the appropriate class without needing to touch any of the function signatures. Maybe you could even do something clever with an optional positional argument, so you wouldn't need to choose and reserve a name.

@njsmith
Member

njsmith commented Mar 18, 2018

Ah, yeah, that's similar to the run_args={} idea I mentioned above, except using a class instead of a dict to pass the options. That has the advantage that the class could use funcall syntax to specify the options (options=trio.RunOptions(a=1, b=2) vs. options={"a": 1, "b": 2}). You do have to define the class (though maybe just using a single Options class that just saves its kwargs would be enough for everyone?), and with one reserved kwarg you still have the possibility for collisions.

Maybe you could even do something clever with an optional positional argument, so you wouldn't need to choose and reserve a name.

That's true, you could do something like:

def run(*args, **kwargs):
    # Real code would need more fiddly error-checking but this should give the idea
    args = list(args)  # *args arrives as a tuple, which has no pop()
    options = None
    if isinstance(args[0], Options):
        options = args.pop(0)
    fn = args.pop(0)
    ...

...and then it'd be used like:

run(fn, foo, bar)
# or
run(Options(clock=whatever), fn, foo, bar)

This feels very surprising though in the Python context.

Again, these can work but... the only real downside to partial is that it forces people to learn one quirky convention, but everything else does that too.

(FWIW, I stole the partial convention from asyncio, so I think that means it also has the Guido Seal Of Approval ;-).)

Maybe the tutorial should just get a short discussion of partial + a link to this thread for those who are curious about the details.

@cjerdonek
Contributor

Thanks for summarizing. You stated it well. Yeah, I'm familiar with the functools.partial approach required by asyncio. But it always feels unsatisfying and never natural. For example, each time you're faced with the decision of whether to pass the *args in the partial() call, or pass only the **kwargs and leave the *args for run(). The latter spreads the arguments across more than one invocation, which isn't ideal for code comprehension. And if the former should be used, then why not insist that partial() always be used? That way there'd be only "one way." (Actually, some parts of asyncio's API do require partial, like Future.add_done_callback().)

That has the advantage that the class could use funcall syntax to specify the options (options=trio.RunOptions(a=1, b=2) vs. options={"a": 1, "b": 2}).

It might be worth pointing out that you can also use funcall syntax for dicts, e.g. options=dict(a=1, b=2), though it wouldn't do argument name checking like a class would provide. I often do this for cases where the dict value will wind up as kwargs for some function.

Incidentally, if you'll indulge me, I can't help mentioning a vague, not-fully-formed idea I've had for some time, which is for Python to expose an object that could be used to encapsulate *args and **kwargs in a single object (e.g. pkargs = Args(*args, **kwargs)). It would be some marriage of a tuple and a dict and allow passing *args and **kwargs to a function using a single value (e.g. func(***pkargs) or a similar syntax). Do you know if something like this has ever been proposed?
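
With today's syntax the idea might look roughly like this (Args and its apply method are made up; there is no *** operator):

class Args:
    def __init__(self, *args, **kwargs):
        self.args = args
        self.kwargs = kwargs

    def apply(self, func):
        # stand-in for the imagined func(***pkargs) spelling
        return func(*self.args, **self.kwargs)

pkargs = Args(1, 2, sep=" - ")
pkargs.apply(print)  # prints: 1 - 2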

@njsmith
Member

njsmith commented Mar 19, 2018

For example, each time you're faced with the decision of whether to pass the *args in the partial() call, or pass only the **kwargs and leave the *args for run(). The latter spreads the arguments across more than one invocation, which isn't ideal for code comprehension.

I guess my convention is to always use either run(fn, arg1, arg2, run_arg=...) or run(partial(fn, arg1, arg2, kwarg=something), run_arg=...). I agree that run(partial(fn, kwarg=something), arg1, arg2) would be very weird style.

And if the former should be used, then why not insist that partial() always be used? That way there'd be only "one way."

Yeah, it really comes down to taste. On balance I think the advantages of being able to write run(fn, arg1, arg2) are enough to make it worthwhile, but if you'd rather write run(partial(fn, arg1, arg2)) then go for it I guess :-).

(Actually, some parts of asyncio's API do require partial, like Future.add_done_callback().)

¯\_(ツ)_/¯

It would be some marriage of a tuple and a dict and allow passing *args and **kwargs to a function using a single value (e.g. func(***pkargs) or a similar syntax). Do you know if something like this has ever been proposed?

Huh, no, I haven't seen that idea before.

@buhman
Member

buhman commented Mar 28, 2018

Personally I'd like to see python add syntax-level support for partial application, then the equivalent of run(partial(fn, arg1, arg2)) is always what happens.

@njsmith
Member

njsmith commented Mar 29, 2018

@buhman if you can convince python-dev of that then we can certainly revisit this :-).

@Zac-HD
Member

Zac-HD commented May 26, 2018

Duplicate of #88, similar to python-attrs/attrs#22.

(FWIW I'm very strongly in favor of partial instead of adding **kwargs)

@sscherfke

Aliasing partial (e.g., as P) can reduce the pain and visual clutter:

from functools import partial as P

trio.run(P(func, a, b, c=42))

@njsmith
Member

njsmith commented Jul 25, 2018

I have suggested, half in jest, that trio should add trio.withargs or something as an alias for functools.partial, to make it feel less weird and scary.

It might actually be worth considering seriously.

@njsmith
Member

njsmith commented Dec 20, 2018

There was a substantial discussion/brainstorming session about this in chat yesterday, starting around here: https://gitter.im/python-trio/general?at=5c19957cb8760c21bbed7ee9

I think there are 3 reasons there's ongoing tension here:

  1. Intuitively, it seems like people want to pass through kwargs to their function a lot more than they want to use the relatively obscure configuration arguments like start_soon(..., name="myname") or run_sync_in_worker_thread(..., cancellable=True), so it feels like passing through kwargs should be the shorter/easier-to-read version, and it isn't. This is something we could check by looking at real code.

  2. This creates pressure to prefer positional arguments in new APIs, which on the margin means sacrificing other considerations, like readability.

  3. partial is jargony and obscure and requires an extra import, which are bad things in a common operation.

Some ideas from the thread:

  • make nursery.start_soon(fn, args(posarg, kw=whatever)) the One Way to call things while passing args (a rough sketch of such an args container appears after this list).
  • nursery.start_soon[options(name="thing")](fn, posarg, kw=whatever). Seems weird and syntax is similar to how PEP 484 does generic constructors (e.g. l = List[int]()).
  • nursery.start_soon.add_options(name="thing")(fn, posarg, kw=whatever)
  • nursery.start_soon(fn, posarg, kw=whatever, start_soon_name="thing") (or trio_name or something, maybe, though then we'd lose the ability to error out if someone passes an unrecognized kwarg from the future)
  • nursery.with_options(name="thing").start_soon(fn, posarg, kw=whatever)
  • split into nursery.start_soon and nursery.start_soon_configured where the former only accepts arguments to fn, and the latter looks like what we have now.
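
For concreteness, a rough sketch of the args(...) container from the first idea above (purely hypothetical, not a trio API):

class args:
    """Bundles the arguments destined for fn, separate from start_soon's own kwargs."""
    def __init__(self, *posargs, **kwargs):
        self.posargs = posargs
        self.kwargs = kwargs

# start_soon(fn, args(posarg, kw=whatever)) would then unpack the bundle internally:
#     fn(*bundle.posargs, **bundle.kwargs)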

Occasionally there are rumblings about adding some kind of macro/quoting syntax to Python. If that happens I guess we could have nursery.start_soon!(fn(posarg, kw=whatever), name="thing"), which would look lovely. But it would require stack introspection on every call, which might be too expensive for this.

None of these strike me as obviously superior to what we're doing now, but there are some new ideas I wanted to capture.

@Zac-HD
Member

Zac-HD commented Dec 20, 2018

I kinda like nursery.with_options(...).start_soon, actually.

@njsmith
Member

njsmith commented Jan 25, 2019

More brainstorming in chat: https://gitter.im/python-trio/general?at=5c4a4449f780a1521f638f59

@dhirschfeld
Member

Put my 2¢ in @ https://gist.github.com/njsmith/ad4fc82578239646ccdf986ae3ca07c1

Curious if anyone else thinks that's a good api?
s/config/options/with_options/add_options depending on taste.

@smurfix
Contributor

smurfix commented Jan 28, 2019

Another idea, see https://gitter.im/python-trio/general?at=5c4b3e3e20b78635b67fa78e

Basically this would allow you to call .start_soon(fn, *args, **kwargs) or .start_soon(**start_soon_options)(fn, *args, **kwargs), which is syntactically as minimal as you can get

@dhirschfeld
Member

It's certainly possible to switch behaviour on whether or not fn is passed in but that's probably too much black-magic for my tastes.

The verbosity of options/config/with_options does grate a little but it has the benefit of being pretty explicit and obvious.

@dhirschfeld
Member

Another api I was considering was forcing users to use a method to call the func, thereby leaving the __call__ free to configure the options - e.g.

from functools import partial

class takes_callable(object):
    def __init__(self, wrapped_fn, **options):
        self.wrapped_fn = wrapped_fn
        self.options = options

    def call(self, fn, *args, **kwargs):
        return self.wrapped_fn(
            partial(fn, *args, **kwargs),
            **self.options
        )

    def __call__(self, **options):
        return takes_callable(self.wrapped_fn, **options)

In [41]: @takes_callable
    ...: def run(fn, **options):
    ...:     print(f"options = {options!r}")
    ...:     return fn()

In [42]: def func(*args, **kwargs):
    ...:     print(args)
    ...:     print(kwargs)

In [43]: run.call(func, 1, 2, q=3)
options = {}
(1, 2)
{'q': 3}

In [44]: run(clock=None).call(func, 1, 2, q=3)
options = {'clock': None}
(1, 2)
{'q': 3}

I think this might be a cleaner api but also less obvious/discoverable

@himtronics

IMO the run*()/start*()/spawn*() methods all need to transform a function call (function + parameters) into a task that will be run/started/spawned sooner or later with possibly some additional options on how to do that.

Splitting the specification of the function call across multiple parameters of the run*()/start*()/spawn*() methods is not a good idea, because those pieces belong tightly together. That is exactly what partial(fn, *args, **kw) does: it stores the whole call in a single object, even if the name partial is suboptimal for trio's purpose.

As trio does not necessarily need the __call__ functionality, it could use partial() purely for storing the info and take fn, *args, and **kw back out of that object when creating the underlying Task(), so that partial() is not on the call stack.

Alternatively, trio could import partial under a different name, or declare a class that just stores its arguments in its __init__(). Unfortunately Task() is already taken and FuncCall() is a bit long; perhaps FC?

E.g. start_soon(async_fc, name=None) with async_fc = FC(async_fn, *args, **kw) would allow the run*/start*/spawn* methods to be extended with other positional or keyword parameters (a sketch of such an FC class follows).
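
A minimal sketch of such an FC container (hypothetical, not part of trio):

class FC:
    """Stores a function call: the callable plus its positional and keyword arguments."""
    def __init__(self, fn, *args, **kw):
        self.fn = fn
        self.args = args
        self.kw = kw

# trio would unpack fc.fn, fc.args, fc.kw itself when creating the underlying Task,
# so FC never shows up on the call stack:
# nursery.start_soon(FC(async_fn, arg1, kwarg=value), name="my task")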

@njsmith
Member

njsmith commented Mar 17, 2019

Doing some paid work tonight, I'm being annoyed again by having to use functools.partial every time I call run_sync_in_worker_thread.

I think at this point we can be fairly confident we've exhausted all the different combinations of .with_options and variants, and they're all pretty awful.

I'm thinking about revisiting one of the ideas we rejected early on: of using underscores to mark configuration-kwargs, versus no-underscore to mark passthrough-kwargs. So example usage would be:

trio.run(main, _clock=MyClock)
nursery.start_soon(async_fn, arg1, arg2, kwarg=foo, otherkwarg=bar, _name="something")
await trio.run_sync_in_worker_thread(container.logs, stdout=True, stderr=False, _limiter=my_limiter)
await serve_tcp(handler, _port=80)
await serve_ssl_over_tcp(handler, _port=443, _ssl_context=my_context)

One downside is that it's weird-looking. But I guess we'd get used to it? I think serve_tcp and serve_ssl_over_tcp are the only two functions where we want to pass both an arbitrary callable + mandatory kwargs (see #563), so they're the most affected. It does seem easier to explain and to use than all the .with_options(...) ideas proposed above. I'm not super-excited about the looks, but I can imagine we might decide that using partial all the time is annoying enough to overcome this downside.

The more technical potential downside is lack of compositionality/universality: what if we want to pass through a kwarg that starts with an underscore? For example, nursery.start_soon(trio.run_sync_in_worker_thread, sync_fn, _cancellable=True, _name=sync_fn) – notice that _cancellable is supposed to go to run_sync_in_worker_thread, while _name is supposed to go to start_soon. I think our full set of options are:

  • Pass through unrecognized kwargs, even if they start with underscore: this is definitely not workable, because it would mean that if we ever added any new kwargs to these functions it would be a breaking change.

  • Use some sort of quoting rule: for example, any kwarg with double-underscores gets one underscore stripped off, and then is passed through. So our example becomes:

    nursery.start_soon(
        trio.run_sync_in_worker_thread, sync_fn, __cancellable=True, _name=sync_fn,
    )

    Or, slightly more complicated but maybe nicer to look at, we could look at the kwargs, and of them, all the ones with the maximum number of underscores get taken by the top-level fn, and the rest get passed through. So our example would become:

    nursery.start_soon(
        trio.run_sync_in_worker_thread, sync_fn, _cancellable=True, __name=sync_fn,
    )
  • Document that if they really want to pass-through underscored kwargs, they should use partial:

    nursery.start_soon(
        partial(trio.run_sync_in_worker_thread, sync_fn, _cancellable=True),
        _name=sync_fn,
    )

    Given how rarely this situation arises, handling it with partial doesn't seem like a huge deal. Also, if we started with this option, then we'd have the option to later extend it to one of the quoting options above, without breaking backcompat. So if we're going to do this at all, then this seems like the variant to start with.

The implementation would be extremely simple. E.g.:

def run(async_fn, *args, _clock=None, _instruments=(), **kwargs):
    check_no_underscores(kwargs)  # raises an error if any keys start with _
    ...
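
The check_no_underscores helper referenced above could be as small as this sketch:

def check_no_underscores(kwargs):
    for name in kwargs:
        if name.startswith("_"):
            raise TypeError(
                f"unexpected keyword argument {name!r}: names starting with an "
                "underscore are reserved for trio's own configuration options"
            )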

No magic decorators, so it's totally understandable by naive readers, sphinx, mypy, and pylint. The one wrinkle is that mypy/pylint wouldn't know to flag trio.run(..., _clokc=...) as an error, but that doesn't seem like a huge issue. (And if we really want to then we could fix it with plugins. It seems like we'll have mypy/pylint plugins in any case, to handle things like trio.open_file and missing-await detection – see #671 – so adding this feature wouldn't be a huge extra cost.)

@njsmith
Member

njsmith commented Mar 17, 2019

Oh, wait, there's a second kind of compositionality problem. There are places where trio generates a kwarg: in particular, nursery.start automagically passing task_status=.... And I guess serve functions might end up passing through the stream as a kwarg too (#563), like handler(peer=stream_obj). I think these are the only two places where this comes up.

But it's not a theoretical problem: the serve functions both take a callable and also support task_status. So if we say that all their kwargs have to start with _, AND ALSO say that they have to take a kwarg called task_status, then that's a problem!

What options do we have available?

  • The serve functions could go ahead and take task_status, even though it's not underscored. It would be a bit awkward to explain why we were breaking our own rule, but not necessarily a big deal? task_status is already effectively a reserved kwarg name in trio; it doesn't really make sense to have an incoming connection handler that takes task_status.

  • We could switch the start protocol to use _task_status as the name of its magic kwarg. This would be an awkward transition b/c there's no good way to deprecate the old way. (I guess our options would be (a) rename start, (b) do a two-round deprecation where we force everyone to add a new_style=True switch and then force them to remove it again.) But eh, given that it's effectively a magic reserved name anyway, the underscore wouldn't look that weird. This would help the serve functions. Are there any cases that go the other way, where you want to pass a function-taking-a-function to start, and have task_status passed through? Of course, we don't support that now either...

Are we worried about similar issues for the peer=/client=/whatever-we-call it arg from #563? I'm having trouble thinking of any situation where you'd want to pass a function-that-takes-a-callable to serve. Maybe run_sync_in_worker_thread, but then you'd want the peer= to be passed through. Probably someone can come up with something, though?

I don't feel like I've fully wrapped my head around the issues here.

@smurfix
Contributor

smurfix commented Mar 17, 2019

Meh. I still think that nursery.start_soon.mod(name="foo")(fn, *args, **kwargs) would work best ("mod" being the shortest name I could think of that still means something reasonable) given that nursery.start_soon(name="foo")(fn, *args, **kwargs) seems to be too magical for some people's taste. Modifiers on fn can be achieved very easily, by using fn.mod(bar='baz').

I don't like magic underscores; "internal use only" is not the same as "strip the under and feed me to the next part". I'd hate to be required to again resort to functools.partial except now in a different set of circumstances.

I'd leave task_status alone. Its default value wants to be renamed, but that's a different issue.

@dhirschfeld
Member

IMHO having _ or __ prefixed kwargs is a horrible kludge and I think that using partial is preferable to that - i.e. the cure is worse than the disease!

I personally like @smurfix's solution the best (maybe s/mod/config/). The verbosity doesn't really bother me and, to me, is greatly preferable to _-prefixed kwargs being implicitly passed as config.

@njsmith
Member

njsmith commented Mar 30, 2019

The discussion around PEP 570 made me realize an interesting corner case... even if we have a signature like:

async def run(async_fn, *args, **kwargs):

...then technically it's not quite true that you can pass through any kwargs. Specifically, you cannot pass through a kwarg named async_fn :-)

This may not really matter in practice (in the unlikely case that someone hits it they can use partial), but I found it surprising.
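
A tiny demonstration of that corner case (my_fn is a placeholder):

def run(async_fn, *args, **kwargs):
    return async_fn, args, kwargs

def my_fn(**kwargs):
    ...

run(my_fn, timeout=3)          # fine: timeout lands in **kwargs
# run(my_fn, async_fn="oops")  # TypeError: run() got multiple values for argument 'async_fn'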

@mentalisttraceur

mentalisttraceur commented Apr 18, 2019

Here is a perspective that I don't rigidly or fully stand by but which is important and I haven't seen said here:

  1. The learning curve for partial could be massively reduced if:

    1. every single function like trio.run() just stopped taking *args to pass to the async function as well,

    2. the documentation and examples as a result showed partial being used all over the place,

    3. (if needed) the documentation linked to a small and really intuitive summary of what partial is and why it is good.

  2. Every time you force people to use partial you make the world a little better, because:

    1. partial is amazingly versatile and uses for it tend to turn up in many unexpected places,

    2. developers who "think with" partial are thus more versatile developers,

    3. passing arguments along with a callable is a distinct code feature, extra functionality, and partial properly pulls and abstracts just that feature into its own composable form, making it unnecessary for people to keep reimplementing it,

    4. partial makes it easier to

      • verify correctness,
      • write correct code, and
      • understand what the code is doing

      with less mental overhead, without having to jump around to other parts of the code or documentation to verify what happens to the arguments being passed through.


Personally, both in my own code and at work,

  1. I intentionally avoid, change, and push back against any interface that accepts arguments along with callables anywhere that partial will do, and

  2. I push for partial over functools.partial, even as I advocate for everything else to be properly namespaced, because the latter is a pain to write and because partial application is a fundamental operation, like +.

And I have been happier ever since, enjoying in particular a greater smoothness of thought due to no longer having to do a depth-first search of the possible future effects and uses of this kind of argument passthrough.

I do get frustrated having to write calls to partial sometimes, but then I remember that code is read more than it is written, and that my insistence on partial-only argument passthrough has significant return on investment in people being able to

  1. learn by example,
  2. write more robustly correct code,
  3. write more future-proof code, and
  4. verify code correctness

with less overhead. And that makes the upfront nuisance of writing partial not feel so bad.


TL;DR: forcing people to always use partial is good in some significant, permeating ways that are really worth weighing here.

@smurfix
Contributor

smurfix commented Jun 30, 2020

@mentalisttraceur That was me, at #470 (comment)
Thanks for the endorsement ;-)

I dunno about )(. Personally I don't have a problem with them. Also, if you use the call inside another wrapper, as in

nursery.start_soon(
    trio.to_thread.run_sync.options(cancellable=True),
    func, *args, **kwargs
)

having to attach .call or .do or … to the second line looks strange. But maybe we can simply let people use whatever style they prefer …

@oremanj
Member

oremanj commented Jun 30, 2020

Do you find that writing def __add__(self, other): ... or referencing obj.__name__, etc., grates on you and makes you dislike using python?

No -- but I do think of the "dunder namespace" as fundamentally belonging to the language implementation, not to library authors. Some libraries do use it, true, but generally as something that "looks like" how the implementation uses it: a vaguely "magical" method or attribute name. Seeing dunder keyword arguments feels especially weird to me -- if I saw that in some unfamiliar code, I'd wonder whether I should consult the target method's documentation or Python's. I'm sure I'll get used to it if it's the way things are, but I suspect I'm not the only one who will find it somewhat off-putting at first.

nursery.start_soon.options(...)(...)

I like this approach, and I do think we can type it using PEP 612. The implementation will need a bit of descriptor trickery to remember which nursery it should eventually call start_soon on. I agree that )( might be polarizing, but I think __keywords__ have a similar risk.
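
For the curious, here is a very rough, non-trio sketch of the descriptor trickery: binding the nursery at attribute-access time so both start_soon(...) and start_soon.options(...)(...) work (all names here, including _spawn, are made up):

from functools import partial

class _StartSoonDescriptor:
    def __get__(self, nursery, owner=None):
        if nursery is None:
            return self

        def start_soon(async_fn, *args):
            nursery._spawn(partial(async_fn, *args))  # _spawn is an assumed internal hook

        def options(**opts):
            # returns a callable that remembers both the nursery and the options
            return lambda async_fn, *args, **kwargs: nursery._spawn(
                partial(async_fn, *args, **kwargs), **opts
            )

        start_soon.options = options
        return start_soon

class Nursery:
    start_soon = _StartSoonDescriptor()

    def _spawn(self, thunk, name=None):
        print("would spawn", thunk, "as", name)

# Nursery().start_soon.options(name="thing")(some_async_fn, 1, 2)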

@jab
Member

jab commented Jun 30, 2020

With the .options approach, could the docs mitigate the concern over )( by breaking up an initial example into multiple subexpressions?

custom_start_soon = nursery.start_soon.options(name="foo")
custom_start_soon(f, spam='eggs')

(Maybe the docs could then even briefly note that this can be combined into a 1-liner like nursery.start_soon.options(name="foo")(f, spam='eggs') if you're into that sort of thing.)

In the case of __keywords__, on the other hand, I can't think of any similar way to build up to their usage incrementally to reduce any "this looks weird" feelings.

@mentalisttraceur

@njsmith Yeah, that is much simpler. Nice. The only downside I can think of with your example implementation is that help(trio.run) will describe an instance of takes_callable? help in the Python REPL is a common way that I quickly reference documentation for libraries I am using. Unfortunately, after spending a few hours on it, there doesn't seem to be a comparably simple solution that gets nice docstrings everywhere - especially not while also preserving other I-presume-desirable properties like pickleability or introspectability.

I hear you on )(. It definitely takes an extra mental step, and I've worked with developers who I know would get confused at first the moment they see it. Part of me is tempted to just say that the way to solve that problem is teaching, not limiting ourselves. But I am also sympathetic that APIs are user interfaces and that unless there is good reason, user interfaces should be tuned to people as they are.

If the )( is ultimately decided to be too much of an issue, I second the suggestion by @smurfix to permit both - for example just put call = __call__ at the bottom of the takes_callable class.

@smurfix
Contributor

smurfix commented Jul 1, 2020

takes_callable could simply copy the wrapped function's docstring onto itself.

@mentalisttraceur

mentalisttraceur commented Jul 1, 2020

@smurfix Yeah that's a good start. However:

  1. Do we only want a good docstring on run, or also on run.options? Maybe even run.options(...)? (The former is easy enough as a one-liner docstring like f"""Options setter for {wrapped.__module__}.{wrapped.__qualname__}""". Someone else can probably think of a good simple one for the latter.)

  2. help shows more than just the docstring - for class instances it'll display the whole class, including all of its methods.

But my point was just that it takes extra lines for each additional bit of functionality - if we want the result of .options(...) to be pickleable we can't use lambdas so that's more lines, if we want the options to be programmatically accessible or visible in the docstring then that's extra lines, if we want the help of .options to include what it's a wrapper of rather than just generically that it's a wrapper and we can't use f-strings that's maybe extra lines, etc.

All doable, but when it's all done every variant I've thought of is a lot more lines than just the very nice and simple variant up above. But of course it becomes simpler if we just say "we don't care about it being pickleable", "we don't care about help on the .options showing a docstring that gives any information about what it's options for", etc.

@smurfix
Contributor

smurfix commented Jul 1, 2020

Prettying up the __doc__string and the help() output is certainly doable but, well, perfect is the enemy of good and all that.

No, I have to admit that I don't care at all for callables to be pickle-able. Doing that is rarely if ever good design. Besides, except for trio.run itself (and why would you want to pickle that?) they're all on classes which can't be pickled anyway.

@mentalisttraceur

mentalisttraceur commented Jul 1, 2020

Okay, if pickling is discarded, then I think a traditional function-style decorator is sufficient to cover all docstrings and help outputs nicely:

import functools
from functools import partial

def takes_callable(wrapped):
    @functools.wraps(wrapped)
    def wrapper(fn, /, *args, **kwargs):
        return wrapped(partial(fn, *args, **kwargs))

    name = f'{wrapped.__module__}.{wrapped.__qualname__}'

    def bind_options(**options):
        def call_with_options(fn, /, *args, **kwargs):
            return wrapped(partial(fn, *args, **kwargs), **options)

        call_with_options.__doc__ = f"Call {name} with options: {options}"
        # Enable `).call(` as an alternative to `)(`:
        call_with_options.call = call_with_options
        return call_with_options

    bind_options.__doc__ = f"Bind options to {name}"
    wrapper.options = bind_options
    return wrapper

Which still feels a little too complex, but it's the best I can do so far while

  1. retaining docstrings and nice help output on wrapped, .options and .options(...) and
  2. supporting .options(...).call(...) as an equivalent alternative to .options(...)(...).

@altendky
Member

https://docs.python.org/3/reference/lexical_analysis.html#reserved-classes-of-identifiers

Any use of __*__ names, in any context, that does not follow explicitly documented use, is subject to breakage without warning.

(pretty sure this doesn't even qualify as two cents but...)

@mentalisttraceur

@altendky I actually think this is very important to bring up! Up until now I thought Python just took a casual "we use dunder identifiers for internal stuff sometimes" position, but it sounds like Python is officially taking the Serious(tm) "we reserve all dunder identifiers for internal usage at any time" position.

@dhirschfeld
Member

A new PEP could be the best way forward in case anyone wants to comment:
https://github.com/python/peps/blob/master/pep-0637.txt

I guess trio wouldn't be able to use it for quite a while though...

@smurfix
Contributor

smurfix commented Sep 24, 2020

Well, we could simply alias __getitem__ to options, which would transparently allow both trio.run.options(option=value)(func, *args, **kw) and trio.run[option=value](func, *args, **kw) (when the PR is implemented, assuming that it will be).

+1 from me.

@smurfix
Contributor

smurfix commented Sep 24, 2020

python/peps#1615

@brettcannon
Contributor

PEP 637 was rejected, so that won't be an option.

@tiangolo
Member

I want to point out another thing to have in mind related to usability and user experience:

  • editor autocompletion
  • editor inline errors
  • mypy checks

Now that ParamSpec is accepted (and even supported by mypy), it allows a trick that enables all this autocompletion and tooling support for the arguments and return values of the functions passed to Trio functions. This means better tooling for user's code.

And since ParamSpec is also available in typing_extensions, all of this also works for current/older versions of Python (3.6+).
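
For readers who haven't seen the trick, here is an illustrative sketch of a ParamSpec-typed wrapper (the wrapper name is made up and this is not the PR's actual API; it assumes typing_extensions is installed):

from functools import partial
from typing import Callable, TypeVar

from typing_extensions import ParamSpec

import trio

P = ParamSpec("P")
T = TypeVar("T")

async def run_sync_in_thread(fn: Callable[P, T], *args: P.args, **kwargs: P.kwargs) -> T:
    # mypy and editors now check *args/**kwargs against fn's real signature
    return await trio.to_thread.run_sync(partial(fn, *args, **kwargs))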


I'm making a PR here (at least as a conversation starter/re-starter): #2208

It proposes adding three new alternative functions that just wrap the current ones and add these typing tricks.

As an example, here's how it could look when sending tasks to a worker thread (similar for nurseries; more examples in the PR):

  • Autocompletion:

    (screenshot)

  • In-editor inline errors:

    (screenshot)

  • Autocompletion for the result value retrieved from the worker thread:

    (screenshot)

  • Inline errors for that result value:

    (screenshot)

  • Mypy error detection:

    main.py:13: error: Unsupported operand types for + ("str" and "int")
    Found 1 error in 1 file (checked 1 source file)

@noahbkim

noahbkim commented Oct 3, 2022

Hi, I'm sorry if this is covered in the discussion above and I missed it, but why can't there be a simpler, underlying interface along the lines of:

def run_explicit(async_fn, args: Iterable[Any], kwargs: Dict[str, Any], clock=None, ...):

This introduces trivial overhead and would give a verbose option for developers who need the flexibility.

@oremanj
Member

oremanj commented Oct 3, 2022

run_explicit(fn, (1, 2), {"foo": 3}, clock=...) is more typing and harder to read than run(partial(fn, 1, 2, foo=3), clock=...), and would need to be replicated for every function in Trio that takes a function and arguments (there's a long list above).

@noahbkim

noahbkim commented Oct 3, 2022

Yes, that's why you'd still offer run() as-is (but have it pass its args down to run_explicit). Partial is a no-go in caching scenarios I'm currently dealing with because of the inline/anonymous functions. I think this would be a very straightforward change and would be willing to write the PR.

@oremanj
Member

oremanj commented Oct 3, 2022

It increases the API surface to little benefit; I don't think there would be appetite for the change. I don't understand what's causing problems for you about partial() though, maybe you could elaborate on the problems you're seeing?

@smurfix
Contributor

smurfix commented Oct 3, 2022

@oremanj Yes, partial is bad for you if you do any caching.

>>> from functools import partial
>>> a=partial(int,1.23)
>>> b=partial(int,1.23)
>>> a==b
False
>>> hash(a)
8775412814713
>>> hash(b)
8775412814723
>>> 
>>> a=(int,1.23)
>>> b=(int,1.23)
>>> a==b
True
>>> 

@noahbkim did I understand the problem correctly?

I do wonder whether there's a rationale for partial not offering reasonable __hash__ and __eq__ built-ins, other than "we didn't think of that" …

@oremanj
Member

oremanj commented Oct 3, 2022

I'm not really a fan of complicating Trio's API to support caching use cases that I think will be pretty uncommon, especially since it's easy to write run_explicit yourself.

@arthur-tacca
Contributor

@smurfix

I do wonder whether there's a rationale for partial not offering reasonable __hash__ and __eq__ built-ins, other than "we didn't think of that" …

There's a surprisingly good reason, described in CPython issue 3564: if partial objects tested for equality by comparing func, args and keywords then you could end up with a situation where two partial objects compare as equal but give different results in practice so really ought to be compared as different. That's because a function can tell the difference between e.g. 1 and 1.0 even though they compare as equal. For a more extreme example, consider partial(id, x) and partial(id, y) where x and y are different objects that compare equal.
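
A quick way to see the point in a REPL (using type and id as the discriminating functions):

from functools import partial

x, y = 1, 1.0
print(x == y)                                    # True: the arguments compare equal
print(partial(type, x)() == partial(type, y)())  # False: the calls do not agree

a, b = [], []                                    # distinct objects that compare equal
print(partial(id, a)() == partial(id, b)())      # False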

@rafalkrupinski

I'd love to see this implemented the way @tiangolo suggested in #2208 .

Properly type hinting partial is much more difficult than "inheriting" type hints from the wrapped function
