
[Composability] Run Queue and Composable API #433

Closed
SBoudrias opened this issue Dec 18, 2013 · 20 comments

@SBoudrias
Member

This subject was previously brought up in #345 (comment) and in the PRD document.

The idea is to run every Generator method inside a loop so that different Generators can run their actions in harmony. For example, all prompts run first, then the writing, then the install.

Suggested API

If we keep the current API

I'd suggest extending the function prototype to add a helper method.

function Generator() {
    // Constructor, so it runs from the start.
}
util.inherits(Generator, yo.generators.Base);

Generator.prototype.ask = function() {
    var done = this.async();
    this.prompt({}, done); // etc...
}.queueIn("prompt");

Generator.prototype.router = function() {
    this.dest.write("app/router.js");
    // etc
}.queueIn("write");

Generator.prototype.install = function() {
    this.installDependencies();
    // etc
}.queueIn("end");

With a new API

var Generator = yo.generators.Composable.extend({
    // the constructor
    initialize: function() {},

    prompt: { // ran in the prompt queue
        ask: function() {}
    },

    write: { // ran in the write queue
        router: function() {},
        controller: function() {}
    },
    end: { // ran in the end queue
        install: function() {}
    },

    // ran in the default queue
    otherOne: function() {}
});

I kind of prefer this way of inheritance, using a method on the constructor.

Though, I wonder if using the queue name as a key is obvious enough, or if it would bring conflicts (though we could easily throw a relevant error when someone overwrites a reserved name...). Another option could be to nest the queues deeper, e.g. this.queue.prompt.taskName.

What about backward compatibility?

If we're not changing the API, we can simply push all old generator tasks into the default queue.

If we're changing the API, then we have a way to detect our current API and can offer a shim runner calling the #run method.

Default actions

I believe the Generator should always queue the #installDependencies task without asking the user to do it. (This way we can also make sure we only run the install commands once)

I also think we can automatically schedule writes to the Gruntfile.js if we abstract its AST away.

Actual work

I created a small Node module for creating grouped queues with different priorities. I really liked the backburner API, but it wasn't designed for Node - plus queue systems are really easy to implement in a few lines of code, so...

@addyosmani
Member

Thank you for putting together your thoughts on an API for this. It's massively useful in continuing the conversation.

My two cents:

  • I really prefer the newly proposed API, but I also feel it's important (for a while) to maintain backwards compatibility
  • "If we're not changing the API, we can simply push all old generator tasks into the default queue" +1, and "I believe the Generator should always queue the #installDependencies task without asking the user to do it" +1 too
  • Would it be feasible for us to extend the prototype with a helper for 1.x, but provide public docs with the new API and inform developers that it's what they should expect to use in Yeoman 1.5/2.0? This provides enough notice for a backwards compatibility break but doesn't slow down our ecosystem growth in the short term.

@SBoudrias
Member Author

I feel .extend really is simple syntactic sugar over inheritance. So I'm pretty confident any of these changes can maintain 100% backwards compatibility, as long as we default every non-specifically-queued task to the default queue.

I also feel we can bring each of these pieces into play progressively. I think just these simple changes would allow a user to run (for example) yo angular, then yo mocha on the same project with more ease. It would also make it easier for a generator (generator-angular) to call another generator (generator-gruntfile), making the experience/flow much better for the end user.

Then, when we make a decision on the composability API, we can bring it into play more easily with these building pieces already in place.

@addyosmani
Member

If we can maintain backwards compat. with .extend, the newer API is definitely preferable. I like the idea of non-specifically queued tasks ending up on the default queue :)

@addyosmani
Member

@SBoudrias could you talk about some of the practical work we would need to do to implement composability? I've played around with your queuing system and can see where it could fit in. So far it looks like:

  • Implement a queuing system for prompt/write/end
  • Implement the Gruntfile API helper
  • Implement a prototype of generator-webapp based on composable pieces

Is there anything else?

@SBoudrias
Member Author

Yeah, so I see it in two big sections.

The first one is allowing generators to be run together. This includes the current proposals: the queue system (allowing generators to run side by side rather than one after the other) and the Gruntfile API (sharing control of common files). This first step should allow one generator to request another one programmatically (composable pieces).

The second step is to allow users to combine unrelated generators. This one is trickier; currently there's the option of passing generator intents, or maybe just theorizing on possible combinations and creating mostly user-based control (generator authors wouldn't need to consider combinations much when writing their generators). This would also require ways of scoping generators to parts of the system (basically file paths). But this is still blurry to me; it requires more reflection.

On the architecture side, I think all this would need to be coordinated by a global runner object. The current generator#run method would delegate the execution logic to the runner (keeping use of the run method will ensure backwards compatibility). This means the runner will be able to make sure each generator is only run once (in case two generators each use the same common generator). It also means the runner could be handled by the Environment, or by a tool like yo, to inject any generator into the run stack (thus resolving parts of the second step to composability).

For now this is how I see it. One unresolved question is how we handle command line options and the options of programmatically called generators. This one is hard because currently options are linked to the env, which is probably a bad default anyway. So, yeah, this needs rethinking. Basically, how can a generator know which options are meant for it? IMO, command line option parsing should be handled by the UI layer (in most cases yo) and passed to generators via method parameters.

@SBoudrias
Member Author

Just coming back on some thoughts: we may not be able to use the generator#run method, as this will cause compatibility issues if a new generator composes (hookFor, call run, etc.) with an old generator. I don't think there are huge issues with backwards compatibility, but this may be a friction point during implementation. Unrelated to architecture IMO, as it's mostly implementation, but it's worth mentioning.

@SBoudrias
Member Author

Implement a queuing system for prompt/write/end

About this, I'd like to copy-paste the initial queue list proposed in the PRD (which needs work and thought):

  1. Setup
  2. Prompt
  3. Configuration
  4. Writing
  5. Conflicts
  6. End

@smackesey
Contributor

The second step is to allow user to combine unrelated generator.

It seems to me that maximum composability will come from having generators operate on a formal representation of a directory tree. What if the core of a generator were a series of functions that either create such a tree or mutate an existing tree? Each generator could perform arbitrary transformations on the tree, as well as queue "instantiator" functions on target nodes. The tree could be "grown" at the end by traversing it and executing all the instantiators on each node (or falling back to a default, like reading a template file, if there are no instantiators); this would produce strings containing the full file content at each file node. At the very end, the tree could be written to disk with another traversal that creates a directory or file for each node.

This system is a good solution for giving multiple generators control over the same file. For example, a distributed Gruntfile specification across many generators could be implemented by having each one queue instantiators that add specific configuration to the Gruntfile node.

Another advantage of this is that it eases and speeds up testing by avoiding unnecessary filesystem transactions. @SBoudrias recently posted testing guidelines (#449) specifying that beforeEach should be used over before; with a large test suite running many different generator configurations, this could end up being pretty slow. If a tree only needs to be generated in memory, rather than on the filesystem, this is much less of a problem.

@SBoudrias
Member Author

@smackesey I'm not sure this would be manageable performance-wise (it would require a lot of memory). And I'm not sure I see how this would help compose generators. Either way, if you'd like to build on this idea, please use #345, as your point is not directly related to using a queue system to allow generators to run in "parallel" (~kinda) rather than in series.

@SBoudrias
Member Author

On the queue subject, here's the list of queue groups I see (first run first):

  • initialize: Get general information, parse options, set up configs, start running generators you extend or compose with (basically, an alternative to putting everything in the constructor).
  • prompt: Time to use this.prompt to query user choices/preferences
  • configuration: Configure the project and filters depending on user choices. Also time to scaffold project-wide configuration files (.editorconfig, etc.).
  • default: Any task not registered anywhere in particular.
  • writing: Scaffold the required files.
  • conflicts: Used internally to handle conflicts.
  • install: Used internally to install npm and Bower packages.
  • end: Last run loop hook. Log confirmation messages, give getting-started feedback, etc.

@addyosmani
Member

The above list looks good to me. Could we s/writing/write, or would that cause conflicts elsewhere? I also had a question about initialize: what type of general information would we be trying to obtain at that point, beyond the generators you want to extend/compose? git usernames, etc., or something else?

@addyosmani
Member

Another question :) Do we see ourselves wanting pre/post hooks for the queue groups? E.g. supporting post-initialize, post-write, post-install, etc., or would these just cause unnecessary overhead?

@SBoudrias
Member Author

Could we s/writing/write or would that cause conflicts elsewhere?

Not sure I understand...

I also had a question about initialize - what type of general information would we be trying to obtain at that point beyond the generators you want to extend/compose? git usernames etc or something else?

I see it mostly used by subgenerators to read package.json, bower.json, .editorconfig or any other relevant config file.

Do we see ourselves wanting pre/post hooks for the queue group? e.g supporting post-initialize, post-write, post-install etc or would these just cause unnecessary overhead?

I believe this is overhead, but we can add them later very easily if it's something people need.

@JohnMorales

I see a couple of usability problems with prompts here:

  1. duplicate questions
  2. complex questions (questions dependent on the answer of another, and ordering)

This could possibly be addressed with the ability to see what has been asked, but it would require a common vocabulary.

@SBoudrias
Member Author

@JohnMorales I think composed generators should be focused enough that features don't overlap.

Although, this is a valid comment for the long run. When we add support for user-defined composition, we may need to disable some run calls, or limit each generator namespace to being run only once.

@robwierzbowski

On duplicate questions: generators may need to read and act on previous generators' prompt answers. For example: an early "set up your dev environment" generator offers a choice between CSS and Sass and lets the user specify directories; a later generator automatically knows whether to add Sass or CSS boilerplate, and where to put it.

@robwierzbowski

Love API number 2, by the way.

@floatdrop
Contributor

It's a bit late, but prompt is used to call inquirer for user interactions, for example in generator-generator - and I got a funny error message without a stack trace when I tried to migrate to 0.17.0-pre.1:

path.js:299
    return splitPathRe.exec(filename).slice(1);
                       ^
RangeError: Maximum call stack size exceeded

Maybe it's time to deprecate this.require, as this comment says? Hope this helps someone.

@SBoudrias
Member Author

May be it's time to deprecate this.require as this comment says?

What?

and I got a funny error message without a stack trace when I tried to migrate to 0.17.0-pre.1:

It is a prerelease; errors and bugs are expected. Report details (and open new issues) so we can fix them.

@floatdrop
Contributor

What?

Sorry, wrong link. This comment confused me before I understood what was wrong with this.prompt. I opened a new issue for the migration problems.
