This repository has been archived by the owner on Sep 2, 2023. It is now read-only.

Proposal: Support loading package by own "name" #306

Closed

guybedford opened this issue Apr 4, 2019 · 39 comments

Comments

@guybedford
Contributor

This has been discussed before, but I think it's a good time to consider it now.

A very common problem in Node.js applications is the backtracking problem where you have to always reference modules through ../../components/name.js.

Since we're loading the package.json when checking the module boundary, we thus also know the package.json "name" that any given module is in.

What we could do is support loading a package by its own package.json "name" by default.

This would allow conventions where users just set a package.json name and can then use import "pkgname/components/name.js" instead.
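
For illustration, a minimal sketch of the proposed convention (the package name and file layout here are hypothetical):

// package.json
{
  "name": "pkgname"
}

// src/views/page.js — today:
import { component } from '../../components/name.js';

// src/views/page.js — with self-reference by package "name":
import { component } from 'pkgname/components/name.js';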

It's a very small feature that seems like it could solve a pain point quite elegantly.

Would be nice to discuss this further at the next meeting.

guybedford added the modules-agenda (To be discussed in a meeting) label on Apr 4, 2019
@ljharb
Member

ljharb commented Apr 4, 2019

It'd be great to provide this feature - in both CJS and ESM (i wouldn't want it in only one of them) - but this would conflict with someone manually putting a folder in node_modules with their same package name. I think we'd need a prefix that couldn't possibly be already valid - like ~ or something.

@devsnek
Member

devsnek commented Apr 4, 2019

definitely a cool idea, but I think it's out of scope of this group

@GeoffreyBooth
Member

definitely a cool idea, but I think it’s out of scope of this group

Why? Resolution is part of modules.

@ljharb
Member

ljharb commented Apr 5, 2019

Modules should have the same resolution as CJS, or at least a subset - but not a superset. This is a feature that applies to CJS as well, and it would be a bad idea to add it to ESM without also adding it to CJS.

@robpalme
Contributor

robpalme commented Apr 5, 2019

@ljharb You've mentioned this approach a few times now. What is the rationale for keeping CJS up to date with the ESM features?

It seems odd to try to update the old system to keep feature parity with features of the new system. Potentially it sends mixed messages about the future endorsed practices and places an ongoing tax on the design space.

@ljharb
Member

ljharb commented Apr 5, 2019

CJS isn’t legacy. It’s not “the old system”. It’s one of two first-class module systems that node will be supporting for likely the next decade, if not forever.

Hobbling CJS is not an effective means of encouraging adoption of ESM - and any features that only work in ESM will make migration and interop harder.

@bmeck
Member

bmeck commented Apr 5, 2019 via email

@jkrems
Contributor

jkrems commented Apr 5, 2019

Saying that the import system must be a strict subset of the require system seems to hobble the former. I do agree that both are and will for the foreseeable future be first-class module systems but both will necessarily have unique features based on their properties and constraints. Just given that require is used in the ecosystem with the very specific semantics it has today means that there's a tiny design space. Accepting that as a constraint for import means import will always be a worse require.

@devsnek
Member

devsnek commented Apr 5, 2019

re my comment, i think 1) both systems can get use from this, but 2) if one doesn't have it, it would be confusing for the other to have it. additionally, if we added it and cjs wanted to support it, we would want to make sure the design was good for both systems. This is why i think the discussion should take place upstream.

@SMotaal

SMotaal commented Apr 9, 2019

I think it is fair to note that if we design ESM to meet the requirements of CJS, then we will inevitably be forced to limit ESM's potential — if people don't like a limited ESM they will not use it.

CJS has a legacy; if people want to see it do new things, that is a good thing, because Node.js alone can decide on any of its design aspects.

ESM has specs; if people want to see it do new things, they need to make sure those things conform to the specs, and we've witnessed first-hand the complexity of making that happen in a controlled and manageable way.

So can we say that,

we would want to make sure the design was good for both systems

is satisfied if a custom CJS loader can achieve it — then maybe implementing in CJS follows after it proves worthwhile?

@ljharb
Member

ljharb commented Apr 9, 2019

In this case it’s a node-only concern, since ESM has no concept of packages - so no, i don’t think a package-related feature should be built without doing it for both cjs and ESM, where it makes sense (and in this case it makes sense; this is a big existing pain point for cjs in the ecosystem).

@devsnek
Member

devsnek commented Apr 9, 2019

so people are concerned that in trying to bring this feature to both systems, it will be made worse? if that is the case, I would only say you have no way of knowing that's the case, and that users of both systems would appreciate this feature. I'm not saying every feature has to be in both systems, but this one seems reasonable for both.

@guybedford
Contributor Author

guybedford commented Apr 9, 2019

@devsnek
Member

devsnek commented Apr 9, 2019

@guybedford

your proposal seems to be just as viable for cjs as for esm. the dependencies issue listed under cjs affects both.

@guybedford
Contributor Author

@devsnek right, but esm is experimental and can permit breaking changes, while CommonJS cannot.

@ljharb
Member

ljharb commented Apr 9, 2019

Just because ESM permits them doesn’t mean it’s wise to make avoidable ones.

If we find a way that works for CJS, then it ofc works for ESM, so why wouldn’t we want to exhaust that option before adding more deviation between the two primary module systems node will be supporting simultaneously for the next decade.

@devsnek
Member

devsnek commented Apr 9, 2019

just to clarify, adding this field is semver-minor even when taking dependency clashes into account, because it's opt-in.

@SMotaal

SMotaal commented Apr 9, 2019

@ljharb I'm not implying we should exclude CJS - merely that we test favourability in the more flexible purview open to ESM today, and once refinements are addressed (or the proposed solution is deemed too breaking and subsequently dropped altogether), make the necessary efforts in the more widely used CJS space.

Purely from an evolutionary sense is all :)

@ljharb
Member

ljharb commented Apr 10, 2019

@SMotaal if we've identified a workable plan for both, then experimenting with it in ESM would be fine, although I'm not sure what that would solve - this is a stipulated pain point that imo needs no proof, and the suggested solution is what the ecosystem already does in spades, via babel, symlinks, webpack aliases, typescript project references, etc.

@jkrems
Contributor

jkrems commented Aug 26, 2019

A potential solution using the semi-popular ~ prefix that should be backwards compatible for CJS and implementable for ESM: nodejs/node#29327
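
Purely as an illustration of what that could look like in source (the exact syntax and semantics are defined in the linked PR, not here), a ~-prefixed specifier resolving to the package root would let a deeply nested file avoid backtracking:

// hypothetical usage of a ~ package-root prefix (illustrative only)
import { component } from '~/components/name.js';

// instead of:
import { component } from '../../../components/name.js';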

@Jamesernator

Jamesernator commented Aug 27, 2019

My feeling is that this should probably be configurable. I for one would be interested in just using / for package root similar to / in web browsers.

Given that such configuration would basically just be import maps for a single entry, I'd probably suggest just implementing import maps.

EDIT: Actually I was confused about the association direction of import maps, so maybe instead just a field: "packageRootName": "/", "packageRootName": "~", or "packageRootName": true /* use the "name" field; this would be the default if possible, otherwise false would need to be the default */.
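
A rough sketch of how such a field might sit in package.json — "packageRootName" is the hypothetical field floated above, not an implemented option; the value would be "/", "~", or true to reuse the "name" field:

{
  "packageRootName": "~"
}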

@guybedford
Contributor Author

There is already an imports field proposal at https://github.com/jkrems/proposal-pkg-exports#2-imports-field which would in theory allow configuring this sort of package-relative loading, something like:

{
  "imports": {
    "#myroot/": "./"
  }
}
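
Under the semantics sketched in that proposal, a module inside the package could then reference package-relative paths through the mapped prefix, e.g.:

// resolves via the "#myroot/": "./" mapping above (illustrative)
import { component } from '#myroot/components/name.js';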

@jkrems
Contributor

jkrems commented Aug 27, 2019

My feeling is that this should probably be configurable. I for one would be interested in just using / for package root similar to / in web browsers.

The problem with / is that it wouldn't be compatible with web browsers unless you can assume that the package would be served from the top-level path of the URL. Import maps may be able to fix it up but it would be super fragile since / is one of the allowed normal (non-bare) specifiers.

@jkrems
Contributor

jkrems commented Aug 27, 2019

My feeling is that this should probably be configurable.

Skipped over this part: I would prefer if this kind of indirection wouldn't require configuration because it already adds additional indirection and complexity to reading code as-is. If every project picks their own way to express it, this would make it unnecessarily hard to understand what's going on imo.

@ljharb
Member

ljharb commented Aug 27, 2019

I highly disagree this should be configurable; that adds a ton of complexity to every resolver out there, including node’s - let alone the conceptual complexity of every codebase either not being able to use an alias out of the box, or using different ones.

We need to pick one default, always-on, same-for-everyone alias that's part of the specifier, so that it's unambiguous, static (no more package.json interaction), and universally consistent.

@Fishrock123
Contributor

Fishrock123 commented Aug 27, 2019

I feel like if it is necessary Node.js can add some separate API for this, but if it for some reason needs to be within import statements then I strongly feel this should go to TC39, not us. (Although we may be able to make a recommendation.)

@ljharb
Member

ljharb commented Aug 27, 2019

TC39 doesn’t have any relevance here; there’s no language concept of a “package” and the spec has gone to great lengths to impose no restrictions on specifier contents.

@jkrems
Contributor

jkrems commented Aug 27, 2019

@Fishrock123 From a web-perspective, this would be something that a dev (or their toolchain) would write into an import map so it's a "solved problem". As @ljharb said, the signal has been consistently that they don't want to be in the business of telling people what kinds of patterns to use for "application-level logic" like this.

@justinfagnani

One thing to consider here is how import maps will be generated.

Will self-referencing packages have to indicate so in package.json, or will import map generators have to read source to see if they need to add a scope and "~" entry to the package, or is the assumption that import map generators should add a scope and "~" entry to every package?
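
For a sense of what the "add it to every package" option implies, a generator might emit something like the following per package (assuming a hypothetical package pkgname served from /node_modules/pkgname/; the exact shape is an assumption, not a settled design):

{
  "scopes": {
    "/node_modules/pkgname/": {
      "~/": "/node_modules/pkgname/"
    }
  }
}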

@jkrems
Contributor

jkrems commented Aug 27, 2019

@justinfagnani I think either "add it by default" or "have a signal to add" seem reasonable. If we want to be conservative, we could say that using exports turns this feature on..? You have a point that enabling it by default would mean two additional entries in the import map for every package, no matter what.

@ljharb
Member

ljharb commented Aug 27, 2019

I don't think using exports should be required to use this feature; I have packages where I intentionally want all paths on the filesystem to be reachable, but that doesn't mean this convenience feature isn't useful to me.

@jkrems
Contributor

jkrems commented Aug 27, 2019

That's totally fair. I was trying to come up with a solution that doesn't require yet-another package.json property. Also, you can use "exports": { "./": "./" } if you want to export everything but this may be more wordy than "allowSelfImport": true, depending on what that field would be.
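
Side by side, the two shapes mentioned above might look like this in package.json (note that "allowSelfImport" is just the hypothetical field name from this comment, not an implemented option). Exporting everything explicitly:

{
  "exports": { "./": "./" }
}

versus a dedicated flag:

{
  "allowSelfImport": true
}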

@SMotaal

SMotaal commented Aug 29, 2019

An argument can be made here for this being the abstract ... relative path signifier (regardless of the actual form).

Just like the current (.) and parent (..) paths:

  • it is unambiguous in a module resolver context.
  • it can be enforced deterministically.
  • it is not an actual relative path/URL construct.

But there are more considerations to keep in mind here:

  • The fact that this is not valid in a URL can have pros/cons; I think that this being viewed as opt-in resolver behaviour likely leads to needless complexities and unpredictable outcomes across the board, more so from a portability standpoint.

  • When we think of module resolution, we must also consider what it takes for one to safely wrap fetch and fs.… at runtime to reference related assets.

My personal take:

  • Using the bare "name" means:

    • Technically, it requires honouring a package scope mapping for the main entry's scope (ie { [mainEntryPackageName]: new URL('./', mainEntryURL) }) in exports, and that applies just the same for import maps.

      1. There is the case where the package.json is not where it needs to be, in which case it throws only if a same-named package does not silently get in the way.

      2. There is also the case of the package being installed under a different name… I am not sure how to address that here myself, but I feel it is not too complicated to work out based on previous discussions.

    • A developer will need to search and replace if they change the name — they are on the clock and are in a position to shoulder the more thorough burdens of this.

  • Using a special signifier means:

    • Technically, it requires a more straightforward package scope mapping, but it complicates portability (ie import maps being just one example).

      1. There is the case where the package.json is not where it needs to be, and now we are potentially silently hitting something in a parent path.

      2. Also, mono-repo (specifically yarn-style) packages… not to dwell on the details.

    • A developer will not need to search and replace, but will likely not be able to search effectively if they needed to (ie for debugging) — just think of how far downstream we are pushing the burden of catching a missed test/edge case.

Recommendation:

I would personally argue for the bare "name", but for this to work on the Node.js side, and especially for it to be reliable for tooling, we need a more concrete and visible "package scope" concept and definitions — so end users could leverage such tooling to opt for more stylistic conventions and/or alternative mapping strategies, potentially even loader hooks, where traceability would (or at least could) be a lot more fine-grained.

@ljharb
Member

ljharb commented Aug 29, 2019

Repeating the name of the package - and renaming it in a bunch of files when a package is renamed - is a tax that shouldn’t be necessary.

@jkrems
Contributor

jkrems commented Aug 29, 2019

@SMotaal Unfortunately I don't think ... is any different from any other prefix. ... is a perfectly valid directory name on Linux and it is also a valid path segment in a URL (https://foo/.../bar is a URL with the literal segment ...). Afaik the only things that can't be valid node_modules/* directory names are strings containing a null byte, and exactly . or .. — everything else can be a directory on at least some platforms (which matters if we want to support CJS without making breaking changes).

I wouldn't want to get into the business of trying to verify the package.json name. Especially since that name can be pretty long, so the UX would be rather poor.

@SMotaal

SMotaal commented Aug 31, 2019

@jkrems… Absolutely, my ... here is strictly for abstracting a ‹signifier meeting specified criteria› but my leaning is towards using name.

If a release of some deeply nested high-demand dependency somehow breaks here…

  1. Using "name" means that no matter how deep the nesting and not matter how seasoned the enduser, they will likely end up filing an issue against the repo itself because traces will clearly indicate a missing dependency "name/…".

  2. Using ‹…› means that even seasoned endusers will likely not see this coming, let alone known how to find the culprit, and if there is even one or that is an unrelated bug, because traces will not related to a known missing dependency by "name".

If we are considering a signifier to be something that will reside in code that is deployed as third-party dependencies (say through NPM) to the end user's app, and where disambiguation of this shared signifier is solely based on the out-of-band resolution process of package.json files that can be expected to creep into parent paths, I am hesitant to simply compare this being handled directly in Node.js to existing tooling which applies to packages in code bases that will still be built (ie bundled or rewritten to replace the signifier with the actual package name).

@SMotaal

SMotaal commented Aug 31, 2019

On second thought, circumventing parent paths up to the first "node_modules/*" potentially addresses the concern about creep for published packages, specifically.

But can we say for certain that resolving the signifier is always going to be clear cut, or do things like symbolic links, flags… etc. leave potential for complicated edge cases?

@SMotaal

SMotaal commented Aug 31, 2019

I wouldn't want to get into the business of trying to verify the package.json name. Especially since that name can be pretty long, so the UX would be rather poor.

This being runtime behaviour, it is safe to assume that a "package scope" will already be tracked at some point prior to the possibility of a "scoped" module making this kind of base reference (ie either the bare "name" or a signifier).

@GeoffreyBooth
Member

Landed via nodejs/node#29327
