
suggestion: packages #288

Closed
dharmax opened this issue Jun 24, 2018 · 19 comments

Comments

@dharmax

dharmax commented Jun 24, 2018

I think package.json shouldn't be deprecated, but changed: the current proposal suggests full URLs in the imports; I suggest the full URLs reside in the package.json, along with the version, with a semantic short name for each imported package, so it would be easier to refer to it from the multitude of project files. I'd also suggest an option to provide a repository short name in the package.json, to make the URLs more readable.

Example:

{
  "repositories": {
    "kuku": "https://www.github.com/kuku/"
  },
  "dependencies": {
    "matrices": "@kuku/matrix-utils@~2.0.0"
  }
}

The usage is simple:

my-code.ts:

import {multiply} from 'matrices' // or perhaps  import {multiply} from '@matrices' 
...
...
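
To make the proposal concrete, a resolver for this layout could work roughly like the sketch below; the function, its types, and the error handling are illustrative only, not a proposed deno API:

// resolve.ts - hypothetical sketch: expand a bare specifier such as 'matrices'
// into a full URL using the maps from the package.json above.

interface PackageConfig {
  repositories: Record<string, string>;  // repo short name -> base URL
  dependencies: Record<string, string>;  // import name -> '@repo/path@semver'
}

function resolveBareSpecifier(config: PackageConfig, specifier: string): string {
  const entry = config.dependencies[specifier];
  if (!entry) throw new Error(`unknown dependency: ${specifier}`);
  // '@kuku/matrix-utils@~2.0.0' -> repo 'kuku', path 'matrix-utils', range '~2.0.0'
  const match = entry.match(/^@([^/]+)\/(.+)@(.+)$/);
  if (!match) throw new Error(`malformed dependency entry: ${entry}`);
  const [, repo, path, range] = match;
  const base = config.repositories[repo];
  if (!base) throw new Error(`unknown repository: ${repo}`);
  return base + path + '@' + range;  // e.g. https://www.github.com/kuku/matrix-utils@~2.0.0
}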

@heineiuo

heineiuo commented Jun 25, 2018

It's cute.

IMO, I'd write a modules.ts:

export = {
    moduleA: 'https://unpkg.com/module-a/0.0.1/module-a.ts',
    // other modules
}

and use it in another module:

import { moduleA } from './modules.ts'

// do something 

@cgerling

cgerling commented Jun 25, 2018

@heineiuo This way you have to do two imports to actually import the module, right?

import { moduleA as moduleASrc } from './modules.ts'

const moduleA = await import(moduleASrc)

// do something

@heineiuo

heineiuo commented Jun 25, 2018

@calvingerling Sorry, I forgot that... I'd prefer to use it like this:

export = {
  moduleA:  await import('https://unpkg.com/module-a/0.0.1/module-a.ts') // with top-level await support
}

or

import moduleA from 'https://unpkg.com/module-a/0.0.1/module-a.ts'
const dependencies = {
  moduleA
}
export = dependencies

@qti3e
Member

qti3e commented Jun 25, 2018

Duplicate of #8
Also, I'm against loading packages from GitHub repositories; it makes the whole module resolution algorithm very complicated (it needs to clone the repository and then figure out which file it should load from the directory, so it re-invents a new package.json), and one of Deno's goals is to simplify the module system, not make it worse.

@ghost

ghost commented Jun 25, 2018

Also, I'm against loading packages from GitHub repositories,

I completely agree with this.

I also don't like the JSON syntax proposed by @dharmax; I'd rather use a .ts file, like @heineiuo said.

@dharmax
Author

dharmax commented Jun 26, 2018

  1. @qti3e and @IngoW - the GitHub repository was just for the sake of example, of course.
  2. Also, I don't like the propositions where the semver is just a plain part of the URL path. It has a strong semantic meaning, and in your propositions that meaning is weakened. Hence my use of "@" in the URLs.
  3. The suggestions based on using export are interesting; however, again, if I understand correctly, they aim to remove a standardized additional artifact for module management, such as package.json, right? If so, I think I'm against it. I find it useful, in more than one respect, to have such an artifact. IMHO, it is easier to see and manage dependencies when you have one, rather than having each of the possibly hundreds of little files in a project hide smaller or bigger dependencies all over, especially when they can mistakenly refer to different versions of them.

@kitsonk
Contributor

kitsonk commented Jun 26, 2018

My opinion is that deno should be unopinionated about the source of a package or module; it should be able to retrieve modules from a FQDN and continue to resolve relative modules imported by the module. Ideally there should be "zero" configuration. If importing and "packaging" modules in userland becomes a thing, deno should not get involved in it or make or assert information about it. Any assumptions around the configuration of module loading should align strictly with the WHATWG work on module loading, which is currently stalled. If no standards are adopted in the browser around module loading, then deno should not implement them.

As far as semver goes, again, that is something that can and should be resolved externally to deno. I have a prototype of a module/package server which could be used with deno and operates a bit like unpkg does. Right now it is called pkgsrv and the README explains how it serves up packages. When deno gets back to being able to load modules from a URL, I will do some more experimentation and refinement, but I would hope the goal continues to be for deno not to inject any magic into how modules are resolved, unless there is an agreed and approved standard.
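
To illustrate the zero-configuration style described above, here is a sketch in which everything is driven by the URL itself (the host name, module layout, and multiply function are made up for illustration):

// main.ts - the entry module is fetched from a fully qualified URL.
import { multiply } from 'https://pkgsrv.example.com/matrix-utils@2.0.0/mod.ts';

// Inside mod.ts, a relative import such as
//   import { dot } from './dot.ts'
// resolves against the same base URL, i.e.
//   https://pkgsrv.example.com/matrix-utils@2.0.0/dot.ts
// so no local configuration file is involved.

console.log(multiply([[1, 0], [0, 1]], [[2], [3]]));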

@flying-sheep
Contributor

flying-sheep commented Jun 26, 2018

@dharmax I think you felt a need for convenience but jumped to a familiar solution instead of searching for a good one.

Ryan explicitly stated that package.json was a mistake in his talk introducing deno (which is linked in the README), and I agree with him on that point.

I think we should try using the minimal solution we have and build convenience for ourselves (e.g. a modules.ts/external.ts module importing everything third-party, as suggested above). Only if that isn't good enough should we start piling on complexity.
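
A minimal sketch of that convenience pattern, with placeholder URLs and export names:

// external.ts - one userland module that re-exports every third-party dependency,
// so the rest of the project never repeats a full URL or version.
export { multiply } from 'https://unpkg.com/matrix-utils@2.0.0/mod.ts';
export { encode, decode } from 'https://unpkg.com/some-codec@1.1.0/mod.ts';

// my-code.ts - application code imports from the local module instead.
import { multiply } from './external.ts';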

@Lcfvs

Lcfvs commented Jun 26, 2018

IMHO, loading each file separately may cause some problems. For example, I made a library which contains more than 200 submodules.

Loading them independently implies more than 200 HTTP calls... it's really a lot... for only one library, without dependencies.

The ability to load some modules from different versions can imply consistency problems and increases the risk of having some unavailable submodules (because the repository hosts can be different).

@MarkTiedemann
Contributor

I think we should try using the minimal solution we have (...) [a]nd only if that isn’t good enough we should start piling on complexity.

Packages are inevitable, I'd say.

This being the case, the complexity should not be avoided: I think complexity should only be avoided when it's not clear whether the minimal solution will be enough. Since we know this complexity will become a thing, we should not avoid it but rather find good solutions for managing it as soon as possible.

If importing and "packaging" modules in userland becomes a thing, deno should not get involved in that or make or assert information about that.

Again, I think it's fair to say that, sooner or later, importing and packaging modules in userland will become a thing.

If deno does not get involved in that, this may lead to an ecosystem that is initially fragmented and riddled with technical debt.

@kitsonk
Contributor

kitsonk commented Jun 26, 2018

find good solutions for managing it as soon as possible

The problem is that "good solutions" is a relative term depending on what your needs are. Pre-solving problems before you understand what the problem is is folly.

this may lead to an eco-system that is initially fragmented and ridden by technical debt

The whole reason we barely have the ability to load ES Modules in browsers today is that the problem is bigger than anyone assumed. The solution in browser userland is to use tools like webpack and rollup. Node.js has been trying to solve the problem and still doesn't have a fully clear solution that doesn't break userland (.mjs is a partial solution that no one is happy with).

Let me clarify: I am not saying Deno shouldn't try to solve this problem, but it shouldn't try to solve this problem alone. Deno participating in the WHATWG, where the responsibility sits, to agree on something that works for browserland and serverland, even if Node.js can't or won't support it, is the best-case scenario. If Deno doesn't feel it can actively participate, then it shouldn't do anything until a standard is agreed upon.

We clearly needed something when CommonJS came around, and admittedly the standards bodies weren't as collaborative and inclusive as they are these days, but CommonJS didn't want to consider the complexity of browserland, and so AMD had to spin out from that. That split led directly to a huge amount of technical debt in userland which we all still experience today. If Deno lets that happen again, it either will isolate itself or it will create another island of fragmentation.

One of the easiest ways to avoid that is to allow userland to iterate, much like RequireJS, Browserify, webpack, SystemJS and Rollup have until there is a clear path which can be implemented in the runtime.

@flying-sheep
Contributor

Load them independently implies more than 200 http calls... its really a lot... for only one library, without dependencies.

HTTP/2 fixes that.

@trotyl

trotyl commented Jun 27, 2018

@flying-sheep HTTP/2 won't fix it. Knowing what the dependencies are requires parsing the current module, so it's not really parallel, and the dependency depth would still impact performance.

Or you'd need a dedicated web server which always uses server push for all dependencies and messes up caches.

@ry
Member

ry commented Jun 27, 2018

@Lcfvs @trotyl @flying-sheep You're worrying about optimizing something that doesn't need optimizing. This isn't the browser. If you have to make 200 HTTP calls, once, to load a program - it's not an issue. I'm not going to invest any time trying to parallelize or otherwise optimize this. Remote code is cached locally forever - and this isn't for websites, it's for CLI programs.

@dharmax
Author

dharmax commented Jun 27, 2018

I like this discussion. There are sensible arguments on either side. @flying-sheep, you made me doubt my original presumption that we should have standard module management, using the same argument I often use when I talk with other engineers: not to be attached to something just because it is familiar. On the other hand, others wisely stated that proper module/dependency management, with at least the same capabilities as npm/package.json, had better be addressed sooner rather than later.

Also, there was a suggestion by @kitsonk to use userland code for dependency/module management. Perhaps this is the middle way. There is a fine line between being too simple and flexible, and being chaotic. Perhaps using a userland, semi-standard convention plus library to solve it is the way. It is also about the notation (e.g. the way references to modules are formatted): it would be rather messy if there weren't some standard or semi-standard notation for it, e.g. semver and where to place it...

@flying-sheep
Contributor

@ry read again, I’m not worrying about anything 😉

@aduh95
Contributor

aduh95 commented Jul 13, 2018

On the same subject, there is a proposal for Package Name Maps, a mechanism for allowing bare imports in the browser. This potentially allows for a consistent resolution algorithm across platforms.

I would suggest not deciding anything for deno until browser vendors implement (or reject) this proposal; cross-compatible module URL resolution would be awesome.
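
For reference, such a map would be a separate JSON resource that turns bare specifiers into URLs; the field names were still in flux in the proposal at the time, so the shape below is only indicative (it matches the later import-maps form):

{
  "imports": {
    "matrices": "https://unpkg.com/matrix-utils@2.0.0/mod.ts"
  }
}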

@lhartmann

lhartmann commented Jul 20, 2018

Why not just import stuff in the entrypoint script, and then pass those references to the included scripts/libraries that need them?
import alib = require("alib-1.0.8.ts");
import blib = require("https://site.com/blib-2.0.ts", {alib: alib});
import clib = require("clib.ts", {otherlib: "https://site.com/other.ts"});

Each included script could check if it was given a version, or import it on its own. Something like:
import alib = libs.alib || require("alib-0.9.7.ts");

Never mind... It is probably better to stay compatible with browsers at any cost.
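
For what it's worth, the explicit-passing idea can be expressed with standard dynamic imports instead of a new require() signature; the module names, URLs, and the makeClib factory below are placeholders, and the snippet assumes top-level await:

// entry.ts - fetch the libraries once, then hand them to the module that needs them.
const alib = await import('./alib-1.0.8.ts');
const blib = await import('https://site.com/blib-2.0.ts');

// clib.ts would export a factory that receives its dependencies as an argument
// instead of importing them itself.
const { makeClib } = await import('./clib.ts');
const clib = makeClib({ alib, otherlib: blib });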

@ry
Member

ry commented Aug 7, 2018

http imports only. no packages.

@ry ry closed this as completed Aug 7, 2018