Please get rid of: node.js, npm, packages from npm, Electron, svelte. Cargo and crates are also not good. #1469
Yes, I was very confused when I saw that building a decentralized project management app involves downloading a large amount of code over HTTP from npm, cargo and github -- three very centralized services. It is effectively a time bomb, since you cannot guarantee that anyone can take this code 5 years from now and successfully build it, because it relies on many online resources staying available. The fact that multiple web-based package managers are used makes it non-trivial to create a full backup of all of the code used to build the project in a form appropriate for archival.
I agree with the sentiment here, but all the technologies mentioned allowed us to ship a working application in a timely manner. I think that should always be the first goal. Now that this is out of the way, we can start looking at improving efficiency, security, resilience etc.
This appears to just straight up be a false claim: most Rust packages compile and run on older stable versions, as early as 1.39, which you can easily install from your average GNU/Linux distribution. Some crates do have optional nightly-only features, but they are disabled by default. Crates that actually depend on nightly are fairly rare and are largely experiments rather than crates people use in practical code, and those never go past version 1.0.0 before feature stabilization. On another note, I'd like to add, regarding this:
Maybe. I have not measured; this is just from my experience: most of the Rust packages I have tried to build personally (an LSP server for Rust, and rav1e) failed because a nightly feature was used either in the package itself or in some dependency of it (I also remember trying to build sequoia-pgp, but I don't remember whether that build failed because of a non-nightly version or not).
Should retry then, thank you.
@KOLANICH So basically you are saying they should abandon the whole project? To my knowledge there is nothing like a decentralized packaging system. And most of that stuff about programming languages is just, you know, your opinion. So if you have such broad criticism, something more constructive would be better, such as actually giving a viable alternative that is not 'redo everything'.
GUI:

This is something we're exploring, since we're going to want a read-only web view at some point.
@KOLANICH How is Qt better than any of this?
I see one problem with Qt: you may want to write the native part in Rust too, but Qt bindings for Rust are very immature.
Hi all. Regarding Qt, I would like to point you to the "Qt offering changes 2020" blog post, which sparked controversial discussions about Qt's open source support.
Yeah, I know about this. But I feel the FOSS community depends on Qt too much to allow them to behave badly. Either they behave nicely, or they get replaced. If they really stop providing prebuilt binaries, Fedora, Debian, other distros and lots of other parties will do it for them. (OpenJDK is an example: Oracle changed their license terms, and the AdoptOpenJDK initiative appeared, with quite a few companies participating and providing builds that were never published by Oracle, like 32-bit OpenJDK 15. This is mostly relevant for Windows, since on *nixes OpenJDK is part of the distros.) In fact Fedora already does that, and in my cross-builds (host: Linux, targets: Windows) I use their packages unpacked (I use Ubuntu, but nothing prevents me from just downloading their packages and unpacking them; I also remember applying a simple bash script to fix the hardcoded paths).
https://github.com/miurahr/aqtinstall is a standalone downloader. The problem with Qt's official offline installers for Win32 is that they are not easily unpacked on Linux. I mean QIF doesn't work in Wine, so for unpacking I had to use one of a few workarounds.
Unfortunately, at some point the binaries from Qt became incompatible with the MinGW-w64 runtime from Fedora and Debian, so I returned to unpacking the packages built by Fedora. Also, no one forces us to use Qt 6 (it was a really irresponsible decision to drop Windows 7 and 32-bit; organizations have a lot of old PCs, and they won't throw them in the trash just because the Qt Company thinks they should).
I think this is a good approach. In the long run I agree that reducing dependencies and 'heavyweight frameworks' would be important, but a project can't focus on all the things at the same time. Please focus on making a usable P2P code collaboration tool first 🙂 There it helps to have an interface that is github-ish. I'm fairly sure people will pitch in to help with all those other things once you have that. Solving bootstrapping and software supply chain issues at the same time is not realistic (I say this coming from Bitcoin Core, where we now spend a lot of time on those).
IMHO no one sane will use a project that with high probability has been hijacked by enemies (and we all know who the enemies are, and that in its current state it is easy for them to hijack the project unknowingly even to its authors). So either your policy is that bells and whistles are less important than security, or that bells and whistles are more important. Only 2 choices exist (we don't consider …
I think we should reevaluate the seriousness of the problem being presented here, and the possible solutions that would not require throwing out the entire codebase.
The flaw in the core idea presented throughout the thread is that the author of the issue simply does not understand how small the scope of the problem is. Take points 6 and 7, for example:
Those have nothing to do with security at all. Both Electron and svelte contribute to potentially poor performance and high RAM usage in the client, but those are inherent problems of an HTML/JS/CSS stack and cannot reasonably be solved without using either an external browser as a UI host or a different client entirely, which would require a different repo and different maintainers. On that note, I'd really like to see, for example, a pure Rust GTK+ client, but again, that's a whole different story which has nothing to do with this specific implementation of a client. Eliminating the above two points, we are left with points 1, 2 and 4:
The first two points have ready solutions which can be deployed right now with little effort:
Point 4 is largely solved as a consequence of decoupling from centralized registries: with dependencies vendored and checked into source control, the threat of a new version being released and a backdoor sneaking into the code is essentially gone, since all updates to the dependencies would have to go through a PR, where proper security auditing can be performed. This is time-consuming and perhaps challenging, but perfectly secure and autonomous, since the external centralized package managers are essentially removed from the equation. A contributor policy of adding dependencies responsibly (which generally includes avoiding new dependencies when possible) will still be needed with the vendoring approach, but once the cumbersome task of auditing all dependencies and verifying their security is done, the repo will be free of uncontrolled external code. Unless the contributors create vulnerabilities themselves by accident, the client will essentially be invulnerable to external attacks.
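For the Rust half of the codebase, the vendoring described above can be sketched with `cargo vendor`, which copies every crate in the dependency graph into a local directory that can be checked into source control. The directory names below are just the tool's defaults, not something this repo has adopted:

```toml
# After running `cargo vendor` in the repo root (which copies all crates
# into ./vendor), cargo prints roughly this snippet to add to
# .cargo/config.toml; with it in place, builds no longer contact crates.io.
[source.crates-io]
replace-with = "vendored-sources"

[source.vendored-sources]
directory = "vendor"
```

From then on, any dependency bump shows up as a diff under `vendor/` in the PR, which is exactly where the auditing described above would happen.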
In fact, not nothing, but almost nothing. Still, getting rid of them remains on the wishlist.
It still won't spare you from reviewing the whole DAG of dependencies for backdoors. That usually defeats the purpose of using packages: when people use packages, they usually do it in a black-box fashion, just installing them without caring which dependencies the packages use or what those deps, and their deps, do. That is easy but insecure, and a lot of deps are created with the mindset that installing deps recklessly is a necessary evil of the current technology's growth and popularity, that nothing can be done about it, and that it is OK to encourage users to work this way too. So it may result in some packages needing to be rewritten with goals other than ease of installation. This task would require reviewing each dependency in depth, along with the interactions between them. I feel that such big tasks can be made feasible with something like language-based security, where a programmer declares the rules his package and its dependencies must obey, and the runtime enforces those rules. This simplifies review, since one can review only the rules and then assume they are enforced by the runtime. Of course it won't detect side-channel attacks (it can detect some, in some cases, e.g. by tracking access to APIs returning time) or microarchitectural attacks, but it can give a clue that a package contains an explicit backdoor. It also doesn't guarantee that rule enforcement is implemented correctly, or that the rules themselves are written correctly.
Yes, there's not much you can do about JavaScript's inherent security flaws other than carefully auditing dependencies once you're able to precisely control the entire graph of them, which is generally the direction I'm suggesting we work in. Static analysis tools, particularly JSHint and JSPrime, may prove useful. An alternative, interesting direction would be to use a WebAssembly-compatible language like Rust instead of JS, leveraging the interface types proposal for WebAssembly once it lands. Rust doesn't have any of the inherent language flaws that JS has, only Cargo's theoretical backdoor potential, which is mitigated by vendoring. This has the downside of requiring a rewrite of the JS part of the codebase, but gets rid of the severe difficulties of auditing dependencies with Node.js, thanks to things like …
Decentralized packaging, i.e. build from source but with an option for a "binhost" (binary prebuilt packages pulled from mirrors)? Guix is a great GNU offering for this and probably my top pick as an independent package manager, but it can also be used as a full distribution. They have a CI system that builds binary packages regularly, and a slowly growing list of mirrors for these packages, or you can just build from source yourself to get the "cleanest" version for your system. I think pacman (from Arch) and emerge (from Gentoo/Sabayon/CoreOS/ChromeOS) can also potentially be used for building from source on other operating systems (e.g. pacman can be used on Windows if you install MSYS2, not just the minified Git Bash for Windows version), and both have a similar binhost option: pacman can use the AUR, and emerge can grab prebuilt packages that match your platform architecture. Another option is the Linuxbrew functionality that was rolled back into Homebrew (brew.sh), as that allows building from source for a large number of packages. I don't think they do …
Sounds kind of …
Applying all those security checks needs a lot of work (i.e. time and money)
In the future it could be feasible to "bootstrap" radicle, with every bit of the client distributed through radicle.
So does developing radicle instead of just agreeing to use centralised services. People who want to save money don't get involved in projects as useless to them as this one.
To mitigate it properly, we would need to create a completely new language with built-in security checks and rewrite radicle in it.
I'll just jump in here since I agree with some things said here. @KOLANICH I think you're being a bit aggressive and very idealistic. There's only a certain amount of free time that a limited number of people can devote to a project. Please bear that in mind. They can't solve all the peripheral problems of the tools we have to use. Getting rid of the Electron dependency would be great: it's bulky, makes the binary pretty large, and doesn't provide a CLI. Rust is already amongst the chosen languages, so it looks like a language contributors are comfortable with. GTK is cross-platform and has a crate. The project could split into a lib and multiple binaries (GUI, CLI). The advantages I would see with the approach:
Disadvantages
In any case, thanks for the work already done. Had I the time, I'd join the charge away from Electron towards Rust in a heartbeat. Cheers
The project itself is extremely idealistic. In order for it to succeed, quite a few unlikely things need to happen.
I know. But a single crucial flaw is enough to waste all the labour spent on developing this software. IMHO, static linking is a disadvantage, not an advantage. And BTW, there is https://github.com/sixtyfpsui/sixtyfps, but IMHO this framework has no future if it stays GPL-licensed.
@KOLANICH IMHO time to market is more important for this kind of project (there is no viable "git over torrent" alternative) than "let's rewrite the whole ecosystem from scratch".
IMHO time to market is much less important than doing things right. A single backdoor in pybitmessage was enough to spoil its reputation to the point of keeping most of its intended audience away. If a backdoor happened once, how can we trust the project, its devs, and the community around it, which failed to spot the devs' misbehavior in time? In fact, creating a project intended for consumption by anti-New-World-Order (PRISM, TEE, DRM, mandatory proprietary JS on every site, proprietary spying mobile apps, centralised platforms for cooperation of people with KYC-like measures, other shit) enthusiasts, and then killing it with a backdoor, destroys the whole faith in the possibility of anti-NWO software. After the backdoor was discovered in pybitmessage, I stopped using it and all decentralized messengers at all. Now I consider them all honeypots unless explicitly proven otherwise. We should not just trust projects; it is the projects that should prove they deserve the trust. Nor should we merely audit their code: the code should be written in a way that makes auditing easy and maximally automated. Because practice has shown that even those who had read the code (I myself have read the code of pybitmessage, because I was curious about its protocol internals, but it turned out I hadn't read the line where the backdoor resided) didn't spot the backdoor.
Thanks for the feedback everybody. I suggest opening a post on https://radicle.community/ for further discussion, if needed. |
npm is plagued by backdoors. It is unacceptable for a decentralized project to depend on backdoored packages. Decentralized projects are priority targets for adversaries who want to discourage the use of decentralised projects by attacking everyone using them, because people's reliance on centralised software allows them to be controlled more easily.
Cargo and crates are examples of a package manager and a repository inheriting npm's weaknesses.
Rust itself is also not very good, since most packages require a nightly version of Rust, and the usual way of installing it is known to be highly controversial from a security point of view. Distros usually don't ship packages for nightly Rust versions.

Some features of ECMAScript make it especially suitable for hard-to-detect backdoors:
- The `[]` syntax for property access makes RCE code look like innocent collection access. One just generates the string `eval` at runtime, and then calls `window[g3n3v4l()]()`, and there is no `eval` token in the code.
- The `eval` backdoor in pybitmessage was not hidden at all (and even so, it was spotted only when people got mass-attacked), but if pybitmessage had been written in JS, and if there had been an intent to actually hide the backdoor, it would have been easier to do with the antifeatures described above.
- Python has `getattr`, but it can be easily detected: using `getattr` is a potential marker of obfuscation. Python also has `pickle`; any package using `pickle` is a potential backdoor. Unfortunately some free software developers are either idiots or assholes with the position "if you don't trust our package, just don't use it. PR closed." But using `pickle` can be detected by static analysis. `importlib` and `__import__` are also big red flags, since they can import packages by dynamically generated names.

Electron is a heavyweight piece of Chromium. Large, slow, bloaty. It may be much better to just start a local webserver and control it using any browser of the user's choice, such as Firefox.

svelte is a framework, but it is essentially redundant. The modern HTML DOM has enough API to write modern websites completely without frameworks, in vanilla JS. Maybe with a bit more code, but the benefits are great: vanilla-JS sites tend to have less overhead and better understandability, because everything interacts directly with the browser. No frameworks means no surprises hidden within them, and much easier debugging, since devtools see the actual event listeners, not a framework's listeners calling the user's ones via closures.
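As a concrete (and deliberately harmless) sketch of the `[]` property-access trick mentioned in the issue: the string `eval` is assembled at runtime, so a textual search of the source for the `eval` token finds nothing outside the comments, yet indirect eval is still reached. `g3n3v4l` is the hypothetical name used in the issue text:

```javascript
// Illustration of the obfuscation pattern described above: the dangerous
// property name never appears as a literal token; it is built at runtime
// and used as a computed property on the global object.
function g3n3v4l() {
  // "ev" + "al" -- no telltale token anywhere in this function
  return ["ev", "al"].join("");
}

// In a browser this would be `window[g3n3v4l()]`; globalThis also works in Node.
const result = globalThis[g3n3v4l()]("6 * 7"); // indirect eval of a harmless expression
console.log(result); // 42
```

This is exactly why a grep-based audit is insufficient for JS, and why tools have to flag computed property access on globals as suspicious.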