Good to see some OSS alternatives showing up!
Large tech companies, like most of industry, have realized most people will pay with their privacy and data long before they'll pay with money. We live in a time of the Attention Currency, after all.
But you don't need to be a canary to live a technology-enabled life. Much of the software you pay for with your privacy and data has free or cheap open-source alternatives that approach the same or higher quality. When you orient your way of consuming toward 'eh, I can wait till the version that respects me is built', life becomes more enjoyable in myriad ways.
I don't take this to absolute levels. I pay for fancy-pants LLMs, currently. But I look forward to the day, not too far away, when I can get today's quality for libre in my homelab.
This for instance will only install packages that are older than 14 days:
uv sync --exclude-newer $(date -u -v-14d '+%Y-%m-%dT%H:%M:%SZ')
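The `-v-14d` form is BSD/macOS date syntax; on GNU coreutils (most Linux distros) the equivalent should be something like:
uv sync --exclude-newer $(date -u -d '14 days ago' '+%Y-%m-%dT%H:%M:%SZ')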
It's great to see this kind of stuff being adopted in more places.
Does the JS ecosystem really move so fast that you can’t wait a month or two before updating your packages?
I know it would take time for packages to adopt this but it could be implemented as parameters when installing a new dependency, like `npm i ping --allow-net`. I wouldn't give a library like chalk access to I/O, processes or network.
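For what it's worth, Deno already gates network, file, and process access at the runtime level with flags along these lines (the host name here is just a placeholder), though per-dependency rather than per-process permissions would still be the harder part:
deno run --allow-net=api.example.com server.ts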
A better (though not perfect) solution: every package update should be analysed by AI before it becomes publicly available, to detect dangerous code and assign a risk rating.
A maximum acceptable rating would be defined in package.json: if the remote package's rating is below that value it can be updated; if it is higher, a warning should appear.
This will cost money, but I hope companies like GitHub etc. will let package repositories use their services for free. Or we should find a way to distribute this work across us (the users and devs), like a BOINC client.
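A rough sketch of what such a check could look like, purely hypothetical: the `maxRiskRating` field and the rating service below don't exist anywhere today, they only illustrate the idea.

```js
// Hypothetical pre-install check. Neither the "maxRiskRating" package.json
// field nor the rating endpoint exists; this only sketches the proposal above.
const fs = require("node:fs");

async function ratingAllows(pkg, version) {
  const manifest = JSON.parse(fs.readFileSync("package.json", "utf8"));
  const maxRiskRating = manifest.maxRiskRating ?? 3; // threshold declared by the project
  // Imaginary rating service that scores each published version.
  const res = await fetch(`https://ratings.example.invalid/${pkg}/${version}`);
  const { riskRating } = await res.json();
  if (riskRating > maxRiskRating) {
    console.warn(`${pkg}@${version}: risk rating ${riskRating} exceeds allowed ${maxRiskRating}`);
    return false;
  }
  return true;
}

// Example usage: exit non-zero so a CI step could block the install.
// ratingAllows("chalk", "5.6.1").then((ok) => process.exit(ok ? 0 : 1));
```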
It would also be nice to have this as a flag, so you can use it on projects that haven't configured it. I wonder if that could be added too.
Really depends on the context and where the code is being used. As others have pointed out, most JS packages use semantic versioning. For patch releases (the last of the three numbers) on code that is exposed to the outside world, you generally want to apply those rather quickly, as they will contain hotfixes, including ones fixing CVEs.
For the major and minor releases it really depends on what sort of dependencies you are using and how stable they are.
The issue isn't really unique to the JavaScript ecosystem either. A bigger Java project (certainly one with a lot of Spring-related dependencies) will also see a lot of movement.
That isn't to say the tropes about the JavaScript ecosystem being extremely volatile are entirely untrue. But in this case I do think the context is the bigger difference.
> then again, we make client side applications with essentially no networking, so security isn’t as critical for us, stability is much more important)
By its nature, most JavaScript will be network connected in some fashion in environments with plenty of bad actors.
I don't think people are updating major versions every month; it's really more like every 6 months or once a year.
I guess the problem might be that people think auto-updating minor versions in the CI/CD pipeline will keep them more secure, since bug fixes should land in minor versions, but in reality we see that's not the case and attackers use it to spread malware.
Normally only the latest release line gets updates; old major or minor versions don't.
E.g. 4.1.47 gets no update, while 4.2.1 does.
So if the problem is in 4.1 you must "upgrade" to 4.2.
With "perfect" semver this shouldn't be a problem, because 4.2 only adds new features... but... back to reality, the world is not perfect.
More seriously, automated scanners seem to do a good job already of finding malicious packages. It's a wonder that npm themselves haven't already deployed an automated countermeasure.
/s
> It started with a cryptic build failure in our CI/CD pipeline, which my colleague noticed
> This seemingly minor error was the first sign of a sophisticated supply chain attack. We traced the failure to a small dependency, error-ex. Our package-lock.json specified the stable version 1.3.2 or newer, so it installed the latest version 1.3.3, which got published just a few minutes earlier.
[1] https://blogs.microsoft.com/blog/2024/05/03/prioritizing-sec...
In 2 months, a typical js framework goes through the full Gartner Hype Cycle and moves to being unmaintained with an archived git repo and dozens of virus infected forks with similar names.
You might be able to do this around install scripts, though disk writing is likely needed for all (but perhaps locations could be controlled).
Yeah, it needs work from the language runtime, but I think even a hacky, leaky 'security' abstraction would be helpful, because the majority of malware developers probably aren't able to break out of a language-level sandbox, even if the language still allows you to do unsafe array access.
Then we can iterate.
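As a reference point for what runtime-level gating looks like today, Node's permission model (still experimental, and the flag names have shifted between releases) restricts the whole process rather than individual packages, something along the lines of:
node --experimental-permission --allow-fs-read=/app --allow-fs-write=/tmp app.js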
I thought we were discussing problems and possible solutions here.
My fault.
There have been several incidents recently where popular packages were successfully attacked. To reduce the risk of installing a compromised version, we are introducing a new setting that delays the installation of newly released dependencies. In most cases, such attacks are discovered quickly and the malicious versions are removed from the registry within an hour.
The new setting is called `minimumReleaseAge`. It specifies the number of minutes that must pass after a version is published before pnpm will install it. For example, setting `minimumReleaseAge: 1440` ensures that only packages released at least one day ago can be installed.
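A minimal example of the setting itself, assuming it sits alongside pnpm's other settings in pnpm-workspace.yaml:

```yaml
# pnpm-workspace.yaml (assumed location; pnpm also reads settings from its other config files)
minimumReleaseAge: 1440 # only install versions published at least 24 hours ago
```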
If you set `minimumReleaseAge` but need to disable this restriction for certain dependencies, you can list them under the `minimumReleaseAgeExclude` setting. For instance, with the following configuration pnpm will always install the latest version of webpack, regardless of its release time:

```yaml
minimumReleaseAgeExclude:
  - webpack
```
Related issue: #9921.
Added support for finders.
In the past, `pnpm list` and `pnpm why` could only search for dependencies by name (and optionally version). For example, `pnpm why minimist` prints the chain of dependencies to any installed instance of `minimist`:

```
verdaccio 5.20.1
├─┬ handlebars 4.7.7
│ └── minimist 1.2.8
└─┬ mv 2.1.1
  └─┬ mkdirp 0.5.6
    └── minimist 1.2.8
```
What if we want to search by other properties of a dependency, not just its name? For instance, find all packages that have `react@17` in their peer dependencies?

This is now possible with "finder functions". Finder functions can be declared in `.pnpmfile.cjs` and invoked with the `--find-by=<function name>` flag when running `pnpm list` or `pnpm why`.

Let's say we want to find any dependencies that have React 17 in peer dependencies. We can add this finder to our `.pnpmfile.cjs`:

```js
module.exports = {
  finders: {
    react17: (ctx) => {
      return ctx.readManifest().peerDependencies?.react === "^17.0.0";
    },
  },
};
```
Now we can use this finder function by running `pnpm why --find-by=react17`. pnpm will find all dependencies that have React 17 in their peer dependencies and print their exact locations in the dependency graph:

```
@apollo/client 4.0.4
├── @graphql-typed-document-node/core 3.2.0
└── graphql-tag 2.12.6
```
It is also possible to print out some additional information in the output by returning a string from the finder. For example, with the following finder:
```js
module.exports = {
  finders: {
    react17: (ctx) => {
      const manifest = ctx.readManifest();
      if (manifest.peerDependencies?.react === "^17.0.0") {
        return `license: ${manifest.license}`;
      }
      return false;
    },
  },
};
```
Every matched package will also print out the license from its `package.json`:

```
@apollo/client 4.0.4
├── @graphql-typed-document-node/core 3.2.0
│   license: MIT
└── graphql-tag 2.12.6
    license: MIT
```
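As a further sketch (not from the release notes), the same `readManifest()` hook shown above could be used to surface dependencies that declare install-time scripts, which ties back to the supply-chain concerns discussed earlier; the finder name here is made up:

```js
// .pnpmfile.cjs — hypothetical finder reusing the readManifest() API from the examples above.
module.exports = {
  finders: {
    hasInstallScripts: (ctx) => {
      const { scripts = {} } = ctx.readManifest();
      const hooks = ["preinstall", "install", "postinstall"].filter((name) => name in scripts);
      // Returning a string prints it next to the match; false means "no match".
      return hooks.length > 0 ? `install scripts: ${hooks.join(", ")}` : false;
    },
  },
};
```

It would be invoked the same way, e.g. `pnpm why --find-by=hasInstallScripts`.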
Related PR: #9946.
- `nodeVersion` is not set to an exact semver version #9934.
- `pnpm publish` should be able to publish a `.tar.gz` file #9927.
- `pnpm run` return a non-zero exit code #9626.