> Unfortunately, the process was complicated by well-meaning members of the npm community who believed that a malicious actor or security breach was to blame and independently attempted to publish their own replacements for these packages. Ensuring the integrity of the affected packages required additional steps and time.
That is such a bad response to this.
The problem isn't that "well-meaning members of the community" decided to upload packages. The problem is that when their system decides that a package shouldn't be up it completely removes the package, as if it never existed, and allows the namespace to be reused immediately. Those "well-meaning members" should not even be able to hijack packages this way, as it means the people who aren't "well-meaning" can also do it.
What should happen is that they block downloads of the package while they investigate. That way people who attempt to download the packages get a meaningful error and people are unable to hijack the package name.
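The quarantine model described above could look roughly like this. This is a hypothetical sketch, not npm's actual code; `PackageState`, `registry`, and the function names are all illustrative:

```javascript
// Sketch of a registry moderation model where flagged packages are
// quarantined rather than deleted: the name stays reserved, and
// downloads fail with a meaningful error instead of a silent 404.
const PackageState = Object.freeze({
  ACTIVE: 'active',
  QUARANTINED: 'quarantined', // blocked pending review; name still reserved
  REMOVED: 'removed',         // confirmed bad; name STILL reserved
});

const registry = new Map();

function publish(name, owner, tarball) {
  const existing = registry.get(name);
  if (existing) {
    // A name is never recycled: only the original owner may update it,
    // and only while the package is active.
    if (existing.owner !== owner || existing.state !== PackageState.ACTIVE) {
      throw new Error(`"${name}" is reserved (state: ${existing.state})`);
    }
    existing.tarball = tarball;
    return;
  }
  registry.set(name, { owner, tarball, state: PackageState.ACTIVE });
}

function quarantine(name) {
  const pkg = registry.get(name);
  if (pkg) pkg.state = PackageState.QUARANTINED;
}

function download(name) {
  const pkg = registry.get(name);
  if (!pkg) throw new Error(`"${name}" not found`);
  if (pkg.state !== PackageState.ACTIVE) {
    throw new Error(`"${name}" is ${pkg.state} pending review; refusing to serve`);
  }
  return pkg.tarball;
}
```

Under this model a false positive inconveniences downloaders with a clear error, but it can never hand the name to a stranger.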
Very displeased about this response.
> In yesterday’s case, we got it wrong, which prevented a publisher’s legitimate code from being distributed to developers whose projects depend on it. We identified the error within five minutes and followed defined processes to reverse this block. Unfortunately, the process was complicated by well-meaning members of the npm community who believed that a malicious actor or security breach was to blame and independently attempted to publish their own replacements for these packages.
No. Assuming everything in that excerpt is true (and I happen to know it's not, but that's not even relevant here), that wasn't the problem.
The problem is that npm allowed packages to be re-uploaded by new authors after the initial versions had been spam-filtered. That's especially damning, since allowing packages to be re-uploaded by new authors was the core issue of the left-pad debacle, and the one thing npm said they'd fixed in response.
Let's summarise here:
1. NPM has a big issue
2. They claimed they had fixed it
3. They had not
4. In their post mortem they're pretending the issue doesn't exist
This guts any remaining trust I had in npm. Even if I wanted to trust them, they're not even admitting the problem exists; how am I meant to believe they're finally going to fix it? They've stopped even promising to fix this, and moved on to lies and denial.
Unacceptable. Literally. This is pushing me away from the node ecosystem because I am not prepared to accept this sort of weaponized incompetence from the primary package repo for node.
How can they claim no malicious actors were involved when packages such as duplexer3 were apparently replaced with undesirable code, as reported in item?id=16087126?
> no malicious actors were involved in yesterday’s incident, and the security of npm users’ accounts and the integrity of these 106 packages were never jeopardized
Maybe not in the incident itself, but the sheer fact that many of the packages could be replaced by other people jeopardizes every application that depends on npm. The only reason some big package didn't get replaced with code that exfiltrated data from production, or subtly backdoored it, is sheer luck.
Very disheartening to see that NPM has not grown from the kik or left-pad incidents. Users should not be able to republish on top of old package names without some kind of intervention.
Of course, NPM's response to the kik/left-pad problem was also pretty awful. Make it so users can't delete packages. Cool. For those of you using NPM's private offerings, this also applies to you, so hope you don't care about cleaning up your private NPM registry content.
Getting pretty tired of this. Their open source operation seems to suffer from poor handling of community and technical issues, at least from a high level. Their private registry operation is very lean on features, and also suffers from very confusing limitations. I'm surprised at how long it took for there to be read-only API keys. Until last year, you literally had to give your CI keys that could publish to your organization if you wanted access to private packages. And you had to pay for an extra user for the privilege of doing it wrong. It's fixed now, but it still blows my mind that it took so long. Aside from faster installation, it's actually a lot better to just use private Git URLs instead of npm's private offerings.
NPM will never properly take the fall. They will take just enough blame to seem responsible and then shift the majority of the poor decision making onto other actors in the community in their explanations. Third time in recent memory.
The last paragraph is good! The other paragraphs are bad. The two main ways they are bad are:
1) A system that detects "spam" and then allows for the complete removal of packages as if they never existed, allowing anyone to replace them, should never be described in the neutral terms used in this post. This system appears to be an existential threat to the company and project, and one of its largest mistakes. It won't take a long investigation to figure that out; it should be obvious today.
2) The claim that the security and integrity of these hijacked npm package names was not jeopardized appears to be 100% pants-on-fire false. If it is not false, I think npm's users are owed an explanation today of why it's false, rather than a bare assertion.
I know it's hard to be in the hot seat. No animosity to any of the humans involved.
I think it's time to replace npm in default Node installs. They've shown a history of negligence for package handling and procedure. Why should Node continue putting the trust of such a substantial part of not only their ecosystem but the JS ecosystem as well into such unreliable hands?
"We don’t discuss all of our security processes and technologies in specific detail for what should be obvious reasons..." - Security through obscurity? That's not security. Your protocols and processes should stand up even if made public.
It is really another embarrassment for npm, considering this is not the first time something like this has happened (see the left-pad incident, 23 March 2016).
So they learned nothing from it.
And when are they going to sign those packages? Zero progress, nothing. They didn't learn from their mistakes, and they don't listen. npm is still open to all sorts of malicious use.
Besides npm's problems, yesterday many packages wouldn't work because the package "pinkie-promise" wasn't available. This is the full effective source code of pinkie-promise:
module.exports = typeof Promise === 'function' ? Promise : require('pinkie');
This is not just npm's fault, but a fault of the JS community as a whole for accepting systems made from hundreds of one-line packages, a sort of spaghetti code for the modern era.
> who believed that a malicious actor or security breach was to blame
This was a security breach. Their anti-spam system should block packages by freezing the module name and returning blank files, not by deleting the entire module and then allowing anyone to upload a new one under that name. This is left-pad all over again.
Update: Woah, so I was checking out my old NPM namespaces and apparently someone took control over https://www.npmjs.com/package/filesaver.js
Is it related to a package that writes logs in colors and sends Visa card numbers to hackers?
I left an Ethereum related project precisely because of the house of cards feel of it being built off JS and npm modules. They're a great example of how a project can be exploited by a malicious module which could proceed to extract all the tokens.
If my hat is black, I'm writing a daemon that monitors relevant npm modules and uploads subtly modified versions if and when the possibility to do so occurs again. Particularly, but not only, targeting cryptocurrencies.
Independent of npm's issues now and in the past, how vulnerable are other package managers to similar problems? There were typosquatting issues with pip, replacement issues with Rubygems, probably others that I don't recall. What's the current state of things for the more commonly used languages?
Lesson: in a namespace, never delete records. Mark them as unused, invalid, inactive, "deleted", but never actually delete them.
Why does a spam system delete packages that have already been in use for quite some time? I could understand if it blocked some newly uploaded ones, but it seems to have deleted long-existing packages that other packages depended on?
If I were managing npm-using systems, this press release would not put my fears to rest. They need to publish a full review of the contents of the replacement packages that were uploaded (even if only for five minutes), and publish those replacements in a safe form so they can be reviewed personally by any concerned npm user (any user, not just those who downloaded the replacements through npm itself).
They need to pull their socks up and start signing these packages
Lots of people are piling onto npm here. This reads to me like a fairly simple unintended consequence of what seemed like a good approach.
Automated spam filters help to avoid dodgy packages. Spam filtering operates on heuristics so it’s sensible to not publicise how they work.
The automated spam filter kills dodgy uploads. Since these mostly happen on previously unused names, a decision was made not to leave the spam packages' names taken; among other things, this stops spammers from leaving all the good names blocked.
The spam code gets a false positive and the above logic kicks in, leaving previously used names now available. This is noticed and corrected within a few hours.
The npm team will likely improve their spam filtering heuristics and also ensure that formerly good packages that get spam flagged do not release their names - they have indicated roughly this on Twitter.
This only just happened, and it’s the weekend, so I’d expect a full write up will be released during the week when they’ve had time to do a post-mortem and work through the salient points.
I'm curious about the manual review process. Is it synchronous, i.e. immediately when publishing a package? Or is it after the fact, where suspicious code could have already been distributed? There are plenty of UX trade-offs in either direction, of course.
Npm isn't ready for production use. If you're going to use it you either need to ship the modules you need or run it against a private repo of modules.
This explanation looks misleading to me. Even if a module is detected as potentially malicious, why was it put into a state where the module doesn't exist at all, letting others publish a package with the same name? It looks to me like something is wrong with npm's internal processes.
Am I missing something in thinking that a reasonable solution is to temporarily block downloads of potentially malicious packages? Why remove them?
EDIT: or better yet, don’t allow people to download the update, just keep users on the old, ostensibly safe version until everything gets sorted?
The pleasures of working with Node.js
I've used NPM only a little but this scares me. What are NPM alternatives?
How difficult is it to run your own private npm repo? Looks like that's what security-conscious folks should be doing, given this response. Any pointers/gotchas/battle stories much appreciated.
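One fairly low-friction option (assuming a third-party tool is acceptable) is Verdaccio, an open-source private registry that can proxy the public one and cache what you pull through it. A minimal config sketch; the storage path and scope name are placeholders for your own setup:

```yaml
# config.yaml for Verdaccio (sketch; adjust paths, auth, and scopes)
storage: ./storage
uplinks:
  npmjs:
    url: https://registry.npmjs.org/
packages:
  '@myorg/*':
    # private scope: never proxied upstream
    access: $authenticated
    publish: $authenticated
  '**':
    # everything else: read from the public registry, cached locally
    access: $all
    proxy: npmjs
```

Point clients at it with `npm config set registry http://localhost:4873/` (host and port are whatever you deploy). The cache also means an upstream package vanishing, as happened here, doesn't immediately break your builds.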
> We don’t discuss all of our security processes and technologies in specific detail for what should be obvious reasons, but here is a high-level overview.
Isn't that the opposite of good practice? You shouldn't rely on obscurity. It's better to have the security processes out in the open so that it can be audited and flaws pointed out.
The only thing missing from this post is 'we take security very seriously'
As others have said countless times on the original thread: new packages from different users should never have been allowed to replace the missing packages.
Even if a package is removed as malware, a user should never unexpectedly download the work of a totally new author that they haven't vetted.
Some of the protections should be done on the user side, but that's no reason for NPM to have dangerous policies.
I think a major security improvement for npm and other package managers distributing almost exclusively open-source, non-compiled code would be to require that the source code be linked from a popular open-source platform like GitHub, and then to take the package directly from there, to ensure the code can be audited and nothing else can be snuck in.
A manual review misjudged 106 packages at the same time? That seems... dubious. Unless they have a bulk removal based on meta data, instead of careful analysis. Which would also be pretty dubious.