Due to the ease of publishing packages and the popularity of Node.js, npm has become the largest package registry in the world. It has surpassed Java's Maven, PHP's Composer and Python's pip by far. Having so many packages also gives it the widest range of quality.
However, it does not necessarily have the widest range of use cases covered, or compatibility with more hardware and APIs. For example, Python has a great package for interactivity on the console. That might be a reason why Microsoft decided to implement the newer version of its Azure CLI tools in Python, providing cool interactive access to the Azure services, almost as on the website, but more powerful, as it also has access to local resources. In Node.js, when you google or ask the community about interaction on the console, you will only find prompt or inquirer for simple questions. Console UI support with buttons and mouse support seems unthinkable.
There was a talk at a Go conference saying that if you want to do AI seriously, you have to use Python as well, because that is where the fully featured implementations of TensorFlow, Torch and Jupyter notebooks are. What counts for Go also counts here for Node.js and npm.
It is said that in JS there is a new framework every day. And I think that is true; I have even written more than one myself. To a large degree, I also found there are lots of broken, useless and empty packages: packages that are complete clones of other packages, or copies with different defaults and configuration. Some packages are so small that you could basically copy their functionality. Some simply have updates to the dependencies. And yes, you have to be careful, because some packages are just dangerous.
So, the sheer number does not tell you how much it can solve your problems for your business. It does not tell you whether you will find what you are looking for, because while there are many good packages, there is also a huge amount of noise that will stop you from finding them.
But what if you need, or just want, a change in a package? The first thing the open source community tells you is to open a PR or submit an issue on GitHub or whatever tool the project uses (mostly GitHub). After asking for the change, you wait; the maintainer might be quick, or take 3-4 months to reply. He can accept, but more likely deny your request, maybe with some extra discussion. It is up to you if you want to go through this hassle or simply clone the repo and make the change for yourself. (I am so happy that in the JS community, most developers use the MIT license, which allows this!!) You can publish your changes on npm, just give the package a new name. But now you have got it: a new package. And with that, you also have to keep it up to date, maintain versions and maybe merge changes from the original repo into yours. This overhead is a good motivation to just submit an issue and check how active and receptive the authors are to your proposal. Once they merge the feature change you need, you really win: you get the feature, and others are going to maintain it.
I was fascinated by the npm module schemats. It takes a database (Postgres or MySQL) and generates TypeScript type definitions. It was developed by a company, and it works well. Still, some people would wish for updates, and maybe integrations for more databases. The engineers publishing the module, however, left that company, and now there is no one able to push new versions to the npm registry. It seems they worked out a solution with the original author and made a new repo on GitHub, to make the project more community driven, but now, for more than a year, nothing has happened. The module still works well with Postgres, but lacks important updates that would be useful due to new features in the databases and new features in TypeScript. Now there are a lot of forks and packages on npm with different updates and added features. The community around the package is strongly fragmented.
NodeJS-Git-Server is another example with some lost potential. Back in the day it was actually really good. I never used it in production, but I made some tests, and I also used it for learning and experimenting with git. In the code, it basically implemented its own routing and piped requests from the incoming remote client's git to the local server's git command. It had hooks for authentication and git events. But as Node moved on, the code was never maintained. I think from some version after Node 1.0.0 it always threw an error about extending a class from event-emitter. It was a single line to update to fix the issue, and I submitted a PR. But it never got a reply. In the issues there were many people looking for a solution; the best I could tell them was to edit that one file themselves. I tried to fork the project, but it relied heavily on more modules from the same developer, and I never managed to also resolve all the dependencies when forking. That is, in some cases, also something that becomes necessary when forking node modules: you might not really be interested in a change in the module you are using, but in one of its dependencies. Then you need to motivate the engineer to make two changes, or in other cases two developers: one for the change, the other for updating the dependency version. Later I found that Yarn has a feature in package.json to select the version of a sub dependency. In my opinion, a huge graph of dependencies is a great way to demotivate developers from forking your project.
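The Yarn feature mentioned here is the resolutions field in package.json. A minimal sketch of how it is used (the package names below are made up for illustration, not the actual NodeJS-Git-Server dependencies):

```json
{
  "name": "my-app",
  "dependencies": {
    "some-git-server": "^1.2.0"
  },
  "resolutions": {
    "some-git-server/event-emitter-lib": "^3.1.0"
  }
}
```

With this, Yarn installs version 3.1.0 of the sub dependency for you, even though some-git-server itself still declares an older range, so you can get a fix in a transitive dependency without forking anything.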
Some more examples of that are Angular, Meteor, Apollo GraphQL and VS Code. It is a great way for a company to publish an open source project but keep control over its future.
In a previous project working with the Pomelo framework (a scalable real-time communication framework, suitable for games), we needed a change. But the engineers at NetEase seemed to have already abandoned it, as they ignored my PRs, and I could not fork more than 20 dependencies. So I wrote a script to overwrite files within the node_modules directory. It ran before our server would actually start and require the modules.
Another solution I built was for the Express framework. The problem with Express today is that the handlers are not async functions. I wanted to just return data that would be sent to the client. But that would introduce some incompatibilities. It actually works very well with most of the available middleware, but without a major version update, Express would not get async handlers. So I did some monkey patching: after I found the place in the code that calls the handlers, I exchanged the method to resolve the result of async functions and send it back. It worked great, but I soon saw there are decisions to be made about edge cases that the Express.js team does not want to introduce to the framework. And that is good. Newer frameworks, such as Nest.js, Next.js or Fastify, provide such a feature as well. Currently, Fastify is my favorite solution, not my monkey patching.
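The core idea can be shown without touching Express internals at all. This is a sketch of the concept as a plain wrapper function, not the actual patch I applied inside the framework:

```javascript
// asyncHandler adapts an async function to the classic (req, res, next)
// handler signature: the resolved value is sent as the response, and
// rejections are forwarded to the error-handling middleware via next().
function asyncHandler(fn) {
  return function (req, res, next) {
    return Promise.resolve(fn(req, res))
      .then((data) => {
        // only send if the handler returned something and nothing was sent yet
        if (data !== undefined && !res.headersSent) res.send(data);
      })
      .catch(next);
  };
}

// usage with a hypothetical Express app:
// app.get('/users', asyncHandler(async (req) => db.listUsers()));
```

One of the edge cases I mentioned is already visible here: what should happen when the handler returns undefined, or has already written to the response itself? Those are exactly the decisions the Express team did not want to bake into the framework.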
These kinds of solutions are absolutely considered hacking, and they should only be a last resort if there is no other way around. Such changes have to be documented very carefully, tested carefully when updating the dependencies, and communicated very clearly in your team.
Now, while looking up some resources for this article, I found a new git server module. It is written in an OOP style, runs in modern browsers, and has a more convenient way to handle authentication and push and fetch events.
I hope this article kept what it promised and enabled you to learn from my experience.