Today I saw a colleague googling and reading about RPC, Remote Procedure Calls. It got me thinking whether I should start working on an RPC library that I had been thinking about for a long time.
There are many RPC libraries, such as:
- Apache Thrift
However, I would like to build another one. The protocol should be simple, so that it can be implemented with different technologies. But as you might already know, I will put a focus on node.js.
Very important will be the data serialization. For that I choose msgpack. It has implementations in many languages, and benchmarks show that it is faster than JSON. And msgpack, like JSON, does not need a schema.
First I will implement a version on top of TCP and test it with reasonably large messages, around 1 MB.
The server and the client will exchange very simple messages/packets: a request and a response.
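One possible shape for these packets (my own assumption, not something the protocol has fixed yet) is a request carrying an id, a method name and arguments, and a response echoing the id with either a result or an error:

```javascript
// Hypothetical packet shapes for the request/response exchange.
// All field names (id, method, params, result, error) are illustrative.
function makeRequest(id, method, params) {
  return { type: 'request', id, method, params };
}

function makeResponse(id, result, error = null) {
  return { type: 'response', id, result, error };
}

// The id lets the client match a response to its request,
// even when several requests are in flight on one connection.
const req = makeRequest(1, 'add', [2, 3]);
const res = makeResponse(1, 5);
```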
I plan to let the client download the list of available methods when connecting, and to generate a local object that allows calling the functions as if they were local, without passing strings into a library function. It should also be possible to generate a type definition from the server and use it when creating the client, so that the client has TypeScript support.
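Generating such a local object from the downloaded method list could, as a sketch, look like this (the transport is stubbed out, and all names here are my own):

```javascript
// Given a list of method names fetched from the server, build an object
// whose properties are async functions forwarding to a generic call().
function buildClientApi(methodNames, call) {
  const api = {};
  for (const name of methodNames) {
    api[name] = (...args) => call(name, args);
  }
  return api;
}

// Stub transport standing in for the real request/response round trip.
const fakeCall = async (method, args) =>
  method === 'add' ? args[0] + args[1] : undefined;

const api = buildClientApi(['add'], fakeCall);
// api.add(2, 3) now reads like a local function call.
```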
Loading the list of methods already uses the request/response protocol and implementation. Some helpers could be used to exchange even the type definition, even when client and server live in distinct repos.
Also, the server could expect the client to send authentication information before exchanging/accepting any other messages.
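A minimal sketch of that gate (the message shape and field names are my assumption): each connection tracks whether it has authenticated, and the first packet must be an auth message or everything else is rejected.

```javascript
// Per-connection state machine: the first message must authenticate.
// The message shapes ({ type: 'auth', token }) are illustrative.
function createConnectionState(isValidToken) {
  let authenticated = false;
  return {
    handle(message) {
      if (!authenticated) {
        if (message.type === 'auth' && isValidToken(message.token)) {
          authenticated = true;
          return { type: 'auth-ok' };
        }
        return { type: 'error', error: 'not authenticated' };
      }
      return { type: 'accepted' }; // hand off to normal request handling
    },
  };
}
```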
I believe that because many requests can run concurrently over a single long-lived connection, with very small protocol overhead, the RPC library should be very efficient and fast.
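Concurrency over a single connection usually comes down to a map of pending requests keyed by id, so responses can arrive in any order; a sketch (names mine):

```javascript
// Track in-flight requests so responses arriving in any order
// can be matched back to the caller that is awaiting them.
function createPending() {
  let nextId = 0;
  const pending = new Map();
  return {
    // Register a request; returns its id and a promise for the result.
    register() {
      const id = ++nextId;
      let resolve, reject;
      const promise = new Promise((res, rej) => { resolve = res; reject = rej; });
      pending.set(id, { resolve, reject });
      return { id, promise };
    },
    // Called when a response packet arrives for the given id.
    settle(id, result, error) {
      const entry = pending.get(id);
      if (!entry) return;
      pending.delete(id);
      error ? entry.reject(new Error(error)) : entry.resolve(result);
    },
  };
}
```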
When the performance is good, more tools can be built, such as:
- exposing the API over WebSockets.
- consolidating the APIs of multiple microservices into a single gateway.
- extending the protocol for push messages and subscriptions.
- routing and load balancing, even with rules.
This feature set is very similar to the tooling around ZeroMQ, but it has to be guaranteed that the first focus is the user of the API. It has to be most effective for developers and provide good performance.
In less than 100 lines I was now able to implement a first version using the msgpack5 module (actually two versions). Along the way I got a somewhat clearer picture of how streams in node.js work and behave. More on that in a future article.
The difference between my first two versions: in the first, I processed the byte stream directly, decoding and encoding every chunk myself; in the second, I used the encoder and decoder classes of msgpack5. Only version 2 was able to process messages bigger than one chunk of bytes.
However, looking at the source of the decoder on GitHub, I found that the lib also just tries to parse each chunk completely. When parsing fails with an incomplete-buffer error, the chunk gets concatenated with the next one. That means that for long messages, the first part gets parsed again every time more data arrives for the same message. That seems like a waste, but this simplicity can also lead to better performance on small messages.
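The retry-on-incomplete strategy can be illustrated without msgpack5. Here a toy decoder (entirely my own construction, not msgpack5's format) expects a `<length>:<json>` frame and simply re-attempts a full parse each time a chunk arrives, so the early bytes of a large message are scanned repeatedly:

```javascript
// Toy stand-in for a retry-on-incomplete decoder: messages are framed as
// "<length>:<json>" and we re-attempt a full parse on every new chunk.
function createNaiveDecoder(onMessage) {
  let buffer = '';
  let parseAttempts = 0;
  return {
    push(chunk) {
      buffer += chunk;
      parseAttempts++; // every chunk triggers a fresh parse from byte 0
      const sep = buffer.indexOf(':');
      if (sep === -1) return;
      const length = Number(buffer.slice(0, sep));
      const body = buffer.slice(sep + 1);
      if (body.length < length) return; // incomplete: wait for more data
      onMessage(JSON.parse(body.slice(0, length)));
      buffer = body.slice(length);
    },
    get attempts() { return parseAttempts; },
  };
}
```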
Before we end this, here is the code:
// we need the msgpack5 and net for serialization and communication
This is the simple implementation. There are a thousand things that could be done differently, and I think you can see how. Here is some example code showing how to use the client and the server:
// server that provides an API to add to arguments together.
On the client side, this server can be connected to and used like this:
main().catch(err => console.log(err)).then(() => process.exit())
Did you like this article? You might also be interested in the schemaless API.