Today I saw a colleague googling and reading about RPC, Remote Procedure Calls. It got me thinking about whether I should start working on an RPC library that I had in mind a long time ago.

There are many RPC libraries and protocols, such as:

  • Apache Thrift
  • SOAP
  • plain HTTP
  • gRPC
  • GraphQL

However, I would like to build another one. The protocol should be simple, so that it can be implemented with different technologies, but as you might already know, I will put a focus on Node.js.

Data serialization will be very important. For that I chose MessagePack (msgpack). It has implementations in many languages, benchmarks show that it is faster than JSON, and msgpack does not need a schema like protobuf does.
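To illustrate why no schema is needed: MessagePack is self-describing, the first byte of each value tells the decoder its type and size. Here is a toy sketch (the `toyEncode` function is purely illustrative, not part of any library; use msgpack5 in practice) covering two of the simplest cases from the format spec:

```javascript
// Toy encoder for two MessagePack cases, for illustration only.
// The first byte is self-describing, so no external schema is needed.
function toyEncode(value) {
  if (Number.isInteger(value) && value >= 0 && value <= 0x7f) {
    // "positive fixint": the byte is the number itself
    return Buffer.from([value]);
  }
  if (typeof value === 'string' && Buffer.byteLength(value) <= 31) {
    // "fixstr": 0xa0 | length, followed by the UTF-8 bytes
    const bytes = Buffer.from(value, 'utf8');
    return Buffer.concat([Buffer.from([0xa0 | bytes.length]), bytes]);
  }
  throw new Error('toy encoder only handles small ints and short strings');
}

console.log(toyEncode(42));   // <Buffer 2a>
console.log(toyEncode('hi')); // <Buffer a2 68 69>
```

A real decoder reads the first byte, knows immediately what follows, and needs no agreed-upon schema file on both sides.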

First I will implement a version on top of TCP and test it with reasonably large messages, around 1 MB.

The server and the client will exchange very simple messages: a request and a response.
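Concretely, the envelopes look like this (the field names `id`, `method`, `args`, `r`, and `error` match the implementation shown further down). The `id` is what lets the client match a response to its request, since many requests can be in flight at once and responses may come back in any order:

```javascript
// The two message shapes exchanged over the connection.
const request  = { id: 7, method: 'add', args: [1, 2] };  // client -> server
const response = { id: 7, r: 3 };                         // server -> client
const failure  = { id: 7, r: 'boom', error: true };       // server -> client on error

// The client keeps pending requests in a map keyed by id,
// so an out-of-order response still finds its request.
const pending = {};
pending[request.id] = request;
const matched = pending[response.id];
console.log(matched.method); // 'add'
```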

I plan to have the client download the list of available methods when connecting, and generate a local object that allows calling the functions as if they were local, without passing strings into a library function. It should also be possible to generate a type definition from the server and use it when creating the client, so that the client has TypeScript support.
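The generation step itself is small. A sketch (the names `buildApi` and `sendRequest` are mine for illustration; in the real client, the transport writes to the socket):

```javascript
// Turn a list of method names (as sent by the server) into a local object
// whose properties forward to a generic request function.
function buildApi(methodNames, sendRequest) {
  const api = {};
  for (const name of methodNames) {
    api[name] = (...args) => sendRequest(name, args);
  }
  return api;
}

// Fake transport for demonstration: just echoes what would be sent.
const api = buildApi(['add', 'greet'], (method, args) => ({ method, args }));
console.log(api.add(1, 2)); // { method: 'add', args: [ 1, 2 ] }
```

The caller writes `api.add(1, 2)` and never touches a method-name string.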

Loading the list of methods already uses the request/response protocol and implementation. Helpers could even be used to exchange the type definition when server and client live in distinct repos.

The server could also expect the client to send authentication information before accepting any other messages.
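One possible shape for that handshake (this is not part of the implementation below; `createGate` and the token check are hypothetical): the server simply ignores every message until it has seen a valid auth message.

```javascript
// Gate all messages on a connection until authentication succeeds.
function createGate(checkToken) {
  let authenticated = false;
  return function handle(msg) {
    if (!authenticated) {
      if (msg.auth && checkToken(msg.auth)) { authenticated = true; return 'ok'; }
      return 'rejected';
    }
    return 'accepted';
  };
}

const handle = createGate(token => token === 'secret'); // hypothetical token check
console.log(handle({ method: 'add' }));  // 'rejected' (not authenticated yet)
console.log(handle({ auth: 'secret' })); // 'ok'
console.log(handle({ method: 'add' }));  // 'accepted'
```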

Because many requests can run concurrently over a single long-lived connection, with very little protocol overhead, I believe the RPC library should be very efficient and fast.

When the performance is good, more tools can be built, such as:

  1. exposing the API over WebSockets
  2. consolidating the APIs of multiple microservices into a single gateway
  3. extending the protocol for push messages and subscriptions
  4. routing and load balancing, even with rules

This feature set is very similar to the tooling around ZeroMQ, but it has to be guaranteed that the first focus is the user of the API: it has to be as effective as possible for developers while providing good performance.

Implementing a first version in the evening

In less than 100 lines I was able to implement a first version using the msgpack5 module (actually two versions). And I now have a clearer picture of how streams in Node.js work and behave. More on that in a future article.

The difference between my first two versions: in the first, I processed the byte stream directly and decoded and encoded every chunk myself; in the second, I used the encoder and decoder classes of msgpack5. Only in version two was I able to process messages bigger than one chunk of bytes.

However, looking at the source of the decoder (on GitHub), I found that the lib also just tries to parse each chunk completely. When the parsing fails with an incomplete-buffer error, the chunk gets concatenated with the next one. That means that for long messages, the first part gets parsed again every time more data for the same message arrives. That seems like a waste, but this simplicity can also lead to better performance on small messages.
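The retry strategy is easy to sketch. Here is the same idea with a toy length-prefixed framing instead of msgpack (so the sketch stays self-contained; this is not msgpack5's actual code): try to parse the buffered bytes, and if the message is incomplete, keep the buffer and wait for the next chunk.

```javascript
// Buffering decoder: concatenate chunks, parse from the start each time.
// A long message gets re-scanned once per incoming chunk, as described above.
function makeDecoder(onMessage) {
  let buffer = Buffer.alloc(0);
  return function write(chunk) {
    buffer = Buffer.concat([buffer, chunk]);
    while (buffer.length >= 4) {
      const len = buffer.readUInt32BE(0);
      if (buffer.length < 4 + len) return; // "unfinished": wait for more data
      onMessage(buffer.subarray(4, 4 + len).toString());
      buffer = buffer.subarray(4 + len);
    }
  };
}

const messages = [];
const write = makeDecoder(m => messages.push(m));
const frame = Buffer.concat([Buffer.from([0, 0, 0, 5]), Buffer.from('hello')]);
write(frame.subarray(0, 3)); // first chunk: not even a full header yet
write(frame.subarray(3));    // rest arrives, message completes
console.log(messages);       // [ 'hello' ]
```

An alternative would be to remember the parse position between chunks, which avoids the re-scanning at the cost of more state.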

Before we end this, here is the code:

// we need msgpack5 and net for serialization and communication
const msgpack = require('msgpack5')();
const net = require('net');

module.exports.createServer = function createServer(api) {
  const server = net.createServer(connection => {
    // when we get a connection, we pipe the incoming data into a decoder
    // and create an encoder that is piped into the connection
    const connectionDecoder = msgpack.decoder();
    const connectionEncoder = msgpack.encoder();
    connection.pipe(connectionDecoder);
    connectionEncoder.pipe(connection);

    // thanks to the decoder, `req` is now the object that was sent by the client
    connectionDecoder.on('data', async (req) => {
      // next is calling the wanted method with the provided args and returning the result
      if (!req.method) return;
      if (!api[req.method]) return;
      try {
        const r = await api[req.method](...(req.args || []));
        // the request ID is created by the client, because the client can send many
        // requests concurrently. They can take different amounts of time on our server,
        // so the order can vary. The server doesn't need the ID, but the client reads it.
        connectionEncoder.write({ id: req.id, r });
      } catch (err) {
        console.log(err);
        connectionEncoder.write({ id: req.id, r: err.message, error: true });
      }
    });
  });

  api.__getMethods = async () => {
    return methods;
  };
  const methods = Object.keys(api);

  return server;
};

module.exports.createClient = function createClient(host, port) {
  // we will fill the api as soon as we connect to the server
  const api = {};
  function addAPI(name) {
    api[name] = (...args) => request(name, args);
  }
  // the __getMethods api is available on the server (see above)
  addAPI('__getMethods');

  const client = net.connect(port, host, async () => {
    // load the list of API names; we could do something to support nested
    // object structures, but a flat list will do for now
    const methods = await api.__getMethods();
    console.log('client:', methods);
    methods.forEach(addAPI);
    rpcClient.ready.resolve();
  });

  // the client does the same encoding and decoding as the server for its connection
  const connectionDecoder = msgpack.decoder();
  const connectionEncoder = msgpack.encoder();
  client.pipe(connectionDecoder);
  connectionEncoder.pipe(client);

  // request is the internal method to call the remote API,
  // to avoid working with strings in the API's client code
  const requests = {};
  let nextRequestId = 0;
  const request = function request(method, args) {
    const id = nextRequestId++;
    connectionEncoder.write({ method, args, id });
    // this request method does not await the response,
    // but saves the result promise on the requests object (by ID)
    return requests[id] = getDeferrer();
  };
  // when we receive a result from the server, we resolve the promise
  connectionDecoder.on('data', res => {
    if (requests[res.id]) {
      if (res.error) {
        requests[res.id].reject(res.r);
      } else {
        requests[res.id].resolve(res.r);
      }
      delete requests[res.id];
    }
  });
  const rpcClient = {
    api,
    client,
    ready: getDeferrer()
  };
  return rpcClient;
};

function getDeferrer() {
  var _resolve, _reject;
  const p = new Promise((res, rej) => {
    _resolve = res;
    _reject = rej;
  });
  p.resolve = _resolve;
  p.reject = _reject;
  return p;
}

This is the simple implementation. There are a thousand things that could be done differently, and I think you can see how. Here is some example code showing how to use the client and the server:

const { createServer } = require('./rpc'); // assuming the library above is saved as rpc.js

// server that provides an API to add two arguments together.
// as the server awaits the result of that API implementation,
// it can be an async promise function or sync like here.
createServer({
  add: (a, b) => a + b,
}).listen(8081);

On the client side, this server can be connected to and used like this:

const { createClient } = require('./rpc'); // assuming the library above is saved as rpc.js

main().catch(err => console.log(err)).then(() => process.exit());
async function main() {
  const rpcClient = createClient('0.0.0.0', 8081);
  // after ready, the APIs are loaded from the server and available on the api object
  await rpcClient.ready;
  const { api } = rpcClient;

  // call the add method directly; it is now for sure an async function,
  // as it runs on the server
  const sum = await api.add(1, 2);
  console.log({ sum });

  // the add method works for numbers and strings, just like in js
  const textSum = await api.add('hallo ', 'world');
  console.log({ textSum });

  // as we use the stream methods, even large messages work well
  await api.add('0123456789'.repeat(100000), 1);
}


Did you like this article? You might also be interested in the schemaless API.