Branching out: Models, msgs, and microservices
Tue 18 Feb 2020, 01:15 PM
by Ben Langhinrichs
As you may have noticed in this series before, I am entering from a position of barely-concealed ignorance, with the hope that if I can make sense of this, so can others brought up on LotusScript, formula language, and the like. Experts may scoff, or even write their own far more detailed blog posts, but I hope I can speak to you with the humility of one who knows only a bit more than you may.
Two Models for Client/Server
Model 1: Mutual Power with Data Interchange
In the client/server model which Notes/Domino uses, each side has a lot of power and a lot of code. If you are editing a document, for example, most of the processing happens on the client. If the database is local, everything happens on the client. If the database is on a server, data is transferred down, but most processing happens on the client, with occasional bursts of data back and forth to update things or retrieve things.
I discussed the REST API in Domino Access Services in my posts, Part 1 and Part 2, but want to step further back so we can see the forest instead of the trees. From a high-level point of view, this REST API depends on a similar model. You can access the contents of a document, then process it locally, and finally send some or all of the contents back to update the document. The mechanism and calls may be different, but the philosophy is the same.
Model 2: Pass processing to the server
In a thin client/server model, there is still an exchange of data, but it is minimized. Most of the data, and most of the processing, stays on the server. I mentioned gRPC in my earlier post on terminology, and won't go into much more detail here, but it is the protocol which allows the thin JavaScript layer on the client to essentially call procedures on the server. This is done with gRPC messages, each highly structured (and compact), in a format known and agreed to between the client and server. In the AppDev Pack, this is domino-db's role. Let's look at an example of how a LotusScript agent and a domino-db script might differ, both in coding and in what is happening.
Comparing the Models
Let's compare the code and what is happening with a specific task, that of marking active projects as needing reassignment when a person retires.
Model 1: LotusScript (or REST API)
Let's look at what is sometimes, though not always, a wildly inefficient approach. It is also quite common because it is simple, and it might even be a reasonable approach depending on a couple of factors.
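As a concrete sketch, an agent along these lines might look like the following. The view name, field names, and values ("Projects", Status, AssignedTo, "Grace Kelly", "Reassign") are placeholders I made up for illustration, not anything prescribed.

    ' Hypothetical agent: the view, field names, and values are made up for this example
    Sub Initialize
        Dim session As New NotesSession
        Dim db As NotesDatabase
        Dim view As NotesView
        Dim doc As NotesDocument

        Set db = session.CurrentDatabase
        Set view = db.GetView("Projects")    ' the view already selects Form = "Project"

        Set doc = view.GetFirstDocument
        While Not (doc Is Nothing)
            ' Every document is pulled down to the client just to be inspected
            If doc.Status(0) = "Active" And doc.AssignedTo(0) = "Grace Kelly" Then
                doc.Status = "Reassign"
                Call doc.Save(True, False)
            End If
            Set doc = view.GetNextDocument(doc)
        Wend
    End Sub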
This cycles through the view of projects. Note that this means the server has already limited the documents to those with a Form of "Project". For each entry, it loads the entire document into memory and checks the status and the assigned user before deciding whether to change the status and save the document.
Let's assume the database is on the server, there are 2000 projects, 600 of which are Active, and 7 of which are Grace's. This agent will load 2000 documents into memory on the client, meaning all that data is transferred down to the client. Only 7 are saved, after which those 7 are transferred back up. If we assume that each document is 2000 bytes, then about 4MB were transferred down to the client. Most processing happened on the client, but there wasn't much processing, so this is mostly a waste.
There are obviously ways to make this more efficient, such as creating an "All Active Projects" view, or shifting more of the processing to the server by executing a NotesDatabase.Search. But the point is, whatever is to be worked on is passed to the client, and updated results are passed back. You could also use a local replica, and then the only transfer of data would be the replication of the 7 documents, but you would have to maintain the local replica with all the other changes that might happen, so the cost might simply be delayed.
The REST API is very similar. You can retrieve documents, check the values, and update the ones you need to, but the processing still happens on the client.
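To make that concrete, here is a rough sketch of the same loop over the Domino Access Services data API, written for Node's built-in fetch. The host, database path, view name, credentials, and field names are all placeholders, and you should check the DAS documentation for the exact endpoints and paging options your server exposes; paging of the view entries is omitted here for brevity.

    // Rough sketch only: host, database, view, credentials, and field names are
    // placeholders, and paging of the view entries is omitted for brevity.
    const base = 'https://your.server.com/projects.nsf/api/data';
    const headers = {
      'Authorization': 'Basic ' + Buffer.from('user:password').toString('base64'),
      'Content-Type': 'application/json',
    };

    async function reassignGracesProjects() {
      // 1. Read the view entries (the view already limits us to Project documents)
      const entries = await fetch(`${base}/collections/name/Projects`, { headers })
        .then((res) => res.json());

      for (const entry of entries) {
        // 2. Pull each document down to the client just to inspect it
        const doc = await fetch(`${base}/documents/unid/${entry['@unid']}`, { headers })
          .then((res) => res.json());

        if (doc.Status === 'Active' && doc.AssignedTo === 'Grace Kelly') {
          // 3. Send the changed item back up for the handful that match
          await fetch(`${base}/documents/unid/${entry['@unid']}`, {
            method: 'PATCH',
            headers,
            body: JSON.stringify({ Status: 'Reassign' }),
          });
        }
      }
    }

    reassignGracesProjects().catch(console.error);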
Model 2: domino-db
This sort of matching and changing of values is ideal in the client/server model used by domino-db. The code below does exactly the same as the LotusScript code, but virtually no processing happens on the client. Instead, a gRPC call is made which sends up the query and the replacement values. The PROTON task then uses DQL to find the appropriate documents with the query, and changes the status on the 7 documents found. An array of UNIDs is passed back. Probably no more than a few hundred bytes are sent either way.
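Here is a sketch of what I have in mind. The server settings, database path, and item names are placeholders, and the bulkReplaceItems option names are from memory, so check them against the AppDev Pack documentation for your version.

    // Sketch only: server, database, and item names are placeholders, and the
    // bulkReplaceItems options should be checked against the AppDev Pack docs.
    const { useServer } = require('@domino/domino-db');

    const serverConfig = {
      hostName: 'your.server.com',       // placeholder
      connection: { port: '3002' },      // the port PROTON listens on
    };

    useServer(serverConfig)
      .then(async (server) => {
        const database = await server.useDatabase({ filePath: 'projects.nsf' });

        // One gRPC call: PROTON runs the DQL query on the server and replaces
        // the Status item in every matching document. Only the query and the
        // replacement items go up; only the list of updated documents comes back.
        const response = await database.bulkReplaceItems({
          query: "Form = 'Project' and Status = 'Active' and AssignedTo = 'Grace Kelly'",
          replaceItems: { Status: 'Reassign' },
        });

        console.log(response);   // includes which documents (UNIDs) were updated
      })
      .catch((error) => console.log(error));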
Obviously, there are other sorts of tasks that are poorly suited to domino-db, and that simply aren't possible given its very limited API. But I hope this at least explains the logic for using it in some cases, especially when you don't want or can't have the Notes client on your device.
Messages and Microservices
Now that we understand the general idea of the two models, I wanted to expand a bit on where I see this going based on presentations at the factory tour, things I have read, and my intuition. So, if I have it all wrong, that's on me.
It is not really correct to say that the gRPC messages are "sent to the server". In a sense, they are broadcast to the network on a particular port. The PROTON task happens to be listening on that port, so it reads and processes the request, then posts its own message back to the network where the client is listening. But what if somebody else were listening as well? I don't mean this in terms of security, because the IAM Service is used for encryption and the communication is very secure. What I mean is, what if multiple PROTON tasks were running on the network listening for messages? This is a simplification, as I am no good at network stuff, but gRPC does allow for the first available process to pick up the message and deal with it. In theory, that could mean multiple PROTON tasks on different servers, but that is the least of it. Instead, think of gRPC messages being posted so that the appropriate processor would handle them. Think of the server as likewise posting small tasks to the network as gRPC calls that might be handled by different processes, including third-party processes built specifically for the purpose.
This is the idea behind microservices. Let's say we needed a "tone analysis" such as the Watson folks describe, which ensures that our email is not overly negative or abusive. The router could post the text of the message for tone analysis, and whichever service intercepted the request could provide the analysis back, which the router could then use to either send the message on or pass it to somebody for examination, or whatever. But if something else, such as a message in Sametime or a discussion posting, needed tone analysis, that process could post the text, which would be picked up by the same sort of tone analysis service and sent back. It might be handled differently on return, but the service watching for tone analysis requests wouldn't care. It would do its job without worrying about who asked or why.
So, the client/server model used in domino-db is one of the earlier steps down the road to breaking up the monolithic functionality of both the Domino server and the Notes client, and instead enabling microservices that each handle one job well and can be updated separately from the monolith.
As an ISV, I find this exciting. I am already looking at ways to use this model, and possibly participate in this microservice network with our Exciton products. But whether you are interested in that or not, it would behoove you to understand where the development (and administration) highway is heading.
Copyright © 2020 Genii Software Ltd.