Ben Langhinrichs

Genii Weblog


Civility in critiquing the ideas of others is no vice. Rudeness in defending your own ideas is no virtue.


Fri 31 Jan 2020, 04:39 PM
 
Inline JPEG image
 
I'll be announcing some of the pieces of our new product line next week, so if you could all manage to keep the world from exploding, imploding, or generally going to hell before then, I'd appreciate it.
 
Checks the news.
 
Rats.
 
Oh well, I'll have announcements next week anyway. Stay tuned.
 

Copyright 2020 Genii Software Ltd.

Wed 22 Jan 2020, 12:08 PM
While the Node JS stuff is the newest, I'm going to start back at the REST API logic available in Domino Access Services. (If you thought I was going to post about the song by Cage the Elephant, you might have missed the gist of this blog, though I do like the song.) I'm pretty sure DAS was added in Domino 8.5.3, though the documentation is a bit scattered. The most reliable resource seems to be the one written by Dave Delay at https://github.com/OpenNTF/das-api-specs. While this is useful, it is hard not to notice that the last update was two years ago.
 
So, is DAS still relevant? I think so, though I'd love to hear from others. I do know that the AppDev Pack 1.0.3, which is very recent, has support for the OAuth DSAPI Extension as part of IAM, which supports the OAuth2 Introspection Protocol. In other words, you can use the same IAM services to authenticate a DSAPI like DAS as you can with the NodeJS stuff. Note that I say a "DSAPI like DAS" because you can write your own DSAPI to handle REST calls. That's exactly one of the things I am doing for a beta we are releasing in the not too distant future, though we are not doing this instead of Node JS, but rather in addition to it.
 
Domino Access Services for Tree Huggers (Documents and Views Edition)
Okay, so this is where I try to explain (figuring out as I go) what you can and can't do, and how you do what you can. While there is support for calendar events, busytime, mail, and directory stuff, I'm going to leave that for some other enterprising person to document; I'm going to focus on documents. I'll try not to get too geeky, as I'll have to refer back to this post and want to remember what the hell I was talking about. I'll focus on your basic CRUD (Create/Read/Update/Delete), though for the sake of what people usually do, I'll switch that to the less memorable ARCUD (Access/Read/Create/Update/Delete) since that is the order we tend to need stuff. Access is how we get to the documents in the first place.
 
0) Access - Getting a list of collections (views and folders) from a database
 
If you were paying attention in my previous post, and I hardly blame you if you weren't, you need to set up a database to allow DAS access first. Assuming you have done that, you now have a few URL endpoints you need to know about. I could just say URLs, but would you be as impressed? I thought not. In any case, you could start with
 
http://your-server.com/examples.nsf/api/data/collections
 
to get the array of views and folders. You can just type this in a browser, but technically that is doing a GET using HTTP. I downloaded Postman (because Jesse Gallagher suggested it), a free utility which supports all kinds of protocols and is easy to use. So, if I do this in Postman (specifying the port because of reasons), I get the result below. Particularly notice the red arrows. The first shows the URL endpoint and that this is a GET. The second shows the response code of 200, which means the HTTP request was accepted and responded to. The third arrow shows, in the JSON, the URL endpoint for a specific view.
 
Inline JPEG image
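For those who'd rather see it as code, here is a rough sketch of parsing the kind of response that GET returns. The server name, port, titles, unids, and hrefs below are made-up placeholders; the response shape (an array of collection entries with @-prefixed metadata) is modeled on what the screenshot shows.

```javascript
// What we would GET; server name, port, and database are placeholders.
const endpoint = 'http://your-server.com:8008/examples.nsf/api/data/collections';

// Sample response body, modeled on the shape DAS returns. Titles, unids,
// and hrefs are made up for illustration.
const responseBody = JSON.stringify([
  { '@title': 'Examples', '@unid': 'AAAA0000', '@folder': false,
    '@private': false,
    '@href': '/examples.nsf/api/data/collections/unid/AAAA0000' },
  { '@title': 'My Folder', '@unid': 'BBBB1111', '@folder': true,
    '@private': false,
    '@href': '/examples.nsf/api/data/collections/unid/BBBB1111' }
]);

// Pick out the public views (not private, not folders) and their endpoints.
const collections = JSON.parse(responseBody);
const publicViews = collections
  .filter(c => !c['@folder'] && !c['@private'])
  .map(c => ({ title: c['@title'], href: c['@href'] }));

console.log(endpoint, publicViews);
```

The @href of each entry is what we use for the next step.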
 
 
We will do a second GET with the new URL endpoint for that collection, which we can see is called "Examples", and since it is not private and not a folder, is a public view. There, we will see all the view entries retrieved. I'll focus in on the one view entry for the document I want to read. The red arrow points to the URL for the specific document. Note that instead of being accessed through /api/data/collections, it is accessed through /api/data/documents.
 
Inline JPEG image
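In code terms, picking the document endpoint out of those view entries might look like the sketch below. The field names are modeled on the screenshot; the second entry and its unid are made up.

```javascript
// Sketch: given the view-entry JSON from the "Examples" collection,
// find the document endpoint for the entry we care about.
const viewEntries = [
  { '@unid': 'AE05828B3D7CCB9585257E9F006D13BD',
    '@href': '/examples.nsf/api/data/documents/unid/AE05828B3D7CCB9585257E9F006D13BD',
    Subject: 'Better Budgies' },
  { '@unid': 'CCCC2222',
    '@href': '/examples.nsf/api/data/documents/unid/CCCC2222',
    Subject: 'Something Else' }
];

const entry = viewEntries.find(e => e.Subject === 'Better Budgies');
// Note: /api/data/documents, not /api/data/collections.
const documentUrl = entry['@href'];
console.log(documentUrl);
```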
 
 
So now, we have the URL for the specific document, and can move from Access to Read.
 
 
1) Read - Getting the contents of a Notes document and all its fields in JSON
 
When we read a document using DAS, we get a combination of system information about the document as well as the specific fields. In the image below, I numbered the arrows to make it clear what you are seeing: 1) the URL used to access this document; 2) the system fields: form, unid, noteid, creation & modification dates, authors; 3) the rich text field Body, which has been rendered as multipart MIME; 4) the plain text rendering inside that MIME; and 5) the HTML rendering inside that MIME.
 
 
Inline JPEG image
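As a rough sketch of what comes back, here is one way to split such a response into system fields and item fields. The values are abbreviated placeholders modeled on the screenshot, not the actual response.

```javascript
// Sketch: a DAS document response mixes @-prefixed system fields with the
// document's own items. Values here are placeholders.
const doc = {
  '@href': '/examples.nsf/api/data/documents/unid/AE05828B3D7CCB9585257E9F006D13BD',
  '@unid': 'AE05828B3D7CCB9585257E9F006D13BD',
  '@noteid': '90A',
  '@created': '2015-07-30T15:50:35Z',
  '@modified': '2020-01-22T16:08:10Z',
  '@authors': 'CN=Some User/O=SomeOrg',
  '@form': 'Example',
  Subject: 'Better Budgies',
  Body: { type: 'multipart', content: [] }  // the MIME parts would go here
};

// Separate system fields (@-prefixed) from ordinary items.
const systemFields = {};
const items = {};
for (const [name, value] of Object.entries(doc)) {
  (name.startsWith('@') ? systemFields : items)[name] = value;
}
console.log(Object.keys(items));  // [ 'Subject', 'Body' ]
```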
 
The decision to use MIME is not surprising, as the Notes/Domino code already supports converting from rich text to MIME, and the MIME parts can include all of the images and attachment references. See the image below for the images and attachments represented as separate MIME parts with all the images and attachments base64 encoded so they can be included in the JSON.
 
But while MIME was easy for IBM to add, it isn't ideal for a variety of reasons. The rendering isn't so great, as we'll see, and there isn't a lot of support for turning a MIME representation in JSON or JavaScript into a web page. It can be done, but it is not straightforward. There is also a real question about whether these should be included inline, especially the attachments, or referenced in a way that can be accessed on demand. When you get a web page generated by Domino, the attachment is referenced as /db.nsf/view/unid/$File/attachment.xls so that it doesn't have to get passed back and forth unless it is needed. I could not find any way to enable that with DAS, though there might be one.
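Digging the HTML rendering out of that MIME structure might look something like the sketch below. The exact item structure (type/content/contentType/data) is my assumption, modeled on the responses above, and the part contents are made up.

```javascript
// Sketch: a rich text item as DAS might render it -- a multipart item whose
// content array holds the MIME parts. Part contents are placeholders.
const body = {
  type: 'multipart',
  content: [
    { contentType: 'multipart/alternative; Boundary="=-abc123"' },
    { contentType: 'text/plain; charset=UTF-8',
      data: 'Better budgies deserve better seed.' },
    { contentType: 'text/html; charset=UTF-8',
      data: '<html><body>Better budgies deserve <b>better seed</b>.</body></html>' }
  ]
};

// Return the data of the first text/html part, or null if there isn't one.
function htmlPart(item) {
  if (!item || item.type !== 'multipart') return null;
  const part = item.content.find(
    p => p.contentType && p.contentType.startsWith('text/html'));
  return part ? part.data : null;
}

console.log(htmlPart(body));
```

Even this simple case shows why it can be done but is not straightforward: you still have to deal with nested boundaries, base64-encoded images, and attachments yourself.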
 
In any case, so far this has been fairly straightforward URL access to documents, and even those tree huggers amongst us who don't know what REST APIs are can use URLs. Let's get to the stuff that may be less obvious.
 
2) Create - Making a new document with a JSON representation
 
The URL for creating a new document is fairly simple. You just use the part without the unid/AE05828B3D7CCB9585257E9F006D13BD:
 
http://your-server.com/examples.nsf/api/data/documents
 
But if you use that URL in a browser, you will get a generic collection of all documents in the database, which is handy but not for this purpose. The reason is that the URL in a browser is always doing a GET, and to create something, we need to do a POST. This is where Postman (see where the name came from?) is useful. Let's take the actual response we got from reading the document above, and turn it around to create a new document. All we really have to do is switch from GET to POST with the exact same URL, but we can also change the JSON. (You have to copy the JSON, switch to POST, then click on 'raw' and paste it back in before modifying.) In this case, I'm going to change the subject from Better Budgies to Better Budgies 2. My tendency is to also remove the system fields for @unid and @noteid, but it isn't necessary. They are ignored when creating a new document with POST. So, let's give it a try.
 
First, here's the part of the Examples view with the Better Budgies document.
 
Inline JPEG image
 
Second, we'll copy, paste, modify the JSON and switch to POST as the method.
 
Inline JPEG image
 
Oops, do you see what I did wrong? I got the response back below.
 
Inline JPEG image
 
I forgot to get rid of the /unid/AE05828B3D7CCB9585257E9F006D13BD in the URL. When I remove that and try again, I get another error.
 
Inline JPEG image
 
That's right, when I do a POST, I have to specify what type of content I am posting, as it could be HTML or a file attachment or whatever. So, I need to add a Header, which is easy in Postman, that sets the Content-Type to application/json. I try, and this is what I get.
 
Inline JPEG image
 
The status code of 201 is a bit different from the 200 I got before. Both mean success, but 201 specifically means the content was posted and a new resource was created. Let's take a look at the view now.
 
Inline JPEG image
 
Notice that underneath the Better Budgies post, there is now a Better Budgies 2 post. You might also notice that the size is quite different. While the document should have all the same fields, you'll find that the rich text field in Better Budgies has been saved as a MIME field in Better Budgies 2. That conversion leads to some loss, which is one downside of using Domino Access Services with any application that uses rich text. Spoiler alert: that roundtrip fidelity is one of the things we specialize in at Genii Software, so the product we create will handle this better. In any case, here's the difference:
 
Inline JPEG image
 
Inline JPEG image
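Pulling the whole create step together, here is a sketch of building that POST from a document we read earlier. The noteid and field values are placeholders; the logic (strip the system fields, change the subject, set the Content-Type header) is exactly what we just walked through.

```javascript
// Sketch: turn a document we read (GET) into the body of a POST that
// creates a new one.
function makeCreateRequest(readDoc, newSubject) {
  const body = { ...readDoc };
  delete body['@unid'];    // ignored on POST anyway, but tidier to remove
  delete body['@noteid'];
  delete body['@href'];
  body.Subject = newSubject;
  return {
    method: 'POST',  // not GET -- POST creates
    // Without this header, the POST is rejected, as we saw above.
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(body)
  };
}

const original = {
  '@unid': 'AE05828B3D7CCB9585257E9F006D13BD',
  '@noteid': '90A',  // placeholder
  '@form': 'Example',
  Subject: 'Better Budgies'
};
const req = makeCreateRequest(original, 'Better Budgies 2');
console.log(req.method, req.headers['Content-Type']);
```

Remember that the request goes to .../api/data/documents, without the /unid/... suffix.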
 
 
Well, this blog post is long enough already, so I'll continue with the topic in a separate post that will cover Update and Delete.


Mon 20 Jan 2020, 03:57 PM
REST vs gRPC with logos
 
 
In this post, I'll dive briefly into a few gory details. While these are critical to me as an ISV, they are concepts most Notes/Domino developers only need to understand roughly.
 
Traditional NRPC: Communication between tightly coupled client and server
In traditional Notes development, you have been spared most of the worry about how the Notes client and Domino server communicate. They just do. We don't think about what happens when we run a script or edit a document in a server database from the Notes client. It just works. But the way it works is the Notes remote procedure call, aka NRPC. To quote the documentation:
Domino® servers offer many different services. The foundation for communication between Notes® workstations and Domino servers or between two Domino servers is the Notes remote procedure call (NRPC) service.
Because the Notes client and Domino server are tightly coupled, we don't need to worry about how NRPC works. It just works. And since both the Notes client and Domino server share a lot of the same executable code, they can balance who does what and only communicate when absolutely necessary.
 
Problem: Communication between decoupled client and server
Opening up NRPC to other systems is problematic. It can be done via the APIs, but that essentially requires that the Notes or Domino code be installed on the other system, so you really still have Notes code communicating with Domino code via NRPC. But what about when we'd like a web page on a non-Domino server to communicate with Domino? Or when we want a Notes client or Domino server to communicate with Salesforce or other software run by people not interested in installing Notes/Domino? We need other protocols.
 
HTTP - a transfer mechanism
Primarily, this means HTTP. Even a classic Domino web page communicating with Domino uses HTTP rather than NRPC, as does XPages. That is why you can sit in an Internet cafe in Brussels and work on a page served up by your Domino server in Montréal. You are using HTTP rather than NRPC. HTTP is stateless, meaning that every call is independent. A session may be maintained on the server, but every call has to prove itself part of the session. That can prove a heavy load. 
 
Solutions chosen by HCL: REST and gRPC
 
REST - a protocol for web services
In 2000, Roy Fielding wrote his doctoral dissertation on REST (Representational State Transfer), a concept he defined. The idea was that a company could create a REST API that defined efficient and easy ways to interact with its system over the existing HTTP infrastructure. If they wanted other people to use their system, as many companies did, they made it a public REST API. Instead of submitting an entire web page and having the server process it, the developer could use web services to send commands and receive back information to act on. This enabled programming in a far more interactive way.
 
While there are variants, most REST APIs use JSON as the payload and HTTP as the delivery mechanism. There are many public unsecured REST APIs, but if authentication and security are needed, most use OAuth, an open standard for authorization that can leverage your existing authentication with Facebook, Google, LinkedIn, etc. OAuth 2.0 is the current state of the art, and Domino provides that for authentication.
 
In my last blog post, I mentioned Domino Access Services. That is the REST API that first IBM and now HCL have provided to allow accessing Domino data from a web service. Using a simple URL (which is usually a GET in HTTP), you can retrieve the contents of a Domino view or document as a JSON string, which can then be parsed and used in various languages. Lots more can be done, but I'll talk about that in a different blog post. But for a simple example, here is a URL endpoint (as they are often called) to access the views and folders in a Domino database on my very local server:
 
http://localhost/mydb.nsf/api/data/collections
 
and when I use that on my database, which has Domino Access Services enabled and has the database setting below
 
Domino Access Service setting
 
then I get a response in JSON with each view or folder (aka collection) defined. You'll notice it also gives the endpoint URL to get to the contents of each collection
 
Sample JSON for collections in Domino Access Services
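A quick sketch of what you can do with that response once parsed: build a lookup table from collection name to endpoint URL. The titles, unids, and hrefs here are placeholders standing in for the real JSON.

```javascript
// Placeholder collection entries, modeled on the DAS response shape.
const collections = [
  { '@title': 'Examples', '@href': '/mydb.nsf/api/data/collections/unid/AAAA0000' },
  { '@title': 'By Date',  '@href': '/mydb.nsf/api/data/collections/unid/BBBB1111' }
];

// Map each view/folder name to the endpoint URL for its contents.
const endpoints = Object.fromEntries(
  collections.map(c => [c['@title'], c['@href']])
);
console.log(endpoints['Examples']);
```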
 
 
gRPC - a transfer mechanism and protocol
While the REST API over HTTP was sufficient for most uses, some require greater efficiency. Since many of those uses involve Google, they created their own new-but-old take on remote procedure calls for highly efficient APIs. They realized that the worst part of traditional RPCs was that the stuff being passed was so damn proprietary, full of internal data structures and so forth, but JSON wasn't fast enough and XML was worse. So, they made the whole thing open source, used protocol buffers, and defined an easy way to create and handle those protocol buffers in various languages.
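To give a flavor of the format, here is a small, entirely generic protocol buffer sketch. It is not HCL's actual definitions, just an illustration of what proto3 messages and a service look like:

```proto
// A generic illustration of a .proto file, not HCL's actual definitions.
syntax = "proto3";

message DocumentRequest {
  string unid = 1;           // which document to fetch
  repeated string items = 2; // which item names to return
}

message DocumentResponse {
  string unid = 1;
  map<string, string> items = 2;
}

service DocumentService {
  rpc ReadDocument (DocumentRequest) returns (DocumentResponse);
}
```

From a definition like this, the gRPC tooling generates the client and server plumbing in whatever language you need.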
 
When HCL started work on a Node JS project, they could have used a REST API, but decided it would be more scalable to go with gRPC. Mind you, you can install and use domino-db and never know it uses gRPC, but if you want a sense of how well it works, look in the dominodb/package/src directory:
 
Directory of JavaScript files
 
What you find is JavaScript files. No C/C++/Java. Just simple JavaScript files that pass gRPC messages up to the Domino server, where the Proton server addin is ready and waiting for messages. It handles all the nasty stuff and passes back other messages. This means that you can use the Server and Database and Document classes on any lightweight platform that allows Node JS (and therefore JavaScript).
 
The protocol buffers are defined in .proto files. Here is an example I built for one of our forthcoming products:
 
Sample .proto file for sort by row method
 
By comparison, that same method would have a JSON schema something like this if it were in a REST API:
 
Sample JSON schema for sort by row method
 
Again, you may not need to know a ton about these, but the terms will be used. If you can't get enough of this stuff and want a more general discussion of the different web services technologies, start with Kristopher Sandoval's article, When to Use What: REST, GraphQL, Webhooks, & gRPC.
 



Thu 16 Jan 2020, 02:42 PM
Inline JPEG image
 
Following up on my previous post, The tree you are busy hugging has new branches, I am starting a series of posts on how to explore those new branches. As a warning up front, I am not close to being an expert at this stuff, but maybe that is good. I hope you will follow this learning journey with me. The image above is shamelessly stolen from Bruce Elgort, who posted it today. I will attempt to teach and, with any luck, we'll both learn something. Btw, if you are one of those lofty gurus who knows more than I do (a low bar, I'm afraid), please don't hesitate to leave a comment here so I can correct things and learn myself.
 
 
Node.js, so far as I can tell, is a way of taking JavaScript and making it more robust and speedy, as well as allowing some uses that were difficult before. It is implemented as a runtime environment outside of a browser, which means you can use Node packages standalone or in non-browser applications as well as in a browser. It is event-driven, which goes along with HCL's direction (or perhaps HCL is going along with its direction?), and it is built on a compiled C++ core, so it is very fast and very scalable. As IBM or HCL might call it, Enterprise Ready.
 
Like many things in the modern web world, there are multiple and somewhat incompatible versions of absolutely everything. Thus, if you go out to download Node.js, which is freely and easily available, the version you find will be either 12.x.x or 13.x.x. But, of course, those aren't the versions you want. You will need the latest version of 10.x, which you can find at the straightforwardly named https://nodejs.org/dist/latest-v10.x/. As much as you can trust anything these days, you can trust downloads from nodejs.org. Some versions of Node.js are marked with an LTS designation. LTS means Long Term Support, and it basically implies that an Enterprise wanting Enterprise Ready stuff can rely on that version in the long term. Thus, 10.x is dubbed LTS, and specifically "Dubnium", because everything on the web has to have a silly nickname so that you know developers are cool froods and not money-grubbing capitalists.
 
 
npm is the grand installer of things in the JavaScript and Node world. In JavaScript and Node, there are packages, which you add in much the same way you might add a LotusScript library, though they can be publicly or privately available. You may be familiar with packages from the plethora of JavaScript frameworks. (I hate the word framework. It brings to mind too many failed Lotusphere sponsors who over-promised and under-delivered, all while wearing excessive amounts of black clothing and serious expressions.)
 
Fortunately, npm comes along with Node.js by default, so if you install Node.js, you will get npm for the same low, low price of free with no extra effort. 
 
 
The AppDev Pack is HCL's bundling of a few things you'll need to run Node JS stuff with HCL Domino 11. You can download the AppDev Pack 1.0.3 from FlexNet, where you get your Domino 11 downloads. The AppDev Pack includes the following components. More documentation can be found here.
 
  1. Proton - The server addin which handles all the Notes/Domino logic. In true client/server fashion, the Node JS stuff from the client is packaged up and sent to Proton. Then, the DQL is resolved, the reads/writes are executed, and the return values go back to the client.
  2. domino-db - This is the package which your Node code will reference directly. It has six classes, though that is a bit deceptive. Almost everything in the package is for CRUD operations, though some of these are very powerful as they can do bulk actions on many documents.
    1. The DominoDB class is kind of like the NotesSession class in LotusScript in that it is instantiated once and then other stuff is accessed using it. It has one method and no properties. 
    2. The Server class is created using the useServer method from the DominoDB class. It has three methods, but only one is a true method, which gives you the Database class. The other two are gets for properties.
    3. The Database class is created using the useDatabase method from the Server class. This is the closest you get to a true class like the NotesDatabase class in LotusScript. It has eighteen methods and a couple of other gets for properties. Most of the methods are for bulk operations based on DQL selections.
    4. The Document class is created using the useDocument method from the Database class. This is a parallel to the NotesDocument class in LotusScript, and has seven methods for accessing, replacing, and deleting items and attachments.
    5. The BulkResponse class and DominoDBError class are both encapsulated ways to hand back information on the results of an action. The former lets you know the status of a bulk operation while the latter lets you know a bit about what went wrong when something did go wrong. 
  3. IAM - This is an authentication piece which allows for OAuth 2.0 scopes to be authenticated to user ids in the ID Vault (or locally on the server?). I don't know a heck of a lot about how it works, but it has three components:
    • domino-iam-service: A Node.js based lightweight server, which is deployed along with a Domino server to provide the whole IAM service.
    • oauth-dsapi-extension: A Domino extension to enable Domino to trust IAM and consume tokens that IAM grants to your application.
    • node-iam-client: A Node.js module to assist your Node.js application to talk with IAM. Java, .Net, and other applications can access IAM directly through IAM's RESTful APIs.
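Based on the AppDev Pack documentation, the domino-db classes described above chain together roughly as in this sketch. The host name, port, and file path are placeholders, and nothing here will actually run without a Domino server with Proton and the @domino/domino-db package installed.

```javascript
// Placeholder server configuration; adjust for your own environment.
const serverConfig = {
  hostName: 'your-server.com',   // where the Proton addin is listening
  connection: { port: '3002' }   // Proton's gRPC port, not the HTTP port
};

// Sketch of the class chain: DominoDB -> Server -> Database -> bulk read.
async function readExamples() {
  const { useServer } = require('@domino/domino-db');  // AppDev Pack package
  const server = await useServer(serverConfig);        // Server class
  const database = await server.useDatabase({          // Database class
    filePath: 'examples.nsf'
  });
  // A bulk operation driven by a DQL selection:
  return database.bulkReadDocuments({
    query: "Form = 'Example'",
    itemNames: ['Subject']
  });
}
```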
 
 
Domino Access Services is a DSAPI way of reading/writing Domino data using REST web services. It has been around a long time, but I think the intention is to have it get more powerful as part of HCL's modernization of web development in V11 and beyond. One of the long-time gripes about DAS has been its security, and it appears that the IAM service allows for greater adherence to user-based security in DAS, meaning that it will get used a bunch more.
 
This may all be a lot for us tree huggers, but we can get it. For now, keep this post in mind as a resource. If you have questions or comments or thoughts, or even just want me to know I am not alone on my journey, please leave a comment.
 



Wed 15 Jan 2020, 11:58 AM
Arms hugging tree
 
As a 57-year-old developer who has spent 24 of those years working with Lotus/IBM/HCL Notes/Domino, I am quite used to being seen as a tree hugger, and that may even be the kindest appellation applied to me. Let's face it, a whole bunch of you fit the name as well. But somehow, miracle of miracles, the tree has come back to life a bit. This might tempt us to hug even harder, thinking we've been proven right and not just stubborn. (Nah, we're just stubborn, but we got lucky.)
 
But that Notes/Domino tree you've been hugging so tightly has sprouted some new branches, and if you want to keep hugging it for the next decade or so until you can put your feet up and rock those grandkids in the rocker, you (and I) might want to explore the new branches.
 
We are obviously long past the time when the web itself is new, and we've all watched and even participated in the birth and almost-death and, perhaps, the almost-not-death of XPages. But there's more out there, and it isn't all going to be in LotusScript and formula language and classic Domino. Those will continue, and you will still be able to hug that limb for a while to come, but I am determined to tread out onto some other branches. HCL has poured a lot of time and energy into the architecture for a new, more modern approach to Domino. The AppDev Pack with Proton and IAM and domino-db are just starting to develop, and clearly have a lot of room to grow. Full stack developers are more and more able to use the NERD/DERN stack (meaning Domino as a secure data backend keeping the sap in that tree you are squeezing half to death). We are just at the beginning stages of HCL's event-driven model, but that promises a ton of power and flexibility. There's more, as well, which I'll go into another time.
 
As a Notes/Domino developer, I'm going to start playing with Node JS more, and working on web services and REST APIs. But I'm not just a Notes/Domino developer. I'm also an ISV, and an impatient ISV at that. I could sit around and wait for HCL to fully take advantage of the architecture and model they've built, but where's the fun in that? Instead, I am designing and building products in all of these areas, but if you want your company to be able to take advantage of a turbocharged AppDev Pack, for example, you better learn enough Node JS to be able to use it. If you want a REST API that will let you change and render almost anything, you'll first need to get your head out of the LotusScript library long enough to figure out how a REST API works. You might want to at least read a bit about HCL's event-driven architecture during your lunch break.
 
Trust me, your LotusScript and formula language will still be sitting there when you get back (and we at Genii Software will still be enhancing them), but the tree is growing and you could at least consider hugging some different parts of it. Consider it job security while your rocker is on order.
 

