In a recent article, I described how serverless Functions on Oracle Cloud Infrastructure can invoke each other, both synchronously and asynchronously. Although asynchronous calls are the desired state, such calls are currently possible only with quite a bit of hassle: using a Stream for queuing asynchronous requests and a Listener component that needs to be scheduled to consume messages from the Stream and act on them.
What I would like API Gateway to do is visualized in this figure:
Basically, I would like API Gateway to accept a request that means: make a call at your earliest convenience to the endpoint indicated (function or HTTP destination) using the body, query parameters and headers I have provided. I do not need a response, as I want to fire and forget. I may want to provide a forwarding address for the eventual response and a conversation or correlation identifier to be included.
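To make this concrete, here is a minimal sketch of the kind of payload I have in mind. Every field name (endpoint, responseForwardAddress, correlationId and so on) is illustrative only, not an existing API Gateway contract:

```python
import json

# Illustrative "async call request" payload; all field names here are
# assumptions of mine, not an actual API Gateway feature.
async_call_request = {
    "endpoint": "https://example.com/api/orders",   # function or HTTP destination
    "method": "POST",
    "headers": {"Content-Type": "application/json"},
    "queryParameters": {"source": "batch"},
    "body": {"orderId": 4711, "action": "process"},
    # optional: where the eventual response should be delivered
    "responseForwardAddress": "https://example.com/api/order-callbacks",
    # optional: for tracing and correlating request and eventual response
    "correlationId": "conversation-0042",
}

payload = json.dumps(async_call_request)
```

The client would send this payload, receive an immediate acknowledgement and move on; everything after that is the platform's concern.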
I would like to suggest to the team behind API Gateway that this would be a valuable feature to have – one that would enable a greatly desired quality in serverless and microservices designs: asynchronous interactions whenever possible.
A poor man’s approach – available today
The desired way of working with asynchronous, fire-and-forget requests can be achieved on OCI even today. With a Stream on the OCI Streaming Service as the ultimate decoupling point, the layout in the following figure gets me what I am looking for:
Any client application can publish a message on a Stream. This message contains an instruction to invoke a function or HTTP backend along with body, headers and query parameters required to make that call. The message may contain a forwarding address to send the response to, a conversation identifier for tracing and correlation purposes and a deferred delivery time.
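Producing such a message could look like the sketch below. OCI Streaming expects the key and value of each message entry to be base64 encoded; the envelope fields (correlationId, deliverAfter, request) are my own naming, not a prescribed format:

```python
import base64
import json
import uuid
from datetime import datetime, timedelta, timezone

def build_stream_entry(call_request: dict, defer_seconds: int = 0) -> dict:
    """Wrap an async call request in the entry shape that the OCI Streaming
    PutMessages operation expects: base64-encoded key and value."""
    correlation_id = call_request.get("correlationId") or str(uuid.uuid4())
    envelope = {
        "correlationId": correlation_id,
        # optional deferred delivery time; the listener would skip messages
        # whose deliverAfter still lies in the future
        "deliverAfter": (datetime.now(timezone.utc)
                         + timedelta(seconds=defer_seconds)).isoformat(),
        "request": call_request,
    }
    return {
        "key": base64.b64encode(correlation_id.encode()).decode(),
        "value": base64.b64encode(json.dumps(envelope).encode()).decode(),
    }

entry = build_stream_entry({"endpoint": "https://example.com/api/orders",
                            "method": "POST", "correlationId": "conv-1"})
```

The resulting entry would then be handed to the Streaming client (for example put_messages in the OCI SDK); the client application never waits for the actual call to take place.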
The Stream is a queue of requests that need to be handled. A listener (and router) needs to pull messages from that queue periodically and frequently. The listener could be implemented as a Function that simply gets the most recent X seconds' or minutes' worth of Function Call Requests and, for each, makes the required call to the API Gateway.
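The core of such a listener could look like this sketch. The messages argument mimics the list of base64-encoded values that the Streaming GetMessages operation returns, and dispatch stands in for the actual HTTP call through API Gateway; both are assumptions for illustration:

```python
import base64
import json
from typing import Callable, Iterable

def handle_messages(messages: Iterable[dict],
                    dispatch: Callable[[dict], None]) -> int:
    """Decode one batch of Function Call Requests pulled from the Stream
    and hand each one to the dispatcher for the actual invocation."""
    handled = 0
    for msg in messages:
        envelope = json.loads(base64.b64decode(msg["value"]))
        dispatch(envelope["request"])  # e.g. an HTTP call via API Gateway
        handled += 1
    return handled
```

In a real deployment, dispatch would be a function that performs the HTTP request and optionally forwards the response; here it is deliberately abstract.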
In order to ensure the periodic and frequent execution of the Message Listener Function, this function can be exposed through API Gateway at a simple public HTTP endpoint. This endpoint can be triggered easily by a Health Check set up in OCI Monitoring. This Health Check is configured with an endpoint and a schedule for invoking that endpoint once every X seconds or minutes. See this article for an example of scheduling function execution using a Health Check.
Note: it would be nice to have a way to easily record the exact offset of the latest message handled by one cycle of the listener, to be used by the next round as its starting point; a second Stream could be used by the listener to record this information, as could a database or Object Storage. OCI does not currently offer a cloud-native cache solution, which would perhaps be the best option.
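The bookkeeping itself is trivial; the open question is where to keep it. As a sketch, with store standing in for whichever backing service is chosen (a second Stream, a database table or an Object Storage object):

```python
def save_offset(store: dict, partition: str, offset: int) -> None:
    """Record the offset of the last message handled for a partition.
    `store` is a stand-in for a durable service such as Object Storage."""
    store[f"listener-offset-{partition}"] = str(offset)

def load_offset(store: dict, partition: str, default: int = 0) -> int:
    """Starting point for the next listener cycle."""
    return int(store.get(f"listener-offset-{partition}", default))
```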
A slightly more detailed figure of the implementation is shown below. It includes a second function, exposed through an API Gateway, that handles HTTP requests from clients containing instructions for calls to be made asynchronously. The function pushes events with that instruction on a Stream and immediately returns a response to the client. The Message Listener function is triggered – also through API Gateway – by the Health Check. It makes the requested call and may then forward the response received from the API to the optionally indicated forwarding address.
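The Async Request Accepter could be as small as the sketch below; publish stands in for pushing the event onto the Stream, and the response shape is my own convention (HTTP 202 Accepted is the natural status for fire and forget):

```python
from typing import Callable

def accept_async_request(body: dict, publish: Callable[[dict], None]) -> dict:
    """Accept an async call instruction, queue it and answer immediately."""
    missing = {"endpoint", "method"} - body.keys()
    if missing:
        return {"status": 400,
                "body": {"error": f"missing fields: {sorted(missing)}"}}
    publish(body)  # e.g. put the message on the OCI Stream
    return {"status": 202,
            "body": {"accepted": True,
                     "correlationId": body.get("correlationId")}}
```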
Note: the Async Request Accepter function could perform validation and authorization on the request – if it knows how to handle those for the indicated endpoint. We should also think of a way to deal with errors resulting from the asynchronously made call.
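One conceivable approach for the error side, sketched below: retry the call a few times and, when all attempts fail, record the failure somewhere durable (a dead-letter Stream would fit nicely). The function names and retry policy are illustrative only:

```python
from typing import Any, Callable

def dispatch_with_retry(request: dict,
                        call: Callable[[dict], Any],
                        record_failure: Callable[[dict, str], None],
                        max_attempts: int = 3) -> Any:
    """Attempt the asynchronous call; after max_attempts failures,
    hand the request to a failure handler (e.g. a dead-letter Stream)."""
    last_error = ""
    for _ in range(max_attempts):
        try:
            return call(request)
        except Exception as exc:
            last_error = str(exc)
    record_failure(request, last_error)
    return None
```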
Resources
My article on asynchronous calls from Function to Function through an OCI Stream
My article on scheduling function execution using OCI Monitoring Health Check