Happy 2020 from Forge - make this Proxy and Cache sample your New Year's Resolution!
Christmas might be over but the fun is not, because the new year of 2020 is here! You've surely been showered with cheers and gifts already, and we hope you won't mind one more from us - a sample Node.js app with proxy and cache functionality for Forge services:
- Github: https://github.com/dukedhx/forge-proxy-nodejs
- Docker Hub: https://hub.docker.com/repository/docker/dukedhx/forge-proxy-cache
The What, Why and When
This project is intended as a sample implementation of proxy and cache functionality for your Forge workflow: it retrieves OAuth access tokens from the Forge service on behalf of the client, injects them into the requests and forwards them on to the intended Forge endpoints. Once the response comes back, it can optionally be cached in string or binary format, with expiry age and access control available for configuration.
See these articles (here and here) about the rationale and motivation for employing a proxy, as well as a simpler proxy sample here. In short, adding a proxy/cache layer to a Forge app can improve the security, stability and latency of the workflow. Access tokens are hidden completely from the clients, who can be regarded as tenants of your service, and a token can be reused across clients of workflows running under the same Forge credentials, cutting the latency of having to retrieve an access token for each and every client request - all while keeping client data strictly isolated, since your app has total control over what each client can access. Responses from the Forge endpoints are passed through to the clients without buffering or temporary persistence within the app, so as not to put extra pressure on the backend, and the contents can optionally be cached as binary or string data for instant resolution of future requests.
Such a broker/middleware approach is especially useful when multiple clients have similar business requirements fulfilled by the same Forge workflows, as that creates greater potential for the latency/performance benefits of sharing access tokens and cached responses across tenancies. The proxy service can also be deployed close to client locations to further improve latency and availability, letting us focus our networking optimization on speeding up the traffic between the proxy service and Forge.
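To make the flow concrete, here is a minimal sketch of the token-inject-and-forward idea - not the sample's actual code, just an illustration assuming Express, axios, the v1 two-legged authentication endpoint, and Forge credentials supplied via hypothetical FORGE_CLIENT_ID/FORGE_CLIENT_SECRET environment variables:

// Minimal sketch (illustrative only): fetch a two-legged token, reuse it until it
// expires, and stream requests/responses through to Forge with the Authorization
// header injected. A full proxy would also forward request bodies and relevant headers.
const express = require('express');
const axios = require('axios');

const FORGE_HOST = 'https://developer.api.autodesk.com';
let cachedToken = null; // { accessToken, expiresAt }

async function getToken() {
  if (cachedToken && Date.now() < cachedToken.expiresAt) return cachedToken.accessToken;
  const { data } = await axios.post(
    `${FORGE_HOST}/authentication/v1/authenticate`,
    new URLSearchParams({
      client_id: process.env.FORGE_CLIENT_ID,
      client_secret: process.env.FORGE_CLIENT_SECRET,
      grant_type: 'client_credentials',
      scope: 'bucket:read data:read'
    })
  );
  // Refresh a minute early so we never forward an expired token
  cachedToken = { accessToken: data.access_token, expiresAt: Date.now() + (data.expires_in - 60) * 1000 };
  return cachedToken.accessToken;
}

const app = express();
app.use(async (req, res, next) => {
  try {
    const token = await getToken();
    const upstream = await axios({
      method: req.method,
      url: `${FORGE_HOST}${req.originalUrl}`,
      headers: { Authorization: `Bearer ${token}` },
      responseType: 'stream',    // pipe the response through without buffering
      validateStatus: () => true // relay Forge's status codes as-is
    });
    res.status(upstream.status);
    upstream.data.pipe(res);
  } catch (err) {
    next(err);
  }
});

app.listen(3000, () => console.log('Proxy sketch listening on port 3000'));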
Architecture
You can either integrate this sample into your backend or deploy it as a standalone broker app to your favorite cloud platform, and even marshal a group of containers into a service cluster for greater scalability and availability. See the chart below for a reference deployment paradigm:
Setup and Run
Prerequisites
- Forge account
- Node.js (install here) or Docker (install here)
Run as Node.js app
- Clone the repo from GitHub
npm install
- Set up the client profiles (see next section for details)
npm start
- Access Forge endpoints through the broker, e.g. http://localhost:3000/oss/v2/buckets
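For a quick smoke test from Node.js you could do something along these lines - note that the header name and profile id below are placeholders, and should match whatever client profile header key/id you set up in config/default.json (see the Configuration section):

// Call the broker with the built-in http module; the broker injects the Forge
// access token and forwards the request to /oss/v2/buckets on your behalf.
const http = require('http');

http.get(
  {
    host: 'localhost',
    port: 3000,
    path: '/oss/v2/buckets',
    headers: { 'x-my-header-key': 'my-client-profile-id' } // placeholder header/value
  },
  (res) => {
    let body = '';
    res.on('data', (chunk) => (body += chunk));
    res.on('end', () => console.log(res.statusCode, body));
  }
);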
Run as container
If Node.js is not part of the picture for your environment, go with the containerized approach:
- Prepare the image
Option A: pull the image from the Docker Hub
docker pull dukedhx/forge-proxy-cache
Option B: build the image from the Dockerfile - you may customize the image by editing the Dockerfile in the sample. Navigate to the project folder and run
docker build -t NameOfYourImage .
- Run the container, setting up the client profiles and creating host directories to store the config files, cache flat files and database files:
docker run -v '/path/to/config':'/usr/src/forge-proxy-cache/config' -v '/path/to/cache':'/usr/src/forge-proxy-cache/cache' -v '/path/to/db':'/usr/src/forge-proxy-cache/db' -p 3000:3000 dukedhx/forge-proxy-cache
- Access Forge endpoints through the broker, e.g. http://localhost:3000/oss/v2/buckets
Configuration
Set the authentication and caching profiles in config/default.json:
{
  "forge_host": "https://developer.api.autodesk.com",
  "clientProfile": [
    {
      "id": "xxx",
      "header-key": "x-my-header-key",
      "authProfile": "xxx",
      "cacheProfile": "xxx"
    }
  ],
  "authProfile": [
    {
      "id": "xxx",
      "clientId": "",
      "clientSecret": "",
      "scope": "",
      "scope-header-key": "xxx",
      "redirectURL": "xxx"
    }
  ],
  "cacheConfig": [
    {
      "id": "xxx",
      "cacheConfig": [
        {
          "path": ["oss/*", "derivative/*"], // or simply 'path/to/cache'
          "age": 65535, // in seconds
          "content": "binary/string"
        }
      ]
    },
    {
      "id": "xxxx",
      "parent": "xxx" // inherit settings from a parent
    }
  ]
}
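The inline comments in the template above are annotations only. As a concrete illustration, a default.json for a single tenant using a two-legged auth profile and caching OSS responses for an hour might look like the following - the ids and credentials are placeholders, and fields the example doesn't exercise (scope-header-key, redirectURL, parent) are omitted on the assumption that they are optional:

{
  "forge_host": "https://developer.api.autodesk.com",
  "clientProfile": [
    {
      "id": "client-a",
      "header-key": "x-my-header-key",
      "authProfile": "two-legged-readonly",
      "cacheProfile": "oss-cache"
    }
  ],
  "authProfile": [
    {
      "id": "two-legged-readonly",
      "clientId": "<your Forge client id>",
      "clientSecret": "<your Forge client secret>",
      "scope": "bucket:read data:read"
    }
  ],
  "cacheConfig": [
    {
      "id": "oss-cache",
      "cacheConfig": [
        {
          "path": ["oss/*"],
          "age": 3600,
          "content": "string"
        }
      ]
    }
  ]
}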
What Next?
Here are a few areas to expand upon to complement the sample:
- Add support for other logging destinations/persistence options such as Syslog, Redis etc.
- Implement error handling with friendly error messages and fault recovery (promise rejections due to networking issues, etc.)
- Cache management capabilities such as clearing, archiving, reporting on existing cache contents
- Other bits and pieces like a management UI, graceful shutdown (fulfilling pending requests - see the sketch below), quick start deployment (templates for cloud services), etc.
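To give one example of the last point, a graceful shutdown could be as simple as the sketch below - purely illustrative, assuming access to the Node.js http.Server instance returned by app.listen():

// Sketch: stop accepting new connections on SIGTERM/SIGINT, let pending
// requests finish, and force-exit after a hard deadline.
function enableGracefulShutdown(server, timeoutMs = 10000) {
  const shutdown = (signal) => {
    console.log(`${signal} received - draining connections...`);
    server.close(() => process.exit(0));                  // fires once pending requests complete
    setTimeout(() => process.exit(1), timeoutMs).unref(); // hard deadline as a safety net
  };
  process.on('SIGTERM', () => shutdown('SIGTERM'));
  process.on('SIGINT', () => shutdown('SIGINT'));
}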
If you'd like to share what you've done with any of the above, you're welcome to chip in with a pull request. Thanks, and we look forward to working with you on all things Forge in 2020 and for many years to come!