The beginnings of DevSecOps for NodeRED

In the beginning…

Since completing our work on running NodeRED on OpenShift (see here), Mark Taylor and I have continued to look at what else we can do with NodeRED. One of the threads Mark progressed was looking at how we can improve our deployment / management process to align more closely with a DevSecOps approach, and between us we have been testing this out.

The following content has been created from a walkthrough document that Mark shared with me, so most of the credit for this work goes to him :-) I have enhanced Mark's approach with some tweaks around the management of the SSH layer.

First let me describe Mark's NodeRED approach using one of his architecture models.

NodeRED configuration overview

As you can see, Mark is using the “Projects” feature in NodeRED. This allows the flow files used by NodeRED to be held in and managed by Git. To allow the Git repository to drive the deployment of a NodeRED instance in OpenShift, Mark adds a Dockerfile and a settings.js file. I have expanded on this with a package.json file to control the startup of the Flow Editor. The end state configuration consists of two NodeRED environments:

1) A Flow Editor environment where flows can be authored and saved to Git
2) A runtime deployment which is built from the Git repository

Also, before I get into the main body of the content, I just want to flag that I am by no measure an expert with Docker and OpenShift, so I suspect there are cleaner ways to achieve what I have done.

Setting up NodeRED Flow Editor environment

The first step is to create the Flow Editor environment, as this will initialise the Git repository so that we can then augment it to allow deployments to be driven off of that repository. The Flow Editor's role is to provide access to the NodeRED flow editor interface so flows can be created and stored. On top of this it uses the Projects feature, which allows flows to be stored in a GitHub repository. Once stored in Git, these NodeRED flows can be managed as part of a CI/CD pipeline.

To build the Flow Editor we need a Git repository to which we can add the Dockerfile and settings.js file that define the build [NB: this is not the repository which will be used to hold / manage the flows created by the Flow Editor, it's just needed to create the Flow Editor]. In addition, an access token needs to be created for this repository to allow OpenShift to access it to “build” the NodeRED environment. The following shows sample Dockerfile and settings.js files.


FROM nodered/node-red
COPY settings.js node_modules/node-red/settings.js
COPY package.json .
USER root
RUN apk add --no-cache inotify-tools
COPY ssh-sidecar/ /tmp
RUN chmod a+rwx /tmp/
RUN chgrp 0 node_modules/node-red/settings.js && \
    mkdir /data/.ssh && \
    chgrp -R 0 /data && \
    chmod -R g+rwX /data
USER 1001


module.exports = {
    uiPort: process.env.PORT || 1880,
    mqttReconnectTime: 15000,
    serialReconnectTime: 15000,
    debugMaxLength: 1000,
    flowFile: 'flows.json',
    adminAuth: {
        type: "credentials",
        users: [{
            username: "mark",
            password: "$2b$08$sv6U21t.VGS71wrXLo.SrO6tpiFexGhWqmPUZUgd5CKz93bwTS9N6",
            permissions: "*"
        }, {
            username: "admin",
            password: "$2b$08$w3b03WJJMGCig8G6Q0GhmOgSluy3zi./D6ZR.yEXrt6AgQA1paq3W",
            permissions: "*"
        }]
    },
    functionGlobalContext: {
    },
    exportGlobalContextKeys: false,
    logging: {
        console: {
            level: "info",
            metrics: false,
            audit: false
        }
    },
    // Customising the editor
    editorTheme: {
        page: {
            title: "Node-RED Flow Editor"
        },
        header: {
            title: "Node-RED Flow Editor"
        },
        projects: {
            enabled: true
        }
    }
}


{
    "name": "node-red-docker",
    "version": "1.2.0",
    "description": "Low-code programming for event-driven applications",
    "homepage": "",
    "license": "Apache-2.0",
    "repository": {
        "type": "git",
        "url": ""
    },
    "main": "node_modules/node-red/red/red.js",
    "scripts": {
        "start": "/tmp/",
        "debug": "node --inspect=0.0.0.0:9229 $NODE_OPTIONS node_modules/node-red/red.js $FLOWS",
        "debug_brk": "node --inspect=0.0.0.0:9229 --inspect-brk $NODE_OPTIONS node_modules/node-red/red.js $FLOWS"
    },
    "contributors": [
        { "name": "Dave Conway-Jones" },
        { "name": "Nick O'Leary" },
        { "name": "James Thomas" },
        { "name": "Raymond Mouthaan" }
    ],
    "dependencies": {
        "node-red": "1.2.0"
    },
    "engines": {
        "node": ">=10"
    }
}

In the above package.json it's key to note that I have changed the “start” script to call my own startup script in /tmp. This allows me to set up the file watches I need (more on this later).

The OpenShift commands to create the flow editor application are quite simple (NB: I had previously created an SSH key to access my Git repository and the private key is stored in a file named “ocp-access”).
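For completeness, the ocp-access key pair referred to here can be created with ssh-keygen; a minimal sketch (the comment string and filename are illustrative, and -N '' gives an empty passphrase):

```shell
# Generate an ed25519 key pair with no passphrase; the private key lands in
# ./ocp-access and the public key (registered with the Git repository) in
# ./ocp-access.pub
ssh-keygen -t ed25519 -C "floweditor/repo@github" -f ocp-access -N ''
```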

#### Define NodeRED Credentials key
#### Create a project for the Flow Editor
oc new-project floweditor \
--display-name="Flow Editor" \
--description="NodeRED Flow Editor environment"
#### Create the secret that will allow the builder to access github
oc create secret generic nodered-floweditor-repo-at-github \
--from-file=ssh-privatekey=ocp-access
oc secrets link builder nodered-floweditor-repo-at-github
#### Build the new app
oc new-app <git-repo-ssh-url> \
--source-secret nodered-floweditor-repo-at-github \
--name floweditor
#### Create an https route to the app
oc create route edge floweditor --service=floweditor

One thing that did catch me out when I started to use Mark's approach was that the oc commands are executed locally, which means that when the new-app command runs, the access to Git is also performed locally. As I had generated a new Git access SSH key pair and didn't want to add it to my .ssh config, I needed to execute the new-app command in an ssh-agent shell as shown below.

ssh-agent bash -c 'ssh-add ocp-access; \
oc new-app <git-repo-ssh-url> \
--source-secret nodered-floweditor-repo-at-github \
--name floweditor'

Once deployed to OpenShift we need to make some additional changes. The OpenShift default deployment strategy is to perform a rolling update (new and old pods running at the same time). We need a stop/restart strategy instead, due to the Write Once access mode of the persistent volume mount (/data).

oc patch dc/floweditor --patch '{"spec":{"strategy":{"type":"Recreate"}}}'

We also need to update the existing container volume mount for /data to use persistent storage

oc set volume dc/floweditor --add --name=floweditor-volume-1 \
--type=pvc --claim-size=512M \
--claim-class=ibmc-vpc-block-general-purpose \
--mount-path=/data --overwrite

and create a container volume mount for /.ssh to use persistent storage

oc set volume dc/floweditor --add --name=floweditor-ssh \
--type=pvc --claim-size=12M \
--claim-class=ibmc-vpc-block-general-purpose \
--mount-path=/.ssh

Finally, I created two supporting shell scripts to manage the SSH layer. What I found when I started using Mark's approach was that as I configured the Git environment in NodeRED I would hit issues accessing it. After some digging around I discovered that the issue was related to the target Git servers not being part of the “known_hosts” file and the created SSH access keys not being part of the SSH configuration [NB: Mark didn't experience the issue with the SSH key config so this may just be an issue I am seeing]. The two shell scripts address this by creating “watches” on the projects folder in the running NodeRED container. When new projects / settings are added they are detected and reflected in the container's environment. These changes are stored in the backing persistent volume so are preserved across pod restarts. The two scripts are shown below.

#!/bin/sh
# Start NodeRED
node $NODE_OPTIONS node_modules/node-red/red.js $FLOWS "--userDir" "/data" &
# Wait for NodeRED to start
sleep 10
# Start the key create watcher script in the background
/tmp/ &
# Set up watch on projects and create sub watches if project created
# (the projects folder is assumed to be /data/projects)
TARGET=/data/projects
inotifywait -m -e create --format "%f" $TARGET \
| while read FILENAME
do
  # Start watching the new project
  echo "Start wait on "$FILENAME
  PROJECT=$TARGET/$FILENAME
  inotifywait -m -e create --format "%f" $PROJECT \
  | while read FILENAME
  do
    case "$FILENAME" in
      .git)
        # Check for a git directory and watch it
        echo "got git ->"$FILENAME
        inotifywait -m -e modify --format "%f" $PROJECT"/"$FILENAME \
        | while read FILENAME
        do
          case "$FILENAME" in
            config)
              # Process config change
              echo "got git config ->"$FILENAME
              # Back up known_hosts file
              cp /data/.ssh/known_hosts /data/.ssh/known_hosts.old
              # Use keyscan to populate known_hosts
              ssh-keyscan $(cat $PROJECT/.git/config | grep 'url =' | cut -f 2 -d '@' | cut -f 1 -d ':') >> /data/.ssh/known_hosts
              # Remove duplicate entries (known_hosts.new is a scratch file)
              sort /data/.ssh/known_hosts | uniq > /data/.ssh/known_hosts.new
              # Copy in new known_hosts file
              cp /data/.ssh/known_hosts.new /data/.ssh/known_hosts
              ;;
          esac
        done
        ;;
    esac
  done &
done
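The keyscan step derives the bare Git host name from the project's .git/config “url =” line, and the sort | uniq pipeline collapses duplicate known_hosts entries. Both can be checked in isolation against sample text (the repository URL and key entry are illustrative):

```shell
# A url line as it appears in .git/config (illustrative repository)
URL_LINE="url = git@github.com:example-org/example-repo.git"
# Field 2 after '@', then field 1 before ':' -> the bare host for ssh-keyscan
HOST=$(echo "$URL_LINE" | cut -f 2 -d '@' | cut -f 1 -d ':')
echo "$HOST"    # prints github.com
# sort | uniq collapses duplicate known_hosts entries into one
printf 'github.com ssh-rsa AAA\ngithub.com ssh-rsa AAA\n' | sort | uniq
```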

# Wait for files to be created in .sshkeys
# (assumed to be the NodeRED project key store /data/projects/.sshkeys)
TARGET=/data/projects/.sshkeys
PROCESSED=/data/.ssh
inotifywait -m -e create --format "%f" $TARGET \
| while read FILENAME
do
  # Process the new key
  echo "copy "$FILENAME
  # copy to .ssh
  cp $TARGET/$FILENAME $PROCESSED/
  case "$FILENAME" in
    *.pub)
      # Use .pub version to get name and add to ssh config
      echo "IdentityFile /data/.ssh/"$(echo $FILENAME | cut -f 1 -d '.') >> $PROCESSED/config
      ;;
  esac
done
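The config entry written for each new key simply strips the extension from the .pub filename; checked in isolation (the key name is illustrative):

```shell
FILENAME="acesshkey.pub"    # illustrative key file name
# Drop everything after the first '.' to recover the private key name
echo "IdentityFile /data/.ssh/"$(echo $FILENAME | cut -f 1 -d '.')
# prints: IdentityFile /data/.ssh/acesshkey
```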

This approach works but I’ve not tested all possible outcomes :-)

Once NodeRED is deployed, the editor can be accessed via the created edge route. Within the editor a project needs to be configured. This is done by going to the menu tab, then Projects → New, then clicking on “Create Project” [NB: On the first login to the Flow Editor you will be presented directly with the Create Project screen].

Access Create New Project

You will then be presented with the following screen.

Create Project

After clicking on “Create Project” you will see the following

Provide user details

Any username can be specified; I used one of my email addresses. Press “Next”.

Project details

Enter details for the project and click “Next”.

Project files

Do not alter the name of the flow file here, just click “Next”.

Project created

The encryption options should already have been pre-selected. Make sure you make a note of the custom key you enter!

After you click “Create Project” the process is complete and the project is created.

Project created

Once the project is created we need to update some of the project settings by going to the menu tab → Projects → Settings

Project settings

Go to the Settings tab and click “add remote” to add a remote Git repository.

Enter the name of the GitHub repository (SSH form) and click the add remote button followed by “Close”.

Add remote

To allow the Flow Editor to connect with the remote repository it is necessary to set up SSH keys. If you try to reuse a key in a different repository you will get the message ‘Key is already in use’.

In this example we will use a key name of acesshkey; a more descriptive name for each key should be used, especially as you have to pair it with a repository.

To do this go to the menu tab and select settings (not project→settings).

Access settings

Click on the Git config tab and then the add key button.

Add key

We will call the key acesshkey and will set it up without a passphrase. Click “Generate key” to generate the new SSH key.

New key

Copy the SSH key to the clipboard ready for the next step. Now go to the Github repository and click on the settings button.

Github repository settings

Under the Options column click on the “Expand” next to Deploy Keys and fill in the details.

Add new key

Enter the name of the key (acesshkey) as the title and paste the SSH key into the key field. MAKE SURE to tick the “Allow write access” box.

After clicking “Add key” you will see the new key listed in the “Deploy keys” screen.

New deploy key

The “magic” shell scripts will process in the background and make sure you can access your configured git repo from your Flow Editor.

Creating a git managed flow

So with the Flow Editor up and running, let's set about creating a flow. We need to get a flow managed under Git in order to move on to the next step of deploying a runtime instance.

Once we have a flow the associated project changes can be viewed by clicking on the “Project History” icon in the top right hand corner of the UI. This will display the “history” panel and any “Local Changes”. To add the changes click on the “+” icon as shown below.

Add change

With the change added we now need to commit the change locally. You can see the change waiting to be committed. Click on the “commit” button to display the commit description field. Provide a sensible description of the change and press the “Commit” button below the description field.

Commit changes

So we now have a locally committed change, but we need to push this into Git. To achieve this, click on the “Commit History” section header to expand the list of changes. Once opened, click on the “Manage remote branch” up and down arrow icon.

Manage remote branch

As my git repo was empty I didn't have any branches so I entered master into the “remote branch” entry field.

Access remote branch

Next I clicked on the “Create branch” element.

Create new branch

Once the branch was created I was able to press the “Push” button to push my local changes to my new branch.

Push local changes to git

Once the push is complete the “Commit History” panel will be updated to show that the change is on the remote branch as well as the local one.

Updated commit history

If we now look into the git repository we can see the freshly committed files.

Committed files
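The UI sequence above (add change → commit → create remote branch → push) mirrors the standard Git cycle. A self-contained sketch of the same cycle, using a scratch bare repository in place of GitHub (all paths, names, and the flow file content are illustrative):

```shell
# Scratch bare repo standing in for the GitHub remote
REMOTE=$(mktemp -d)
git init -q --bare "$REMOTE"
# Local project directory, as the Flow Editor would have it
WORK=$(mktemp -d)
cd "$WORK"
git init -q
git checkout -q -b master
git config user.email "demo@example.com"
git config user.name "Demo"
echo '{}' > flow.json
# "+" icon -> git add ; "Commit" button -> git commit
git add flow.json
git commit -q -m "Add initial flow"
# "Create branch" + "Push" -> push local master to the new remote branch
git remote add origin "$REMOTE"
git push -q -u origin master
# The remote branch now carries the commit
git log --oneline origin/master
```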

NodeRED runtime deployment

There are a number of steps to take the ‘code’ committed in the GitHub repository during the last step and create a running application in OpenShift.

To create a container application we are going to create a Dockerfile and some NodeRED files that define the runtime environment. When the Flow Editor pushes to the remote GitHub repository it creates five files, as shown in the picture above.

To create a running application we need to create a settings.js file. A full settings file can be obtained from here

The main changes needed to the base file are shown below. The most critical point is that the Projects feature must be disabled.

module.exports = {
    // The file containing the flows. If not set, it defaults to
    // flows_<hostname>.json
    flowFile: 'flows.json',
    // By default, credentials are encrypted in storage using a generated key. To
    // specify your own secret, set the following property.
    credentialSecret: process.env.NODE_RED_CREDENTIAL_SECRET,
    // Securing Node-RED
    // -----------------
    // To password protect the Node-RED editor and admin API, the following
    // property can be used. See for details.
    adminAuth: {
        type: "credentials",
        users: [{
            username: "admin",
            password: "$2a$08$zZWtXTja0fB1pzD4sHCMyOCMYz2Z6dNbM6tl8sJogENOMcxWV9DN.",
            permissions: "*"
        }]
    },
    // The following property can be used to seed Global Context with predefined
    // values. This allows extra node modules to be made available with the
    // Function node.
    // For example,
    // functionGlobalContext: { os:require('os') }
    // can be accessed in a function block as:
    // global.get("os")
    functionGlobalContext: {
    },
    // Customising the editor
    editorTheme: {
        page: {
            title: "NodeRED Runtime"
        },
        header: {
            title: "NodeRED Runtime"
        },
        projects: {
            // To enable the Projects feature, set this value to true
            enabled: false
        }
    }
}

The title values should be set to an appropriate name. Also note the reference to an environment variable, NODE_RED_CREDENTIAL_SECRET; when we build the container runtime we will have to provide the secret used to encrypt the credentials.

We also need to replace the contents of the package.json file with a version suitable for a container. The main difference is that it defines the start script and parameters. This should be copied from here. The name and description from the original file should be reapplied.

The container is based on the existing Node-RED Docker image with some small additions to incorporate our amended settings.js and package.json files.

FROM nodered/node-red
# Copy package.json to the WORKDIR so npm builds all
# of your added nodes modules for Node-RED
COPY package.json .
RUN npm install --unsafe-perm --no-update-notifier --no-fund --only=production
# Copy _your_ Node-RED project files into place
# NOTE: This will only work if you DO NOT later mount /data as an external volume.
# If you need to use an external volume for persistence then
# copy your settings and flows files to that volume instead.
#### We are going to create an ephemeral /data2 and change the ENTRYPOINT to use /data2
USER root
RUN mkdir /data2 && \
    chown -R node-red:root /data2 && \
    chmod -R g+rwX /data2
USER node-red
COPY settings.js /data2/settings.js
COPY flow_cred.json /data2/flows_cred.json
COPY flow.json /data2/flows.json
ENTRYPOINT ["npm", "start", "--cache", "/data2/.npm", "--", "--userDir", "/data2"]

Once the changes have been made, the GitHub repository should look something like this.

Updated git repository

With the git repository set up the application can be deployed as follows.

  1. Create a new project via oc new-project e.g.
    oc new-project nodered-system \
    --display-name="NodeRED system" \
    --description="NodeRED runtime"
  2. Set up more SSH keys. To access the GitHub repository we need to define another set of SSH keys. We define the private key to OpenShift and create a Deploy key in GitHub using the public key. We could have copied the SSH keys from the file system of the Flow Editor, but as those Deploy keys give write access it is better to define another set with just read access to the repository.
    Create the SSH key using
    ssh-keygen -C "nodered-processing/repo@github" -f nodered-github -N ''
    To register the repository SSH key with your private repository on GitHub, go to the Settings for the repository. On GitHub a repository SSH key is referred to as a Deploy key. Scroll down the settings page, find the Deploy keys section and select it. Click on the Add deploy key button, give the key a name and paste in the contents of the public key file from the SSH key pair. This is the file with the .pub extension.
    The next step is to create a secret in OpenShift to hold the private key of the SSH key pair. When using the command line, to create the secret run
    $ oc create secret generic nodered-github \
    --from-file=ssh-privatekey=nodered-github
  3. Create the application:
    oc new-app <git-repo-ssh-url> \
    --source-secret nodered-github \
    --env=NODE_RED_CREDENTIAL_SECRET="<some secret you remembered to make a note of>" \
    --name nodered-processing
    Check the progress of the build
    oc logs -f bc/nodered-processing
    If the build completes successfully, verify there is one pod running
    oc get pods
    You should see that a build pod has completed, a deploy pod has completed, and that there is now an application pod running:
    nodered-processing-1-build 0/1 Completed 0 72s
    nodered-processing-1-cbp4j 1/1 Running 0 9s
    nodered-processing-1-deploy 0/1 Completed 0 12s
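That “one pod running” check can also be scripted against a captured listing; a minimal sketch using the example output above (pod names are from that example):

```shell
# Captured `oc get pods` output from the example above
PODS='nodered-processing-1-build 0/1 Completed 0 72s
nodered-processing-1-cbp4j 1/1 Running 0 9s
nodered-processing-1-deploy 0/1 Completed 0 12s'
# Count pods in the Running state
echo "$PODS" | grep -c Running    # prints 1
```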


After all this, where have I got to… Well, I have an approach which allows me to bring a DevSecOps-style approach to NodeRED. There is room for improvement, but it's working. Going forward I want to look at creating a “side-car” to handle the SSH management, as I think that will be cleaner. If I get this working I will create another post.


