This post was written when I was using Blue Iris version 4, but the tweaks may also work with version 5.

I love Blue Iris surveillance software, but it can be a major CPU hog if it isn't tweaked. You can dramatically reduce CPU use by offloading video decoding to a graphics card (GPU). For example, I am using a Quadro P2000, which I purchased on Amazon for $429 in October 2018. I am running Blue Iris on a custom-built computer with an Intel i7-8700K CPU and 48GB of 3000MHz DDR4 RAM, with Windows 10 64-bit installed on a 250GB Samsung 970 EVO NVMe SSD. Surveillance video is saved to two 3TB Western Digital Red drives in RAID 0.

This computer is reasonably fast, but when I first installed Blue Iris, CPU usage was very high (around 25%). I had twelve cameras, but I knew CPU use should not be that high. After some tweaking, I got Blue Iris CPU use down to around 2% when not viewing remotely in a web browser. Viewing Blue Iris on other devices where transcoding is required will increase CPU and GPU use: my CPU jumps to around 28% and my GPU to around 48% when I view my cameras remotely in a web browser. When just viewing Blue Iris in the local GUI, my GPU use is around 8%.

Let’s Tweak Some Settings

Go to Blue Iris Options (the sprocket icon in the upper left of the main window) and select the Cameras tab. Set the "Hardware accelerated decode" option to your GPU, then click OK.

[Screenshot: Blue Iris Options, Cameras tab]

Now we must change the settings of each camera individually. I wish there were an option to apply this change to all cameras at once.

In the Blue Iris GUI, right-click a camera and select "Camera properties…", go to the "Video" tab, select your GPU under "Hardware accelerated decode," check "Limit decoding unless required," and make sure "Enable overlays" is not checked. With overlays disabled, the date and time will not be shown on live view or in recordings. You can enable the on-screen timestamp on each camera instead, but that can be troublesome: a few of my cameras do not keep accurate time, even though they are set to update their time from an NTP server on the network. Make sure anything else I outlined in red matches on your system and select "OK."
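As an aside on clock drift: if you want to sanity-check how far off a camera's clock is, the offset an NTP/SNTP client computes from its four packet timestamps is a simple formula. This is the standard calculation from the NTP specification (RFC 5905), not anything Blue Iris-specific, and the timestamps in the example are made up:

```python
def ntp_offset(t0: float, t1: float, t2: float, t3: float) -> float:
    """Clock offset between client and server, per the SNTP algorithm.

    t0 = client transmit time (client clock)
    t1 = server receive time  (server clock)
    t2 = server transmit time (server clock)
    t3 = client receive time  (client clock)
    """
    return ((t1 - t0) + (t2 - t3)) / 2


def ntp_delay(t0: float, t1: float, t2: float, t3: float) -> float:
    """Round-trip network delay, excluding server processing time."""
    return (t3 - t0) - (t2 - t1)


# Made-up example: the camera's clock is 5 s behind the server and
# each network leg takes 0.1 s.
offset = ntp_offset(100.0, 105.1, 105.1, 100.2)  # ~= 5.0 s (camera behind)
delay = ntp_delay(100.0, 105.1, 105.1, 100.2)    # ~= 0.2 s round trip
```

A camera whose offset keeps growing between syncs is drifting faster than its NTP poll interval can correct, which matches what I see on a few of mine.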

[Screenshot: Camera properties, Video tab]

NOTE: "Limit decoding" made a big difference on my system: CPU usage went from 10% to 2.5%, and GPU usage from 20% to 3%.

Now go to the "Record" tab and make sure the "Pre-trigger video buffer" is off unless you really need it. Then select "Video file format and compression…" to open another window.

[Screenshot: Camera properties, Record tab]

Make sure "Direct-to-disc" is selected, so the camera's stream is written to disk without re-encoding. I don't know whether the container makes a difference with CPU usage, but I have always used "Blue Iris DVR" (BVR) as the file format. Select OK when you've made the desired changes.
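Since direct-to-disc stores the stream as-is, disk consumption is essentially just bitrate times recording time, which makes retention easy to estimate. A quick sketch (the 4 Mbps bitrate is an assumed example, and 6000 GB approximates my two 3TB drives in RAID 0):

```python
def gb_per_day(bitrate_mbps: float) -> float:
    """Continuous-recording disk use: megabits/second -> gigabytes/day."""
    return bitrate_mbps * 86_400 / 8 / 1000


def retention_days(total_gb: float, cameras: int, bitrate_mbps: float) -> float:
    """Days of continuous footage an array can hold for `cameras` streams."""
    return total_gb / (cameras * gb_per_day(bitrate_mbps))


print(gb_per_day(4))                # 43.2 GB/day per camera at 4 Mbps
print(retention_days(6000, 12, 4))  # ~11.6 days for 12 such cameras on 6 TB
```

Motion-triggered recording will stretch this considerably, since the cameras are not writing around the clock.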

[Screenshot: Video file format and compression window]

Please leave a comment below if you have some Blue Iris tweaks or have an issue with what I have recommended.

UPDATE 12/23/2020

I moved to Blue Iris v5, and GPU use was high even with the settings outlined in this post. I discovered that using sub streams for viewing all cameras in the GUI greatly reduced my GPU use.
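The reason sub streams help so much is that decode cost scales roughly with the number of pixels per second the decoder has to process, and a sub stream has far fewer of them. A rough comparison, where the 2560x1440 main stream and 640x360 sub stream at 15 fps are assumed example values rather than my actual camera settings:

```python
def megapixels_per_second(width: int, height: int, fps: float) -> float:
    """Decode workload proxy: pixels the decoder must process each second."""
    return width * height * fps / 1e6


main = megapixels_per_second(2560, 1440, 15)  # assumed 1440p main stream
sub = megapixels_per_second(640, 360, 15)     # assumed 640x360 sub stream
print(f"main stream is {main / sub:.0f}x the decode work of the sub stream")
```

With a dozen cameras tiled in the GUI, decoding twelve sub streams instead of twelve main streams is a large reduction in total decode work, which lines up with the GPU usage drop I saw.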