Friday, August 17, 2018

Shinobi on the Raspberry Pi 3 B+

The Raspberry Pi is a $35 single-board computer that runs on a 64-bit ARM processor. The Pi 3 B+ uses the Broadcom BCM2837B0 SoC, whose VideoCore IV GPU provides hardware-accelerated h264 encoding/decoding, making it a great choice for running a small Shinobi setup with network h264 cameras.
In fact, there's even a wireless $10 version called the Raspberry Pi Zero W!
Shinobi is powerful CCTV software that forms the "brain" for your cameras, allowing you to configure how, what, and when they record.
If you've ever futzed around with the CCTV software that comes preinstalled on NVRs, you'll quickly realize how awful it is. I fully replaced my CCTV NVR with a PoE switch attached to a Raspberry Pi.
Together with the Raspberry Pi 3, you have an affordable, reliable, and power-efficient way to manage your CCTV setup. This "guide" will walk you through my setup and provide some tips if you want to use the RPI for this purpose.

The setup

My setup includes 4 cameras: 2x 1080p@30, and 2x 720p@30. I have them monitoring for motion and recording. The RPI is able to handle this workload with surprisingly little power usage.
In order for this setup to work, there's a few conditions that must be met:
  1. No transcoding. It's simply too CPU intensive, even with the hardware-accelerated h264_omx encoder. For any recording/streaming you'll need to set the video codec to "copy" (or possibly the jpeg API).
  2. Depending on the camera quality settings, you'll need to bump the GPU memory share up to 256MB. Even to me this seemed high, but without it I was getting mmal decoding errors with more than two 1080p@30 NVR cameras.
  3. For any decoding you'll need to use the hardware-accelerated h264_mmal codec. Without specifying this codec there will be too much CPU usage. Using MMAL ensures that the heavy lifting of decoding the h264 stream is done on the GPU.
  4. A real 2.5 A power supply. Your RPI needs all of it.
  5. (Optional) Active cooling. My RPI case has a small fan hooked up to the 3.3 V supply, and I have small heatsinks attached to the SoC.
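The GPU memory split from item 2 is set in /boot/config.txt; this is the standard Raspberry Pi setting, and a reboot is required for it to take effect:

```
# /boot/config.txt -- reserve 256MB of RAM for the GPU
gpu_mem=256
```

If 256MB turns out to be more than your workload needs, you can dial it back down and watch for the mmal errors to return.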


The default ffmpeg binary installed from apt includes all of the necessary codecs to use the RPI's GPU for hardware acceleration. There's no need to do any compiling of ffmpeg; just get it from apt and you're done.
Despite what you may find online, there is no need to recompile FFmpeg on the Raspberry Pi to do hardware-accelerated h264 encoding/decoding! It's amazing how out-of-date a lot of these guides are.
# The ffmpeg version installed from apt
$ ffmpeg -version
ffmpeg version 3.2.10-1~deb9u1+rpt2 Copyright (c) 2000-2018 the FFmpeg developers

# The encoder to use (if any -- see comment about "copy")
$ ffmpeg -encoders | grep omx
 V..... h264_omx             OpenMAX IL H.264 video encoder (codec h264)

# The decoder to use
$ ffmpeg -decoders | grep mmal
 V..... h264_mmal            h264 (mmal) (codec h264)
 V..... mpeg2_mmal           mpeg2 (mmal) (codec mpeg2video)
 V..... mpeg4_mmal           mpeg4 (mmal) (codec mpeg4)
 V..... vc1_mmal             vc1 (mmal) (codec vc1)

Configuring Shinobi

I made a few small tweaks to Shinobi to expose the Raspberry Pi's native decoding methods. The changes are in the dev branch now but should be merged into master soon.
You can check out the repo here; depending on when you read this blog, you may be able to check out master directly.
There are several guides on the Shinobi site for getting the software installed.
To expose the hardware acceleration method, select Yes in the hardware-acceleration dropdown. Leave the HWAccel option as Auto and select H.264 (Raspberry Pi) as the decoder.
This will use the hardware-accelerated h264_mmal codec.
For streaming/output I highly recommend you set it to copy to save yourself the CPU cycles of transcoding. If you need to encode in h264, make sure to use the h264_omx codec so that it's hardware accelerated.
Another option is simply to set up a cron job that does the transcoding in the background at low CPU priority.
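As a sketch of that background-transcode approach (the paths and schedule here are made up, and h264_omx keeps the re-encode on the GPU):

```shell
# Hypothetical crontab entry: re-encode yesterday's raw clips at 3 AM,
# niced down so the live recordings aren't starved of CPU.
0 3 * * * nice -n 19 ffmpeg -i /mnt/storage/cam1/raw.mp4 -c:v h264_omx -b:v 2M -an /mnt/storage/cam1/compressed.mp4
```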
That's about it as far as configurations go.
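For reference, the kind of ffmpeg invocation that results from these settings looks roughly like the following. The camera URL and output path are placeholders, and the exact flags Shinobi generates will differ:

```shell
# Decode in hardware (h264_mmal) only for the low-rate JPEG pipe that
# feeds motion detection; the recording itself is a plain stream copy.
ffmpeg -c:v h264_mmal -i "rtsp://user:pass@192.168.1.10:554/stream" \
  -map 0:v -c:v copy -an /mnt/storage/cam1/record.mp4 \
  -map 0:v -c:v mjpeg -q:v 15 -vf fps=2 -an -f image2pipe pipe:1
```

The key points are the decoder specified before -i and "copy" on the recording output, so the only decode work is the few frames per second used for detection.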


Storage

You basically have three options: root storage, attached storage, and network-attached storage.
The Raspberry Pi's primary storage is a micro SD card. You can record to the root storage if you're careful about space and set appropriate video expirations.
You can also attach storage via the USB interface. Be mindful of the additional power draw if the drive is unpowered.
The last option is network-attached storage, such as a NAS. This method is the most flexible if you already have the hardware available.
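For the network-attached option, a sketch of an NFS mount in /etc/fstab (the server address and export path are placeholders for your own NAS):

```
# /etc/fstab -- mount a NAS export for camera recordings
192.168.1.20:/export/cctv  /mnt/storage  nfs  defaults,noatime  0  0
```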


mmal encoding errors

The most common problem I encountered was mmal encoding errors. If your cameras are restarting because of these errors, bump up the memory available to the GPU. You may also need to downscale the quality/bitrate of your cameras.
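To check how much memory is currently reserved for the GPU, vcgencmd reports the split; the output shown assumes a 256MB setting:

```shell
$ vcgencmd get_mem gpu
gpu=256M
```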

Unstable Pi / reboots

Make sure you're using a 2.5 A (or above) power supply. Most power supplies do not supply 2.5 A. Ones marketed for iPads or tablets usually will, but check the back of your adapter to see what its rating is.
Depending on your setup (and how hard you're pushing your Pi), you may need active cooling. You can also try adding a heatsink to the CPU/GPU SoC.

Slow / sluggish performance

If you're using the h264_mmal codec with a 1080p@30fps camera, decoding takes about 100% of a single CPU core. If you're seeing higher CPU usage (such as 200-250%), you're probably not using the codec or something else is misconfigured.
You can check by running ps aux | grep ffmpeg and taking note of the setting just before the -i rtsp://... line. It should say -c:v h264_mmal.
$ sudo ps aux | grep ffmpeg
ffmpeg ... -c:v h264_mmal -i rtsp://....


  1. Hi, I've tried this (running on a Pi 3 B+) but as soon as I turn on motion detection, my stream goes black. Can you turn on motion detection?

    1. Check the logs and let me know if you see anything strange.

  2. michealkd@gmail.com, December 7, 2018 at 4:17 PM

    got a pi 3 b+ to try this and the thing croaked. looked at the cli (ps) and it appears ffmpeg is using h264_omx. situation here is i have only 1 cam added and the browser just locks up, tried killing ffmpeg sessions but just not fast enough. Shinobi is testing my patience. I am not far off from buying a proper NVR. Should not be this difficult to use a software but guess not everyone feels the same. Just venting. Cheers.

  3. Hi what dist do you use
    Ubuntu core ?
    or raspbian ?

  4. Hi Stephen, thank you for your detailed explanation. May I ask what linux distro do you recommend?

  5. thank you! Helped me through putting together a setup myself.

  6. Thanks for making a post about this. My family's house just got robbed and I want to make a CCTV system, and this post helped me :D thanks

  7. Don't suppose you have got a RP 4, and tried this on there? Would love to move away from my old Pentium box to a RP server?

    1. This should work just fine on a RPI4 though I haven't tested it myself.

  8. Hello, I am using Raspberry Pi 4B, os is Hypriotos.
    Follow your setup with one 1080P camera
    CPU usage is under 20%, But Camera display screen (onvif) will delay about 2~3 seconds.
    I don't want to change camera to low resolution.
    How to fix this issue??
    Thank you.

    1. I'm not sure if there's anything you can do to adjust the delay.

  9. Hi , i read your post in the raspberry community , is it possible to install this software in a Raspberry Pi4?

    1. There's nothing different you need to do to run this on a raspberry pi 4.

  10. so a little out of scope but similar ... i have a pi 0w in my garage and an e5 2660v3 box on floor of apt. i just got gpu passthrough working to let an nvidia 1050ti handle encode decode/ offload but neither before or after can i get shinobi to take anything but mjpg from pi... ive tried the uv4l and picam driver and the built in server for uv4l and vlc stream... i can get the stream in all of the above through vlc... both on laptop and ubuntu on virtual guest but geting it into shinobi has yet to happen with h.264, any ideas? i know it works in general becasuse mjpg and an elp cam that does h.264 works so the issue apears to be how the pi lacks encapsulation of the stream and or how to tell the pvr how to deal with that

  11. Thanks for the guide. Enjoyed reading

  12. Hi Stephen,
    After spending about 40 hours, I'm still unable to get hardware accelerated decoding in conjunction with motion detection working on the raspberry pi. Here's what I believe I've established:
    The static ffmpeg builds do not contain the h264_mmal decoder (maybe it's changed since your build), which is required if using the H.264 (Raspberry Pi) video input decoder from the drop down menu in Shinobi.

    Therefore, I've compiled (many versions/times) ffmpeg, successfully using:
    configure --enable-mmal --enable-omx --enable-omx-rpi --enable-gpl --enable-version3 --extra-ldflags=-latomic --enable-libx264
    I can successfully decode the stream and view the camera in Shinobi (yay, halfway there)

    As soon as I enable motion detection, ffmpeg crashes because it doesn't recognize the '-tune zerolatency' command. Forums suggest that this is because the command is not supported by the stream type. I've tried stripping this command out of the Shinobi library files, but the rabbit hole just gets deeper as I then receive 'No device available for decoder: device type vaapi needed for codec h264_mmal'.

    I've even bought a new Hikvision camera to try and standardize my setup, but just not having any luck.

    You appear to be one of the few people who have managed to get this working, and I would appreciate any help you're able to give.

    With thanks,

    1. Hi S4NDR1. It's possible something has changed in the versions of ffmpeg. When I wrote the guide ffmpeg was at 3.2.10, now I see it's at 4.1.6. My guess is that it has something to do with your compiled version. Sorry I can't help further as I'm no longer using my Pi for Shinobi. Good luck.

    2. [RESOLVED]

      Thanks very much for replying Stephen. I rolled ffmpeg back to ffmpeg-3.2.15 and it started working straight away! :)

    3. Have a Rpi4 4GB running rasbian buster 32bit, 1.1gb O/S
      have had tried lots of 64bit O/S,No joy just yet pi4 needs a software update from the foundation apparently??..Early days with this Soc.
      I Purged my Shinobi FFmpeg install rebooted & Used this FFmpeg compile install @ PimpmyLifeup here; ..By Emmet
      I now have hardware encoding for stream & recording running @ ~3% Cpu ..very happy as using software on 64Bit o/s was ~80% Took about 40min to compile the ffmpeg. I will be backing this system up as it took all day searching to get this Shinobi to work, now have Omx -Omx-rpi H264-Mmal & OpenMax-IL-H.264 amoungst everything else FFmpeg has.. Hope this helps others with this issue, Thanks Mr Wood been lurking since Rpi 2

    4. @S4NDR1, I am having a very similar experience to you. Can you describe how you "rolled ffmpeg back to ffmpeg-3.2.15"? I have this running on a pi3 with motion detection but without hardware decoding. CPU utilization is approximately 80% so it is usable, but I was hoping that using hardware would reduce it further. I tried 4.1.6 from raspbian repos and compiled git and 3.2.15 builds of ffmpeg and they all seem to break when trying to do motion detection with hardware decoding. Is there one pre-compiled for raspbian buster that you are using? If so, could you share where you found it?

      @Paul I tried the guide you posted when compiling the various releases of ffmpeg. Based on your CPU utilization at 3%, I suspect that you are not using motion detection? Mine reduces to 1-2% without motion detection whether I select hardware or software decoding.

      I wonder if the issue is one of the compiled options instead of the version of ffmpeg. Could someone with a version that functions with hardware decoding and motion detection post their output of:
      ffmpeg -version

      Mine with the raspbian buster build outputs:
      ffmpeg version 4.1.6-1~deb10u1+rpt1 Copyright (c) 2000-2020 the FFmpeg developers
      built with gcc 8 (Raspbian 8.3.0-6+rpi1)
      configuration: --prefix=/usr --extra-version='1~deb10u1+rpt1' --toolchain=hardened --incdir=/usr/include/arm-linux-gnueabihf --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opengl --enable-sdl2 --enable-omx-rpi --enable-mmal --enable-neon --enable-rpi --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared --libdir=/usr/lib/arm-linux-gnueabihf --cpu=arm1176jzf-s --arch=arm
      libavutil 56. 22.100 / 56. 22.100
      libavcodec 58. 35.100 / 58. 35.100
      libavformat 58. 20.100 / 58. 20.100
      libavdevice 58. 5.100 / 58. 5.100
      libavfilter 7. 40.101 / 7. 40.101
      libavresample 4. 0. 0 / 4. 0. 0
      libswscale 5. 3.100 / 5. 3.100
      libswresample 3. 3.100 / 3. 3.100
      libpostproc 55. 3.100 / 55. 3.100

  13. Stephen, could use your expert opinion. I'm running Shinobi on a Pi 4B with Ubuntu 20.10. All is configured correctly.

    Can't seem to get a basic MJPEG stream to work though.

    Here's my URI: http://admin:00000000@ This is an RTSP aware stream from a DVR.

    I get a single still frame in Shinobi and the stream keeps crashing. Probe works fine though. Here's a short snapshot of the log messaging:

    Process Started3 minutes ago
    cmd : -progress pipe:5 -r 2 -analyzeduration 100 -probesize 100 -fflags +igndts -loglevel warning -re -reconnect 1 -f mjpeg -i "" -strict -2 -an -q:v 15 -vf "fps=2" -an -c:v mjpeg -f image2pipe pipe:1

    Should work I think but it won't.

    I was thinking about some of your previous posts about GPU RAM and perhaps FFMPEG options.. maybe something I could try.

    Thank you for your time!


    1. What do the logs say when there's a crash? The only thing that sticks out to me are the "+igndts" flags. Are those defaults?

    2. First, thank you for the response, I read further that you aren't using the Pi platform any longer so I appreciate it!

      I got it to work. The Shinobi platform is slick, but very sensitive. Trying to slice an older-style JPEG stream was giving it problems.

      Ultimately, I got it to work by creating a monitor type of MJPEG (vs JPEG), keeping the frame rate very low (like 2), and here's what I think really did it, taking down the probe inspection time and size to a small number like 50,000 each.

      This is on a Pi 4 Mod B btw and it works great and supports the overclocking hints as well.

      Thanks again for your response.

  14. Hi stephen, would it be advisable to use poe hat? It would simplify the installation greatly

    1. I don't see any reason why that would be an issue.