Getting a good lightweight video stream off of the Beaglebone Black has been a struggle, but I think I’ve found a way to do it at practically no CPU cost. After all, what fun is a badass robotic platform if you can’t get realtime video off of it?
More seriously, debugging sensor inputs and then getting a view augmented with a data overlay to an operator would probably speed things along. When I was building the first mobile platform I spent a lot of time watching it pan its Sharp IR sensor around, wondering what it saw. I had some output going to a ‘radar’-style display on a 1.8″ TFT, but it was hard to read such a small screen while standing behind the robot as it drove along.
The first thing I thought of was ffmpeg. What a gong show. I could get ffmpeg to write video from a USB webcam to a file, but I couldn’t get ffserver to actually stream anything to a browser or VLC. Yuck. I also tried jsmpeg, which needed an updated node.js as well. A fun adventure, no big errors, but the in-browser JS client displayed no video. Fail.
My quest was to find a better way to do this; the goal being a container on a web page either showing a streamed video feed from the BBB or pseudo-streaming by showing a series of still jpegs.
Finally, I came across MJPEG-Streamer. Works like a charm.
Here is the method, generally a mix of http://blog.miguelgrinberg.com/post/how-to-build-and-run-mjpg-streamer-on-the-raspberry-pi and http://www.linuxcircle.com/2013/02/06/faster-video-streaming-on-raspberry-media-server-with-mjpg-streamer/, so all credit to them for writing up the Pi-specific methods.
I figured that if it worked on the Pi without relying on h.264 hardware acceleration (which the Beaglebone doesn’t have), I’d be fine.
In the apt-centric commands I substituted opkg (I’m on Angstrom), and because I had just done a whole shwack of install/compile/curse/compile-again with ffmpeg, I think I already had the majority of the libraries installed or updated.
So, basically: prepare the environment by either running svn, or just grabbing the source code with wget. My source was mjpeg-streamer-code-181, which might be a fork of the original.
Oh, BTW, it won’t compile yet; don’t try until you trick it. This is a total hack, and will one day be known as a Bad Thing, but for now it works:
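For reference, the failure people usually hit here is the build dying on a missing linux/videodev.h header, which newer kernels no longer ship; the commonly cited workaround (an assumption on my part that this is the same wall I hit) is to point the old header name at the V4L2 header that replaced it:

```shell
# Assumption: the compile fails on a missing linux/videodev.h (dropped from
# newer kernels). Point the old header name at the V4L2 replacement:
ln -s /usr/include/linux/videodev2.h /usr/include/linux/videodev.h
```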
You might need to update the libs used in the compile; interestingly, I couldn’t get libjpeg8-dev as per the recommended steps, but it still works (that could be thanks to all the prior updates done for ffmpeg). I did this, but it was already current:
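Something along these lines, that is; the package name here is a guess at the Angstrom opkg equivalent of the apt step in the Pi write-ups:

```shell
# Guessed opkg equivalent of the Pi guides' apt-get libjpeg step:
opkg update
opkg install libjpeg-dev
```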
Now do the make and install steps:
make mjpg_streamer input_file.so output_http.so
cp mjpg_streamer /usr/local/bin
cp output_http.so input_file.so /usr/local/lib/
cp -R www /your/www
make DESTDIR=/usr install
That last step glosses over the fact that in my prior adventures I’d installed and run lighttpd, and set the conf file to bind to port 81. That way I still get the default BBB web interface on port 80. My www is something like /mnt/usb2/www/pages, so that’s where the client files go; a web page that has the img tags for the stream is all that’s needed:
<img src="http://rover:8085/?action=stream" width="320">
Then fire up the process that outputs a stream from the /dev/video0 feed. This is like a combined ffserver / ffmpeg setup, where you’d have ffmpeg sending data to ffserver, and then clients connecting to ffserver. The only difference, besides taking practically zero-CPU, is that it’s just one command:
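A sketch of that one command, with some assumptions: that the UVC input plugin (input_uvc.so) was built alongside the others, and that the port and www path match the img tag and lighttpd setup above; adjust all of these to your own layout:

```shell
# Sketch of the launch command (plugin options and paths are assumptions):
# input_uvc.so: -d device, -r resolution, -f frames per second
# output_http.so: -p port, -w directory of client pages
mjpg_streamer \
  -i "input_uvc.so -d /dev/video0 -r 320x240 -f 10" \
  -o "output_http.so -p 8085 -w /mnt/usb2/www/pages"
```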
On startup it complains a dozen times or so about ‘Inappropriate ioctl for device’, with references to hardware that isn’t in this particular webcam (pan, tilt, focus, LED, etc.). No big deal.
I added the recommended startup bits as well (all the webcam.sh stuff), so it should always be running.
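For reference, a sketch of what that webcam.sh wrapper looks like; the name, paths, and options here are assumptions, so mirror your own launch command and hook the script into whatever runs at boot:

```shell
#!/bin/sh
# webcam.sh -- sketch of the startup wrapper from the linked write-ups
# (paths and options are assumptions; mirror your own launch command).
/usr/local/bin/mjpg_streamer \
  -i "input_uvc.so -d /dev/video0 -r 320x240 -f 10" \
  -o "output_http.so -p 8085 -w /mnt/usb2/www/pages"
```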
An interesting thing about lag/latency is that it scales inversely with framerate: the faster the framerate (I use 5 or 10), the less lag there is. When I first ran it with -f 1 the lag was at least 3-4 seconds; setting it to 10 cut the lag to a fraction of a second. Nice.
Note that nowhere did I use the raspistill command, since this isn’t an rPi; that’s why I mentioned those two different articles at the top of this post. It’s sort of mix-and-match-and-pray, and it works!
This isn’t what I was intending to do, but for now it’s fine. The missing bit is doing any kind of text overlay. I had installed freetype in Go, and was about to start down the path of opencv for Go, so I could grab an image, timestamp it and add any sensor graphic overlays, and then spit it back out to the stream. I might still do that, but for now there isn’t a good reason to. I’m getting 2.3%-2.6% CPU used by the mjpg_streamer process when serving video, so I’m happy.
Oh, one final thing. As I was watching the output of top in the console I couldn’t believe how much crap-ola was running to support the desktop environment. Since I’m not plugging this thing into an HDMI display, but logging in via ssh, let’s fix that:
systemctl stop gdm.service
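Stopping the service only lasts until the next reboot; to keep the desktop from coming back at all:

```shell
# Prevent gdm from starting on future boots:
systemctl disable gdm.service
```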