Running Owncast with Hardware Acceleration on a Raspberry Pi 4
05 Mar 2023
Owncast is a self-hosted platform for streaming video, which I've been using to share all kinds of events, from Zwift racing to my wedding. It's been really pleasant to use as a Twitch alternative.
Historically, I've run my instance (https://stream.steele.blue) on a virtual server in The Cloud (specifically, a $6/month DigitalOcean instance), but wanted to explore moving it on-prem, using my own hardware. I had a few goals in mind:
- Performance - Owncast uses ffmpeg under the hood to encode videos. If you have access to 'native' hardware, you can enable hardware acceleration to improve performance by offloading encoding to the GPU. This isn't just a theoretical gain; on the VPS I can only encode a single medium-quality stream before I peg the CPU.
- Utility - I had a Raspberry Pi lying in storage after finishing up an earlier project, which feels like an incredible waste of resources. Imagine an entire Linux computer, just sitting in a drawer!
- Cost - Six bucks a month for a VPS isn't exorbitant, but it adds up. Within a year, I could buy a brand new Raspberry Pi (even at their inflated current prices)!
The Owncast docs include a nice reference on configuring a Raspberry Pi with hardware encoding.
I used a Raspberry Pi 4 Model B with 2GB of RAM and a 32GB MicroSD card I had left over from previous projects.
It's connected to my home network via Wi-Fi, which is also where the source RTMP stream originates.
tl;dr
Here's what currently works, as of March 2023:
Software
You'll need to run an older version of Raspberry Pi OS (formerly Raspbian). It's a 32-bit OS, even though the Pi 4 has a 64-bit CPU. More on that later.
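If you're not sure what you're currently running, a couple of quick commands will tell you (on the 32-bit image, uname reports an armv7l userland even though the Pi 4's CPU is 64-bit capable):

    uname -m            # armv7l on the 32-bit OS, aarch64 on the 64-bit one
    cat /etc/os-release # shows which Debian release the image is based on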
You'll also need a copy of ffmpeg that supports the OpenMAX (omx) encoder. The version available from the package manager is sufficient: sudo apt install ffmpeg.
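To double-check that your ffmpeg build actually has the OpenMAX encoder, you can list its encoders and look for h264_omx:

    # Confirm the OpenMAX H.264 encoder is available in this ffmpeg build
    ffmpeg -hide_banner -encoders | grep omx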
The remainder of the setup from the Owncast quickstart is sufficient. For example, I enabled HTTPS by running an instance of Caddy on the Pi.
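For reference, the Caddy side of that is tiny. Here's a minimal sketch, assuming Owncast is listening on its default web port of 8080 and using a placeholder hostname:

    # Caddyfile: terminate HTTPS and proxy to Owncast's default web port
    stream.example.com {
        reverse_proxy localhost:8080
    }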
Configuration
Set up Owncast like the quickstart tells you. In the Admin console, under "Advanced Settings", change the Video Codec to "OpenMax (omx) for Raspberry Pi".
My video configuration has two stream outputs defined:
- 1200kbps, Low hardware usage, 24fps (Low quality)
- 4100kbps, Medium hardware usage, 60fps (High quality)
Feel free to experiment with other configurations, but this worked for me.
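Under the hood, each of those outputs maps to an ffmpeg encode using the omx encoder. Purely as an illustration (this is a rough sketch of the kind of invocation involved, not the exact command Owncast generates), the low-quality output corresponds to something like:

    # Hardware H.264 encode at 1200kbps / 24fps, audio to AAC, HLS output.
    # (Illustrative only; Owncast builds its own, more elaborate command.)
    ffmpeg -i rtmp://127.0.0.1:1935/live \
      -c:v h264_omx -b:v 1200k -r 24 \
      -c:a aac -b:a 128k \
      -f hls low/stream.m3u8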
I still serve the videos from S3 storage hosted in AWS, but any S3-compatible option should suffice. S3 is cheap and easily scalable, so there wasn't a need to pull that on-prem.
What went wrong
As described above, I had to use an older, 32-bit version of Raspberry Pi OS, as well as an older version of Owncast, and I won't be able to upgrade either of them as new releases come out. That sucks! Here are the issues I ran into when trying to use current versions. Again, this is current as of March 2023; I'm hoping to try again in a few months.
Newer Raspberry Pi OSs don't have turnkey hardware encoding
Support for OMX on newer Raspberry Pi OS versions has gotten worse. Previously it wasn't supported on the 64-bit OS, but now it's gone from Bullseye-based releases even on 32-bit: https://github.com/raspberrypi/firmware/issues/1366#issuecomment-1034726587
So I took a look at enabling the new option for hardware encoding (Video4Linux). After hours of investigation, I wasn't able to find a version of ffmpeg that would let me successfully encode/decode videos with V4L hardware acceleration. The full gory details are in Issue #1379.
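If you want to try it yourself, ffmpeg's V4L2 encoder is h264_v4l2m2m, and a quick sanity check against a synthetic input looks something like the sketch below. Whether it succeeds depends on your kernel, firmware, and ffmpeg build; on my Pi it didn't.

    # Generate 5 seconds of test video and try to encode it in hardware
    ffmpeg -f lavfi -i testsrc=duration=5:size=1280x720:rate=30 \
      -c:v h264_v4l2m2m -b:v 2M v4l2-test.mp4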
I'm still pretty happy with this setup
Even with the issues I ran into getting this all running, I like where it ended up. I got a better-performing instance running on cheaper hardware, at the cost of a few hours of experimentation. If I ever need to reconfigure things, it should be a 20-minute setup.
Ideally I'd have this running in a Docker container, but I didn't want to run out of spoons and was ready to declare victory and just use the software.
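If I do get around to containerizing it, something like the following compose file is roughly where I'd start. The image name, ports, and data path are based on Owncast's published defaults rather than anything I've tested on the Pi, and passing the hardware encoder through to the container is its own problem:

    version: "3"
    services:
      owncast:
        image: owncast/owncast:latest
        ports:
          - "8080:8080"   # web interface
          - "1935:1935"   # RTMP ingest
        volumes:
          - ./data:/app/data   # persist stream config and content
        restart: unless-stopped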
Over the last few months I've been reevaluating my use of cloud services, with a particular focus on virtual servers. They occupy a "muddy middle" in terms of management responsibilities. Sure, Linode is going to spin up the instance, but I'm still responsible for installing my app, applying OS patches, isolating it within my tenant, etc. All the while, I'm paying as much to rent the donkey as it would cost to buy it.
I wouldn't want to run a big distributed project on hardware I have at home, but then again, none of my home projects need a distributed setup.
But with the advent of simple Docker management tools like Portainer, I've been able to move a number of apps previously hosted on DigitalOcean or Linode out of the cloud, and onto machines running in my living room. They've got enough resources that I can scale vertically for quite a while, or pick up another Raspberry Pi or two if I want more "single app appliances". And with tools like Watchtower to help keep my containers up to date, and easier at-home network segmentation with Unifi VLANs, it's never been easier to get a simple, secure home server up and running.