For months I've been working on a system that can transmit real-time video to the ground from a high-altitude balloon. Last weekend, on Saturday, September 9, we did a test launch of this system. I'm going to go over the functionality and the test launch, as well as some tweaks I still need to make.
The new system is essentially just an extension of what I outlined here. To recap, I'm using an RF4463 to put out 1W of FSK on the 70cm band. Data is LDPC-encoded to allow for some amount of bit flips, and the theoretical range is above 250km.
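As a rough sanity check on that range figure, here's a free-space link budget sketch in Python. The antenna gains, receiver bandwidth, noise figure, and required SNR are my assumptions for illustration, not measured values from the actual system:

```python
import math

def link_margin_db(tx_dbm, tx_gain_dbi, rx_gain_dbi, freq_mhz, dist_km,
                   bandwidth_hz, noise_figure_db, required_snr_db):
    """Free-space link margin; ignores pointing loss, fading, cable loss."""
    # Free-space path loss (distance in km, frequency in MHz)
    fspl_db = 20 * math.log10(dist_km) + 20 * math.log10(freq_mhz) + 32.45
    rx_power_dbm = tx_dbm + tx_gain_dbi - fspl_db + rx_gain_dbi
    # Thermal noise floor plus receiver noise figure
    noise_dbm = -174 + 10 * math.log10(bandwidth_hz) + noise_figure_db
    snr_db = rx_power_dbm - noise_dbm
    return snr_db - required_snr_db

# Assumed values: 1W (30dBm) TX into a payload whip, a small yagi on the
# ground, ~200kHz of receiver bandwidth, and an SNR that the LDPC code
# should decode comfortably.
margin = link_margin_db(tx_dbm=30, tx_gain_dbi=0, rx_gain_dbi=7,
                        freq_mhz=434, dist_km=250,
                        bandwidth_hz=200_000, noise_figure_db=6,
                        required_snr_db=10)
print(f"Margin at 250 km: {margin:.1f} dB")
```

Even with these conservative guesses the margin stays positive at 250km, which lines up with the theoretical range claim.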
On the payload, I'm encoding video from one of the cameras and sending it over the same link, with the same LDPC encoding as before. On the ground station, we filter out the video packets and reassemble the stream from them. On both ends I'm using GStreamer. I was a bit worried at first that it would be overkill, but it was super easy to get going with, and the system was rock solid. A huge thanks to everyone in the
#gstreamer IRC channel for helping me out; I asked a lot of silly questions and they were very patient with me!
The video is encoded as H.264 - the most I can get a Pi 4 to do. Currently it pegs the CPU at around 50%, but the hope is to eventually move to GPU encoding to save some power - and heat! The video is encapsulated in an MPEG-TS stream to allow graceful recovery from packet loss. The video is 640x480 at 12 FPS, with a maximum bitrate of 200kbit/s, though GStreamer seems to exceed this at times.
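For a sense of what the MPEG-TS framing costs: transport streams carry data in fixed 188-byte packets with a 4-byte header, so the container overhead at this bitrate is small. A quick back-of-the-envelope (packet sizes per the MPEG-TS standard; this ignores PES and table overhead, which add a little more):

```python
TS_PACKET = 188          # MPEG-TS packet size in bytes
TS_HEADER = 4            # fixed header per packet

video_bps = 200_000      # target video bitrate from above
payload_per_packet = TS_PACKET - TS_HEADER

# Bytes of video per second, and the TS packets needed to carry them
video_bytes_per_s = video_bps / 8
packets_per_s = video_bytes_per_s / payload_per_packet
overhead = TS_HEADER / TS_PACKET

print(f"~{packets_per_s:.0f} TS packets/s, {overhead:.1%} header overhead")
```

Roughly 2% of the link goes to TS headers - a fair trade for being able to rejoin the stream cleanly after a dropout.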
On the ground, the decoded video is displayed on the screen in real time, but it's also saved to the disk. The system is still able to do images as before, so any free bandwidth is used for that.
On September 9, we (the rest of the team and me) did a launch at 9am from Florenceville, New Brunswick, Canada. It's about an hour and a half from me, and we needed prep time, so I got up at 5am. The weather was perfect - it was a good temperature, sunny, and there was absolutely no wind. Other members had their own stuff to test, of course, but I was most interested to see how the communications system would work, since that's what I contributed. Until then, we hadn't done a proper test of the system on an actual payload.

The payload train, ready for liftoff
Here's what my portion of the ground station looked like just before launch time. From left to right: I have an Arrow antenna mounted on a tripod. The 2m portion is used for APRS reception, and the 70cm portion is used for the high-speed downlink. Attached to the tripod is an RSP1a SDR receiver. As usual, I'll say never to buy one - it requires proprietary drivers. But unfortunately, it's the only SDR I have with the required sample rate. In the image, it's not connected to the antenna yet; I waited until after liftoff so that the transmitter wouldn't overload the receiver.
On the table is a lead-acid battery, a TM-231A 2-meter radio, and an audio interface. These are for decoding APRS packets. I set my laptop up as an I-gate to the APRS-IS network to improve coverage a bit before the balloon got high enough to be heard by other I-gates in the area.
On my laptop is the ground station software. It's hard to see, but it's displaying the last received image, the live video feed, telemetry, and a few graphs for evaluating our received signal.
On the right, next to the laptop, is a small diagnostic board I built. It can receive and decode packets, though it doesn't do LDPC. Even at 50km it could still receive most packets on its tiny wire antenna!
Below is the video as received by the ground station. Once again, I have to stress that this was sent to us in real-time over the 70cm band, over a distance of over 50km, at only 1 watt!
The most important result to me is that the video stream works, and it works extremely well. The Eb/N0 value remained above 20dB even at 50km - the system should certainly work at a much longer distance.
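To put a number on "much longer distance": under free-space propagation, Eb/N0 falls off with 20·log10 of the distance ratio, so the 20dB measured at 50km extrapolates like this (a rough sketch that ignores antenna pointing and elevation-angle effects):

```python
import math

def ebn0_at(dist_km, ref_dist_km=50, ref_ebn0_db=20):
    """Scale a measured Eb/N0 to another distance, free-space only."""
    return ref_ebn0_db - 20 * math.log10(dist_km / ref_dist_km)

for d in (50, 100, 250):
    print(f"{d:>3} km: ~{ebn0_at(d):.1f} dB")
```

That still leaves about 6dB of Eb/N0 at 250km, consistent with the theoretical range.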
I was curious to see how the Si4463 would hold up to the temperature extremes. With the thinner air, I was a bit concerned it might overheat. On the ground, it ran at about 50C. Once launched, it steadily got colder until it was reading about -5C, and then it climbed back to roughly room temperature as the air thinned. Frequency drift was essentially nil even at low temperatures, which I was pleased about.
There are a few minor issues I need to resolve for the next launch.
The top surface of the clouds is quite similar to white noise and makes the H.264 encoding less effective. This causes the video bitrate to rise until it's slightly above what the downlink can handle, so we start throwing away packets, leading to occasional video corruption. What's especially annoying is that during the more interesting parts of the flight - such as when the balloon pops - the video changes more, and we almost certainly get corruption! This is the biggest thing I need to fix.
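One approach I'm considering is to pace the video data through a token bucket on the payload and drop at a clean boundary whenever the bucket runs dry, rather than letting the radio queue overflow mid-packet. A minimal sketch of the idea - the rate, burst size, and names here are illustrative, not the actual implementation:

```python
import time

class TokenBucket:
    """Pace output to rate_bps; callers drop data when tokens run out."""
    def __init__(self, rate_bps, burst_bytes):
        self.rate = rate_bps / 8      # refill rate in bytes/second
        self.capacity = burst_bytes
        self.tokens = burst_bytes
        self.last = time.monotonic()

    def try_send(self, nbytes):
        now = time.monotonic()
        elapsed = now - self.last
        self.tokens = min(self.capacity, self.tokens + elapsed * self.rate)
        self.last = now
        if self.tokens >= nbytes:
            self.tokens -= nbytes
            return True
        return False  # caller should drop this unit, not queue it

# Bucket sized for the 200kbit/s downlink with a small burst allowance
bucket = TokenBucket(rate_bps=200_000, burst_bytes=8_192)
```

Dropping a whole unit when `try_send` returns False keeps the corruption confined to a predictable boundary instead of scattering it through the stream.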
I discovered a small bug where packets were getting corrupted on initial transmission. I was seeing this during local testing, but I thought it was due to receiver overloading. I've already fixed it, and the fix cut my packet error rate by more than an order of magnitude: it used to be around 1 in 2-3k packets, and now it's less than 1 in 30k.
I'd like to have some visibility into how full the payload's video buffer is. For the next launch, I'm going to add some telemetry to report that information to the ground.
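The extra field could be as simple as one byte of buffer occupancy packed into the telemetry frame. A hypothetical encoding - the field layout and names here are made up for illustration, not the actual telemetry format:

```python
import struct

def pack_buffer_fill(buffered_bytes, buffer_capacity):
    """Encode buffer fullness as a 0-255 fraction in a single byte."""
    fill = min(buffered_bytes / buffer_capacity, 1.0)
    return struct.pack("B", round(fill * 255))

def unpack_buffer_fill(payload):
    (raw,) = struct.unpack("B", payload)
    return raw / 255  # back to a 0.0-1.0 fraction

# e.g. 96kB queued out of a 128kB buffer -> 75% full
pkt = pack_buffer_fill(buffered_bytes=96_000, buffer_capacity=128_000)
```

One byte gives better than 0.5% resolution, which is plenty for spotting a buffer that's trending toward full.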
We want to stream the next launch to YouTube. To facilitate this, I'm going to add an RTSP server to the ground station. I figure I can use existing software to consume this stream and pipe it to the Internet.
Once again, this was pretty much as successful as it could be. The core system worked perfectly and I identified some improvements to make. Other people are still evaluating how their systems did, and I hope they went as well as mine!
As with all my projects, the code is freely available.
Ground station dependency (for video)
Ground station software (based on Wenet)
And by the way, we did recover the payload: