Finding the right hardware is one thing, but it's all for naught if there isn't software to actually bring that functionality to life.
Companion App
The obvious first choice for software when it comes to creating a Pip-Boy is the companion mobile app that was released alongside Fallout 4. The developers did a great job replicating the in-game UI, and the built-in demo mode was perfect for prop use. If you're not using a phone, though, things get more challenging, especially if you want to customise things a bit more and use your own input controls.
Pygame
The OG Pip-Boy app designed for the Raspberry Pi was pypboy by grieve, a replica of the Fallout 3 Pip-Boy UI built with the Python-based game engine pygame. It has since been forked, modified and customised for various purposes, most notably by Zapwizard, who created a more full-featured, Fallout 4-flavoured PIP-OS based on the original code.
The main catch with the previous methods, for me, is that pygame is a game engine: it runs on a constant tick to achieve a fixed framerate, which is overkill for what's needed to actually drive the UI. I also wanted to run this on a Pi Zero, so I didn't want to run X11 or Wayland; ideally, whatever I ran would write straight to the framebuffer so I wouldn't need a whole display server or window manager. I made some attempts at setting up pygame to use the framebuffer device, with varying degrees of success, and even when something was running it was by no means efficient.
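For context, the usual way to point pygame (via SDL) at the framebuffer without a display server is through environment variables set before the display is initialised. This is a minimal configuration sketch, assuming an SDL2 build with the KMS/DRM backend; the exact driver name varies between SDL versions and builds:

```python
import os

# Ask SDL to render straight to the display hardware, no X11/Wayland.
os.environ["SDL_VIDEODRIVER"] = "kmsdrm"  # SDL2; older SDL1 builds used "fbcon"
os.environ["SDL_FBDEV"] = "/dev/fb0"      # framebuffer device (honoured by SDL1)
os.environ["SDL_NOMOUSE"] = "1"           # prop has no mouse attached

# pygame.display.init() / set_mode() would then pick these up as normal.
```

Whether this works at all depends heavily on which SDL backends your distro's pygame was compiled with, which is a large part of why results vary so much.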
Bitmaps over serial
Whilst in the midst of playing with various other graphics libraries and the Linux framebuffer, I had switched to the USB-connected IPS display mentioned in a previous post. This display didn't function like an ordinary display; instead, updates were achieved by writing each pixel to it over a serial connection. That meant any previously created apps were pretty much useless, so to use this screen I'd have to write something fresh. The approach: for each page, write a basic layout image to the whole screen, then write text and other dynamic elements over the top. This actually made development quite simple, as a lot of the UI layout was done as vector graphics in Affinity Designer in separate layers. Each layer was exported as a bitmap at the screen's resolution, and then all the software had to do was write it a byte at a time over the serial connection.
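The core of that approach can be sketched in a few lines. This assumes a display expecting big-endian RGB565 pixels (a common format for small serial/SPI screens, though the real protocol may differ); `serial_port` and `layout_bitmap_pixels` are hypothetical names for illustration:

```python
import struct

def rgb888_to_rgb565(r, g, b):
    """Pack an 8-bit-per-channel pixel into 16-bit RGB565:
    top 5 bits of red, 6 of green, 5 of blue."""
    return ((r & 0xF8) << 8) | ((g & 0xFC) << 3) | (b >> 3)

def frame_to_bytes(pixels):
    """Flatten a row-major list of (r, g, b) tuples into the
    big-endian byte stream written over the serial link."""
    return b"".join(struct.pack(">H", rgb888_to_rgb565(*p)) for p in pixels)

# A full-screen update is then just streaming every byte in order, e.g.:
# serial_port.write(frame_to_bytes(layout_bitmap_pixels))
```

With the layouts pre-rendered as bitmaps, the software really is just a byte pump: convert once, stream the result.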
The data rate of the serial connection is quite low, though, so a full-screen rewrite when changing pages could take a couple of seconds, as you can see in the clip below. I tried implementing a few tricks to improve performance. One was to keep an in-memory bitmap of the current screen state; the next frame could also be rendered in memory, the two XOR'd to find the pixels that changed, and then only those pixels written over serial. But doing it that way meant I also had to write the X,Y coordinates of each changed pixel rather than one contiguous range, so depending on what changed you could end up writing a lot more data than a whole-screen rewrite.
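The diff trick and its trade-off can be illustrated like this. The per-pixel byte costs here are illustrative assumptions (2 bytes each for x, y and colour in the diff path versus 2 bytes of colour in the full-rewrite path), not the real display protocol:

```python
def diff_pixels(prev, curr, width):
    """Compare two row-major framebuffers (flat lists of pixel
    values) and return [(x, y, new_value), ...] for changes."""
    changed = []
    for i, (old, new) in enumerate(zip(prev, curr)):
        if old != new:
            changed.append((i % width, i // width, new))
    return changed

def cheaper_to_diff(changed, total_pixels,
                    diff_cost_per_pixel=6, full_cost_per_pixel=2):
    """Each diffed pixel carries coordinate overhead, so a full
    rewrite wins once enough of the screen has changed."""
    return len(changed) * diff_cost_per_pixel < total_pixels * full_cost_per_pixel
```

With those assumed costs, the diff only pays off when fewer than a third of the pixels changed, which is exactly why a busy page transition could end up slower than a plain full rewrite.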
Back to the drawing board
When I changed displays once more, to the Hyperpixel 4, I was able to go back to the drawing board with everything I'd learned along the way and try to come up with a better solution. I spent a while evaluating a whole array of different options. My first proof of concept was in Ebitengine, purely because Go was a language I was comfortable writing code in, but it wasn't really mature enough that I could easily hack it to do what I wanted. I then had a quick play with Flutter, which seemed a lot more mature; getting a basic app running was fairly easy, but replicating the PIP-OS UI was more complicated. Flutter has a catalog of standard UI components, much like any modern web UI framework, which is fantastic for certain use cases but felt like a bit of a beast for what I was trying to achieve. I'm sure I could have done it in Flutter, but it didn't feel like a natural fit.
Qt6
Developing a UI for an embedded system is far from a unique challenge, however; people do it every day for car dashboards, fridges, light switches, you name it. In 2024, what doesn't have a small screen and WiFi? So what does everyone else use? Well, as it turns out, predominantly Qt, which powers user interfaces in everything from coffee machines to excavators, according to their success stories.
Getting started with Qt was actually surprisingly easy. Qt Creator isn't the worst IDE I've ever used, and with QML it didn't feel like I was on a particularly steep learning curve: most of QML is really just JavaScript, and the documentation around it is genuinely good. It wasn't too long before I'd got the main parts of the PIP-OS layout itself done.
Qt also has some cool effects overlays, which meant I could develop the whole UI in a basic black-and-white layout and then apply the colours, scan lines and screen effects over the top. This is much the same way the games do it inside the Creation Engine (the Pip-Boy UI in the game actually uses Scaleform, rendering Adobe Flash files!).
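The compositing idea itself is simple enough to sketch outside of Qt. This toy function, with illustrative tint and dimming values of my own choosing (not the game's or Qt's), shows the principle: the UI is pure intensity, and colour plus scan lines are a post-process:

```python
def apply_pipboy_effects(gray_rows, tint=(0, 255, 80), scanline_dim=0.6):
    """Tint a monochrome frame (rows of 0-255 intensities) green
    and darken every other row to fake CRT scan lines."""
    out = []
    for y, row in enumerate(gray_rows):
        dim = scanline_dim if y % 2 else 1.0  # dim odd rows only
        out.append([tuple(int(v / 255 * c * dim) for c in tint) for v in row])
    return out
```

Because the effect layer is independent of the layout, you can iterate on the UI in flat black and white and the retro look comes along for free at the end.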
Getting it to the device
For me, this is where Qt started to become the bane of my life. Developing and running the UI on either my Windows desktop or my MacBooks was absolutely simple, but that's not the intended target of the app. Qt Creator appears to have really great tools for deploying your app straight onto a phone, an embedded device or anything else you can imagine, but getting that toolchain set up is a complete nightmare. My ideal goal was to have Qt cross-compile to ARM so I could just bundle the app and ship it straight to my Pi Zero. There are numerous guides on how to achieve this, all of which seem to take totally different approaches, as the methods have changed over time. Since starting out I've compiled, re-compiled, reconfigured and switched dev laptops more times than I can count, and I still don't have a solution I'm happy with. I could complain about this aspect of Qt all day, but I'm not going to.
I did eventually come up with something that works, though, and it is as lightweight and performant as you would hope: graphics written straight to the GPU with hardware acceleration, with no need for a desktop, a window manager or anything else. The source code for everything I've done here so far is published and available for use at https://gitlab.com/robco-industries/pip-os and, as always, contributions are welcome.