This article covers the software I’ve written for my daylight clock, and some of the adventures along the way.
(Un)finished product first:
[Interactive simulator embed]
Code here. Thanks to Nicole for the suggestion of making the simulator web-compatible!
I’ll list the full Bill of Materials when talking about the electronics. For software purposes, we’re working with: a Raspberry Pi, a 32x16 RGB LED matrix, and an RGBW NeoPixel strip.
I started by reading the Adafruit tutorials for the LED Matrix and NeoPixel strip separately, to make sure the hardware works. These tutorials were good for that purpose, but I wasn’t excited about using CircuitPython for the whole project. I had a couple hiccups while trying to install the software,1 and I wasn’t sure I’d be able to maintain a Python-based environment.2
Luckily, the Python libraries Adafruit uses are in turn backed by C/C++ libraries: rpi-led-matrix and rpi-ws281x. Both of these libraries also have Rust bindings: for the matrix and NeoPixel.
The lower-level libraries have lots more options, e.g. to use different GPIO and pixel layouts. I had to experiment to figure out which settings to use.
I reconfigured the hardware and NeoPixel software to use GPIO10 (SPI MOSI) for the NeoPixel data. This leaves GPIO18 open if I want to use it for the “quality” setting on the matrix later. I made sure SPI was enabled via raspi-config so the library could use the relevant kernel driver.
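If you’re scripting the setup, raspi-config can also be driven non-interactively; something like this should enable SPI (a sketch, assuming the stock raspi-config on Raspberry Pi OS):

# Enable the SPI kernel interface non-interactively (0 = enable).
sudo raspi-config nonint do_spi 0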
The NeoPixels have 4 u8 channels: R, G, B, W… but they aren’t necessarily delivered to the hardware in that order. I illuminated one channel at a time:3
use std::{thread, time::Duration};

type NeoPixelValues = [u8; 4];
let pixels: &mut [NeoPixelValues] = /* ... */;
// Light every pixel's first channel...
for px in pixels.iter_mut() {
    *px = [255, 0, 0, 0];
}
thread::sleep(Duration::from_secs(10));
// ...then every pixel's second channel.
for px in pixels.iter_mut() {
    *px = [0, 255, 0, 0];
}
/* ditto for third and fourth channels */
and looked at the pixel strip to see which colors actually lit up. Turns out I had a GRBW-ordered panel! Luckily, that’s a setup-time option, so the rest of the program can treat the values as RGBW tuples.
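For reference, that setup looks roughly like this with the rs_ws281x binding; the pixel count, brightness, and builder details are my assumptions from that crate’s documented API, not this project’s code:

use rs_ws281x::{ChannelBuilder, ControllerBuilder, StripType};

// Sketch: data on GPIO10 (SPI MOSI), GRBW channel order declared up front.
let mut controller = ControllerBuilder::new()
    .channel(
        0,
        ChannelBuilder::new()
            .pin(10) // GPIO10 = SPI MOSI
            .count(60) // placeholder pixel count
            .strip_type(StripType::Sk6812Grbw) // the GRBW ordering found above
            .brightness(128)
            .build(),
    )
    .build()
    .unwrap();

With the order declared at setup time, the rest of the program keeps treating values as RGBW tuples, as noted above.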
The matrix’s orientation took me longer to work out. My test program illuminated one pixel at a time:4
for x in 0..32 {
    for y in 0..16 {
        display.clear().unwrap();
        // Draw a single white pixel at (x, y).
        display
            .draw_iter(std::iter::once(Pixel(
                Point::new(x, y),
                Rgb888::new(255, 255, 255),
            )))
            .unwrap();
        thread::sleep(Duration::from_millis(10));
    }
}
I varied the library’s address settings in several ways to try to make the output look right. Mostly, I got it wrong! Eventually I determined my 32x16 panel acts like two chained 16x16 panels.
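In the rpi-led-matrix crate, that interpretation translates into setup options; a sketch (your panel’s multiplexing and mapping may need different values):

use rpi_led_matrix::{LedMatrix, LedMatrixOptions};

// Sketch: describe the 32x16 panel as two chained 16x16 panels.
let mut options = LedMatrixOptions::new();
options.set_rows(16);
options.set_cols(16);
options.set_chain_length(2);
let matrix = LedMatrix::new(Some(options), None).expect("matrix init failed");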
With the basic hardware settings configured, I wanted to quickly iterate on the content displayed: Does the daylight arc look reasonable? Is the font legible?
I worked out a couple tricks to quickly deploy to hardware, and built a simulator to run the software on my dev machine.
I was a little worried that incorporating the C/C++ libraries into my build would make it hard to cross-compile. But it was surprisingly easy to get set up!
Cargo can crossbuild Rust for RPi 64-bit using the --target flag… as long as you’re careful to specify the linker in .cargo/config.toml. The -sys crates for the LED libraries include build steps, which seem to pick up the CC and CXX environment variables with no hassle.
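For reference, the linker setting is just a couple of lines in .cargo/config.toml, assuming the aarch64-linux-gnu GCC toolchain is installed:

# .cargo/config.toml
[target.aarch64-unknown-linux-gnu]
linker = "aarch64-linux-gnu-gcc"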
I bundled all this into a redo rule that let me crossbuild any of the binaries in my crate with redo <name>.bin:
redo-ifchange Cargo.toml Cargo.lock $(find src/)
set -eu
if uname -m | grep -q x86
then
    # Cross-compiling from the dev machine: use the aarch64 GNU toolchain.
    export CXX=aarch64-linux-gnu-g++
    export CC=aarch64-linux-gnu-gcc
    TARGET="--target aarch64-unknown-linux-gnu"
else
    # Building on the Pi itself: no --target flag needed.
    TARGET=""
fi
# Build normally first, to show errors in the stderr stream
cargo build --release $TARGET --no-default-features --features hardware
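# Run again (cached, so nearly instant) with JSON output to locate the executable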
cargo build --release $TARGET --message-format=json --no-default-features --features hardware \
| jq -r "select(.target.name == \"$2\") | select(.executable) | .executable" \
>"$3"
OUTPUT="$(cat "$3")"
rm "$3"
cp "$OUTPUT" "$3"
I worked on this project at a couple of locations where I didn’t have WiFi, just a USB-serial connection to the Pi. At first, I didn’t know how to upload new binaries to the Raspberry Pi.
I wound up using the picocom serial client on my laptop, which can transfer files if the sz/rz tools (the lrzsz package) are installed on the other end (serial “server”). I made sure they were installed on the Pi before leaving the house, and all was almost well.
The remaining problem was transfer time. At time of writing, a debug build of the main program is 19MB – at a typical 115200 baud (about 11.5 kB/s), that’s nearly a half-hour transfer!
Using --release builds and transferring gzipped binaries resulted in much more manageable sizes (<1MB) and times (<1 minute).
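The whole loop ends up something like this; the binary name and serial device below are placeholders:

# on the laptop: compress the release build before sending
gzip -9 -c target/aarch64-unknown-linux-gnu/release/daylight > daylight.gz
picocom -b 115200 /dev/ttyUSB0 # then C-a C-s starts a zmodem send via sz
# on the Pi: rz receives the file; gunzip daylight.gz && chmod +x daylight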
This worked fine: most of my “offline time” went to the physical design, so I didn’t need to carry debug info onto the Pi.
The rpi_led_matrix Rust crate implements the drawing traits defined in embedded_graphics. Another implementation of those traits is the embedded_graphics_simulator crate, which outputs to a desktop window instead of a physical matrix. This is a great feature!
I was able to quickly drop this into place instead of the LED matrix output. However, I couldn’t render a separate window for the LED strip at the same time; this is a known issue with the simulator.
To work around this, I refactored the main logic to expect a different “output” trait:
pub trait Displays {
    /// Set the color of the edge pixels
    fn edge(&mut self) -> &mut [NeoPixelColor];
    /// Draw onto a buffer for the face (LED matrix)
    fn face(
        &mut self,
    ) -> impl embedded_graphics_core::draw_target::DrawTarget<Color = Rgb888, Error = Infallible>;
    /// Flush any pending pixels: update the edge and face.
    fn flush(&mut self) -> Result<(), String>;
}
I used the aforementioned Raspberry Pi-specific libraries to implement this “for real”.5 For simulation, I added a one-pixel margin around the display, then a border to show the NeoPixel colors.6 This let me experiment with layouts, font sizes, etc. without deploying to the hardware stack.
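As a sketch of how the trait gets used (the color values and the NeoPixelColor = [u8; 4] RGBW layout are my assumptions here):

use embedded_graphics_core::{draw_target::DrawTarget, pixelcolor::{Rgb888, RgbColor}};

fn render(out: &mut impl Displays) -> Result<(), String> {
    // Paint every edge (NeoPixel) LED the same dim RGBW color.
    for px in out.edge().iter_mut() {
        *px = [32, 24, 8, 64];
    }
    {
        // Borrow the face as an embedded-graphics DrawTarget and clear it.
        let mut face = out.face();
        face.clear(Rgb888::BLACK).unwrap();
    }
    // Push both buffers to the real or simulated hardware at once.
    out.flush()
}

The same render pass drives both backends; only the Displays implementation changes between hardware and simulator.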
I could also run the simulator faster than real-time, and capture the results as a video:
When I told Nicole about this simulation, she immediately suggested running the simulator in the browser – which you see above! The geometry of the web simulator is more representative of what the clock will look like.
The key feature of this clock is showing the sun-light hours, given the clock’s latitude and longitude.
Initially, I used a library called satkit to compute sunrise and sunset times, since it provided a convenient function for just that purpose.
But once I had the simulator running for a year’s duration, I noticed a problem. Using an example location of Washington, DC, the daylight hours lengthened until the winter solstice, then abruptly shrunk:
2024-11-08 04:59:22 -05:00 // 2024-11-08 18:43:41 -05:00
...
2024-12-19 04:48:20 -05:00 // 2024-12-19 19:22:28 -05:00
2024-12-20 04:48:49 -05:00 // 2024-12-20 19:23:00 -05:00
2024-12-21 07:23:31 -05:00 // 2024-12-21 16:49:18 -05:00
2024-12-22 07:23:59 -05:00 // 2024-12-22 16:49:50 -05:00
A discontinuity doesn’t happen unless we teleport the planet! Moreover, I’ve lived in DC in December – sunrise is not in the 4AM hour.
To their credit, the library author fixed the issue promptly.
But I also realized satkit brought along a lot of dependencies that my clock didn’t need, e.g. an HTTP library for fetching updated ephemerides.
I decided to rewrite the rise/set functions from scratch. I found NOAA’s site on solar calculations, including this “just-the-equations” worksheet. I struggled a bit with getting the units right, but I was able to line it up with equations in the spreadsheets here and get a working function.7
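The core fits in one small function. Here’s a sketch of those worksheet equations; the function name, signature, and sign conventions (longitude positive east, result in minutes after midnight UTC) are my choices, not necessarily what’s in the repository:

use std::f64::consts::PI;

fn sunrise_utc_minutes(day_of_year: f64, lat_deg: f64, lon_deg: f64) -> f64 {
    let rad = PI / 180.0;
    // Fractional year, in radians (the worksheet refines this with the hour of day).
    let gamma = 2.0 * PI / 365.0 * (day_of_year - 1.0);
    // Equation of time, in minutes.
    let eqtime = 229.18
        * (0.000075 + 0.001868 * gamma.cos() - 0.032077 * gamma.sin()
            - 0.014615 * (2.0 * gamma).cos() - 0.040849 * (2.0 * gamma).sin());
    // Solar declination, in radians.
    let decl = 0.006918 - 0.399912 * gamma.cos() + 0.070257 * gamma.sin()
        - 0.006758 * (2.0 * gamma).cos() + 0.000907 * (2.0 * gamma).sin()
        - 0.002697 * (3.0 * gamma).cos() + 0.00148 * (3.0 * gamma).sin();
    // Hour angle at sunrise; the 90.833° zenith accounts for atmospheric
    // refraction and the apparent radius of the solar disc.
    let (lat, zenith) = (lat_deg * rad, 90.833 * rad);
    let ha = (zenith.cos() / (lat.cos() * decl.cos()) - lat.tan() * decl.tan()).acos();
    // Minutes after midnight UTC; use -ha for sunset. NaN means polar day/night.
    720.0 - 4.0 * (lon_deg + ha / rad) - eqtime
}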
As the Go proverb says, “A little copying is better than a little coupling.”
All the code is available in the repository if you want to try it out!
I haven’t fully tested out the integration of the SCD30 atmosphere sensor package, nor have I enabled the HAT’s on-board RTC. I need to experiment with what is legible, and useful, once more of the physical design is put together.
There are also lots of bonus features that a full-fledged server could offer. Nicole suggested a pomodoro timer that runs along the edge; Hannah suggested an interactive map, where the clock would “travel” to the touched location.
If you have ideas, suggestions, or other feedback, let me know!