Measuring Temperature With The Raspberry Pi Pico

Measuring temperature with the Pico is one of the easiest “quick wins” for getting familiar with the platform. Thanks to the Pico’s onboard temperature sensor, and along with turning the onboard LED on and off, you can achieve something that feels like interaction with the real world with nothing more than a few lines of MicroPython.

A quick search for some example code brought up:

import machine
import utime

sensor_temp = machine.ADC(4)  # ADC channel 4 is wired to the internal temperature sensor
conversion_factor = 3.3 / (65535)  # convert the 16-bit reading to a voltage, assuming a 3.3V reference

while True:
    reading = sensor_temp.read_u16() * conversion_factor
    # from the RP2040 datasheet: the sensor reads 0.706V at 27°C,
    # dropping by 1.721mV per degree
    temperature = 27 - (reading - 0.706)/0.001721
    print(temperature)
    utime.sleep(2)

It works by measuring the voltage across an internal diode, which varies predictably with temperature. A better explanation, and the reason for all the maths, can be found on the same page as the code.

So far so good, until I looked at the output, which was a couple of degrees warmer than the Hive radiator thermostat that I was sitting right next to. It wasn’t beyond the realms of possibility that Hive was wrong, or was measuring a very localised temperature (although I had no reason to think so).

Some more searching raised doubts about the accuracy of the Pico, and the variability between different boards. I tried a couple of others and got similar results, all around the same range, and all warmer than Hive was telling me. I was more inclined to believe the Hive thermostat anyway, especially as it didn’t feel all that warm at the time.

A bit more searching and it appears that voltage variability can be a big factor in these calculations. Basically, the conversion_factor above assumes the Pico’s ADC reference voltage is exactly 3.3V, which it more-or-less is, but if it varies by even the smallest amount then the temperature can be out by a couple of degrees.

It seemed likely that I’d found my culprit, especially since I had the Pico plugged into a USB hub connected by USB-C to the PC, which was turning AC into DC voltage anyway. There were a number of places where a few millivolts could get lost.

A Temperature Measurement Comparison

I decided to experiment with some other measurement systems, including:

  • A very basic analog temperature sensor, the TMP36
  • A TMP235 analog temperature sensor
  • A Monk Makes plant monitor, which has its own analog-to-digital converter onboard (and does things other than just temperature)


I can’t tell if there’s much difference in principle between the first two TMP sensors, beyond one being mounted on a breakout board, but they were both about the same price.

And for comparison I have a wooden thermometer, which I hope should be accurate enough. Plus I still have the trusty Hive radiator thermostat.

To cut a long story short, I ended up with all three components connected to the Pico at once, outputting a temperature on a polled basis. They were very simple to connect up and find code examples for. The wooden thermometer would just have to sit on its own.

And the results? These are averaged over 10 readings to smooth out voltage fluctuations, after leaving everything running for a few minutes:

Hive radiator valve: 21.1°C
Wooden thermometer: just over 21°C
Pico onboard sensor: 21.9°C
TMP36: 23.0°C
TMP235: 23.4°C
Monk Makes plant monitor: 22.6°C
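The ten-reading averaging is simple enough to sketch in ordinary Python. On the Pico the raw values would come from machine.ADC(4).read_u16(); the samples below are invented for illustration:

```python
# Average several raw ADC readings before converting to a temperature,
# to smooth out voltage noise. Uses the onboard-sensor formula from above.

def raw_to_celsius(raw, vref=3.3):
    volts = raw * vref / 65535
    return 27 - (volts - 0.706) / 0.001721

def averaged_temperature(raw_readings, vref=3.3):
    temps = [raw_to_celsius(r, vref) for r in raw_readings]
    return sum(temps) / len(temps)

# ten invented readings hovering around a room-temperature voltage
samples = [14190, 14200, 14185, 14205, 14195, 14198, 14192, 14201, 14188, 14196]
print(round(averaged_temperature(samples), 1))
```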

Maybe I’m too willing to accept an answer that happens to be believable, but to my eyes the fact that the thermometer agrees with the professionally constructed Hive radiator valve points to the four Pico measurements all being out by some degree. Reading a few more posts on forums certainly seems to indicate that none of these temperature sensors are going to be really accurate with a variable input voltage, especially since the Pico and the two TMP models explicitly need the 3.3v figure in the temperature calculation.

Changing The Temperature Calculation Formula

For all of them except the Monk Makes plant monitor, the voltage figure is in the temperature calculation equation. If the real voltage is less than 3.3V then the output temperature will show as higher than actual. For the TMP36 the equation looks like this:

adc_value = adc2.read_u16()
volt = adc_value * (3.3 / 65535)  # assumes the reference voltage is exactly 3.3V
temperature = 100 * volt - 50     # TMP36: 10mV per °C, with a 500mV offset at 0°C

Changing the 3.3 figure to something a little lower has a substantial effect on temperature. adc_value is generally quite low, and therefore so are the voltages – somewhere between 0.5V (which would be 0°C) and 0.8V (30°C). In the equation above a voltage change of just 0.01V equates to a temperature variance of 1°C.
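That sensitivity is easy to check with a couple of lines of Python, using the TMP36-style formula above (the raw reading is invented):

```python
# How much does an error in the assumed reference voltage move the
# reported temperature? T = 100 * volts - 50 for the TMP36.

def tmp36_temperature(adc_value, vref):
    volts = adc_value * (vref / 65535)
    return 100 * volts - 50

adc_value = 14200  # an invented raw reading in the room-temperature range
t_at_33 = tmp36_temperature(adc_value, 3.3)
t_at_321 = tmp36_temperature(adc_value, 3.21)
print(round(t_at_33 - t_at_321, 2))  # a 0.09V error in Vref shifts the result by roughly 2°C
```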

For the purposes of “calibration” I’m going to assume that the only thing I need to change is the value of the input voltage. And hence if I amend the above code for the TMP36 sensor to:

adc_value = adc2.read_u16()
volt = adc_value * (3.21 / 65535)
temperature = 100 * volt - 50

then I start to get about the same temperature as Hive and the thermometer are telling me, to within a tenth of a degree anyway. Similarly, 3.20V seems to work for the TMP235.

For the Monk Makes sensor I just scale down the temperature it tells me, by 0.925.
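Pulling those fudge factors together, the per-sensor “calibration” amounts to this kind of thing – a sketch, where the 3.21, 3.20 and 0.925 values are just the ones that happened to match my thermometer, not anything from a datasheet:

```python
# Per-sensor "calibration": tweak the assumed reference voltage for the two
# analog sensors, and apply a flat scale factor to the plant monitor's output.

def tmp36_celsius(adc_value, vref=3.21):
    return 100 * (adc_value * vref / 65535) - 50

def tmp235_celsius(adc_value, vref=3.20):
    # same 10mV/°C, 500mV-offset formula, slightly different fudged Vref
    return 100 * (adc_value * vref / 65535) - 50

def plant_monitor_celsius(reported):
    # the plant monitor reports a temperature directly; just scale it down
    return reported * 0.925
```

With the uncalibrated readings from the table above, plant_monitor_celsius(22.6) lands at about 20.9°C, which is in the right neighbourhood of the Hive and the wooden thermometer.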

Next Steps

The results all seem to be plausible now, but it’s not very scientific. Plus, I’ve only tested it at a very narrow temperature range. If I want to put sensors in a greenhouse I’ll need to check that the linear scale assumptions I’ve made above still hold.

Secondly, I’m assuming that it’s the voltage that’s the culprit. I have a multimeter ordered so I can test what’s actually being passed to the sensors and that’ll hopefully tell me if my estimate of 3.2v is correct or if the inaccuracy is somewhere else in the system.

Thirdly, this has been powered by a USB hub, which is much noisier than a smooth DC power source (like a battery). I did a quick experiment with a USB battery power pack and got similar readings, but there’s no real way to know if that’s outputting exactly 5v or not.

The good news? The cheap and cheerful TMP36 seems to be just as reliable (or unreliable) as any of the other options.

Playing Simple Sounds With The Pico And Adafruit STEMMA Speaker

I’m starting to get more confident with putting components together with the Pico, so without too much research I bought a simple speaker/amplifier breakout board. Still, given that every time I try something simple on the Pico it turns out to involve hours of Googling, I assumed that getting a speaker to work would be more fiddly than it should be.

But, au contraire, it turned out to be remarkably easy. Just three cables and not even a dozen lines of code (stolen by Googling, of course…) produced a simple note:

A Raspberry Pico connected to an Adafruit STEMMA Speaker by three crocodile clips

With this code:

from machine import Pin, PWM
from utime import sleep

SPEAKER_PIN = 21

# create a Pulse Width Modulation object on this pin
speaker = PWM(Pin(SPEAKER_PIN))
speaker.duty_u16(1000)  # set the duty cycle (out of 65535; 32768 would be a full 50%)
speaker.freq(1000)      # 1kHz tone
sleep(1)                # wait a second
# turn the PWM circuit off with a zero duty cycle
speaker.duty_u16(0)

Despite having done nothing except copy and paste, I was proud of the beep that came out of the small speaker.
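To turn the single beep into something more tuneful, note frequencies can be computed rather than hard-coded. This is a sketch of the standard equal-temperament formula; on the Pico each frequency would be passed to speaker.freq with a short sleep between notes:

```python
# Equal-temperament pitch: each semitone multiplies the frequency by 2^(1/12),
# anchored at A4 = 440Hz (MIDI note 69).

def note_frequency(midi_note):
    return 440.0 * 2 ** ((midi_note - 69) / 12)

# C major scale starting at middle C (MIDI 60)
scale = [60, 62, 64, 65, 67, 69, 71, 72]
frequencies = [round(note_frequency(n)) for n in scale]
print(frequencies)
```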

The next challenge: to play an audio file, of any format. Back to Google I went, and after some dead ends (due to MicroPython changing over the previous five years) I settled on a library called PicoAudioPWM by Daniel Perron.

A bit of MP3 to WAV mangling later, and this code produced a very ropey sound:

from machine import Pin
import wave
from wavePlayer import wavePlayer

try:
    player = wavePlayer()
    player.play('clip.wav')
    player.stop()
except Exception as ex:
    Pin(2, Pin.OUT, Pin.PULL_DOWN)  # pull the output pin low if playback fails

Pin(2, Pin.OUT, Pin.PULL_DOWN)  # ...and again after playback, to stop the hiss

I changed the output pin from 21 in the simple beep experiment to 2 here, as that’s the default the library uses.

For a reason that I expect is specific to the speaker, I had to add a Pin PULL_DOWN after playback completed, otherwise the audio was followed by constant hissing from the speaker, even after the program had stopped running.

The sound quality was pretty awful, but hey-ho: at least I got noise out of the thing.

Now to go and read about what “Pulse Width Modulation” might be…

The Arduino Mini Camera OV2640 2MP Plus And The Pico

An Arduino Mini Camera OV2640 2MP Plus

I’m starting to realise that “compatible with the Pico” might have as much meaning as “sure it’s possible to ride a tiger”: “compatible” is not the same as “conducive” in the same way that “possible” is not the same as advisable.

Perhaps I was boosted by the “ease” with which I got the Pico and e-ink display working, but for my next project I have my sights on a wildlife camera for the garden. I realise there are plenty on the market, but they all seem to require a row of batteries, or need charging every few days, or a mains power cable. All of these are possible, but for me not ideal.

My ideal wildlife camera would be charged from solar (which may or may not be possible) and would automatically upload images to a server. Both these things mean that once I put it up somewhere I’ll never have to touch it again. (Hah!)

There seemed to be no reason why a Pico W, a motion sensor and a camera couldn’t do most of the work, and then I’d assess the viability of solar power later. Battery power, as I said, is not ideal, but not a deal-breaker. I’ll also need to make sure that if the Pico is out in the garden it can still connect to the Wifi, because I don’t want to have to store images locally.

Knowing that I had to take small steps, I decided to get the camera part working first. I bought a 2MP camera that claims compatibility with the Pico, the Arduino Mini OV2640 2MP Plus, and instantly hit the first hurdle when I tried to find code to run it with.

To cut a long story short, I found I needed to be using CircuitPython (what’s that?) instead of Micropython. That proved to be a non-issue, although slightly annoying as I was just getting used to the libraries available in Micropython.

The official example worked, once I’d followed the instructions to the letter, but all it did was display the output of the camera in a bespoke Windows app. At least it told me that the camera was wired up properly and the code on the Pico was working, but it didn’t give me much of a clue as to how to get images into a state where I could do something with them, preferably something like a JPG file, but at least a byte array.

More Googling brought back some clues, and eventually I had something that appeared to be reading from the camera. Except it was hard to tell what I was reading, as there was nowhere to save the images. CircuitPython mounts the Pico’s memory as an external drive, and that locks out any attempt in code to write to it. It’s possible to change it to writeable mode, but then it’s a pain to change back again to put more code onto it.

In my simple little mind I’d thought I could solve the problem by POSTing the output of the camera to a lightweight web server somewhere. Except the problem of base64 encoding bits reared its head. No problem, I thought: installing a base64 encoding module will fix that. Except, even a 7kb JPG caused an out of memory error when I tried to encode it. Base64 encoding seemed to be right out.

Like Inception, I needed to go deeper, so I dug into the Sockets documentation, both for Python and CircuitPython. Half a day later I was getting somewhere, managing to pipe JPG pixels straight from the camera to a tiny socket connection on my PC and save them to the hard drive. The problem was that only about one in four images seemed to be valid.

More long story to cut short, but just after I capture the image in the code I’ve added

time.sleep(1)

and that seems to have “fixed” it. Le shrug.

For posterity, here is the v1 of “click run and it’ll send a picture somewhere” code, with Wifi and IP details placeholdered, and using the Arducam library:

import time
import board
import digitalio
import wifi
import socketpool
import gc
from Arducam import *

gc.enable()

led = digitalio.DigitalInOut(board.LED)
led.direction = digitalio.Direction.OUTPUT
led.value = False

wifi.radio.connect(ssid='SSID', password='WIFI PASSWORD')

mycam = ArducamClass(OV2640)
mycam.Camera_Detection()
mycam.Spi_Test()
mycam.Camera_Init()
time.sleep(1)
mycam.clear_fifo_flag()
mycam.OV2640_set_JPEG_size(OV2640_1600x1200)

def get_still(mycam):
    mycam.flush_fifo()
    mycam.clear_fifo_flag()
    mycam.start_capture()
    once_number = 1024
    buffer=bytearray(once_number)
    count = 0
    finished = 0
    
    time.sleep(1)

    pool = socketpool.SocketPool(wifi.radio)
    s = pool.socket(pool.AF_INET, pool.SOCK_STREAM)
    s.connect(("SERVER IP ADDRESS", SERVER PORT))
    
    length = mycam.read_fifo_length()
    mycam.SPI_CS_LOW()
    mycam.set_fifo_burst()
    
    while finished == 0:
        mycam.spi.readinto(buffer, start=0, end=once_number)
        count += once_number
        s.send(buffer)

        if count + once_number > length:
            remaining = length - count
            mycam.spi.readinto(buffer, start=0, end=remaining)
            s.send(buffer[:remaining])  # only send the bytes actually read this time
            mycam.SPI_CS_HIGH()
            finished = 1
    s.close()
    gc.collect()

get_still(mycam)

And the Python server code to receive and save the image with consecutive numbered file names, in the most basic POC (piece of crap?) form possible:

from os import curdir
from os.path import join as pjoin
import socket

HOST = 'IP ADDRESS'  # IP of this server
PORT = PORT_NUMBER   # Port to listen on
photo_num = 0

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)  # allow quick restarts
    s.bind((HOST, PORT))  # bind once, then accept connections in a loop
    s.listen()
    while True:
        conn, addr = s.accept()
        with conn:
            with open(pjoin(curdir, 'photo_{0}.jpg'.format(photo_num)), 'wb') as fh:
                while True:
                    part = conn.recv(1024)
                    if len(part) <= 0:
                        break
                    fh.write(part)
            conn.sendall(b"Complete")
        photo_num = photo_num + 1

So far so feasible at least, so the next step is to prototype movement sensing. In the latest edition of “famous last words”: the movement (or PIR) sensor part looks like it might be quite easy. On to that next!

The Final E-ink Weather Display

Pico project number one is finally “shipped” with the addition of a badly made wooden frame to hold the thing and let me stick it to the wall, plugged into a USB power socket.

Let’s face it: the frame is not its strong point, but just putting the raw components on the wall would look very unfinished.

Is it actually useful? Who knows, although I can’t imagine I’ll ever save as much time from not opening my phone to look at the weather as I spent making it. Was it fun? Totally.

I’ve already discovered a couple of downsides to the frame I made (apart from how rough it looks):

  1. It’s a bit chunky and projects from the wall too far
  2. I thought a USB power cable with right angle connectors would let me really shrink the space requirements, which it has, but it’s also limited the placement options
  3. It’s fixed to the wall by stick picture hanger things, which is fine, but when I spotted a problem I wanted to fix I had to take it off and then use another pair of stickers to put it back up again. i.e. it’s a real pain in the bum to update
  4. And talking of which, if the Pico inside ever needed replacing I’d basically be taking the case to pieces to swap it out

Next time I think a longer cable so it can sit on a table or shelf rather than fix to the wall.

Now onto the next project.

A Raspberry Pi Pico Bird Camera

Hot on the heels of, if not success, then at least a lack of failure in my rather simple project to build a Raspberry Pi Pico e-paper weather display, what I’d really like to have is a camera to photograph the very infrequent visits to the feeder in the garden.

There are plenty of cameras on the market but I have something quite specific in mind. It would upload images to a server, so that I didn’t have to go and check a memory card, and ideally be solar powered so I didn’t have to go and recharge or swap batteries. Many of the available cameras seem to use disposable batteries and that just seems like a bad thing in this day and age.

At least I have a clear idea of what I want, but I don’t have a clear idea of whether it’s achievable. But I have a plan.

  1. Check I can get a Pico and camera combo working. Like my first step with e-paper, given that I don’t really know what I’m doing I might not even get a basic image out
  2. Post the photos to a Wifi-connected server, which will probably just be my desktop in the first instance
  3. Get a motion detector working
  4. Put the two together to take photos based on my own movement
  5. Test to see if bird movements trigger the motion detector, and if they do see if the image quality is good enough. I’m more interested in knowing what kinds of birds come to visit and how often than winning Wildlife Photographer Of The Year
  6. Assess power consumption and feasibility of a solar supply
  7. If solar is reasonable, decide on whether to get a retail package with a USB micro, or to go with something more home-spun, that could potentially power multiple devices in the future (because I have many more plans!)
  8. Make some kind of weatherproof case, possibly just for the “camera” part, but possibly also the solar panel (and battery) depending on size requirement. I’m as useless with my hands as I am with electronics so this might be the hardest part.

For components, as well as another Pico W, I’m going for an Arducam Mini 2MP Plus OV2640 SPI camera and a Mini PIR sensor, although since they’re so cheap I’ll give the slightly larger (yet strangely cheaper) regular PIR sensor a try too. I have a very small garden, at only 9m long, and a mesh transmitter right near the door already so Wifi range shouldn’t be an issue.

The Arducam Mini 2MP Plus OV2640 SPI camera
The smaller Mini PIR sensor

And so a new adventure begins.

Pico E-ink Weather Display: Version 2 (or 2.9?)

For my first Pico experiments I bought the smallest e-ink display I could, because it was the cheapest and who knows if I’d even pass the first hurdle of making a prototype. I’d also bought the wrong type, with a cable connection instead of a nice and easy Pico-compatible HAT. So I ended up with a little cable nest like this:

The display looked decent, barring some pixel tweaking on the icons, but I still thought that a slightly bigger display would be better at a glance. This time I bought the correct HAT type too, and 2.9 inches seemed just a bit bigger, but not too big. The display also has red as well as black, but I didn’t plan to make use of it this time around. It crossed my mind that weather warnings might be good in red, but they might also be harder to see than black on white – which kind of defeats the purpose.

To cut lots of medium-length stories short, I also discovered I had a bit more space on the 2.9 inch display, so I added sunrise and sunset to the display, as well as spacing things out a bit more. The display is 296×128 pixels, which isn’t a lot more than 250×120, but I think the pixels are also a little bit bigger. It also helps that the “height” (128 pixels, or 128 bits) is a whole number of bytes, so my rotation fudge doesn’t result in the top left of the screen being at coordinate (0,6). The 2.9b “driver” file worked straight out of the box this time too. On the other hand, if I’d bought the right display the first time I wouldn’t have learnt as much.

The HAT fits pretty nicely too:

And I’m quite happy with the end result (even if I’m not happy with the weather). I hand-tweaked a few of the images, such as the sunrise and sunset (as they’re only 16×16 and every pixel counts), but for the most part converting to 1-bit image at a 50% gray threshold looks good enough. Even with some of the text at exactly the same pixel height as on the 2.13 inch display it just looks a lot crisper:

The only thing left to do now is to try and frame it somehow so it doesn’t look like a scruffy bit of tech hanging by a USB cable in the hallway.

Optimising RAW Images For An E-ink Display

Whilst building the e-ink weather display for the Pico I’ve had to make a number of RAW images. I’ve been doing this by opening PNGs in Photoshop, converting them to grayscale, changing them to Bitmap (or 1-bit images) and then, because Photoshop won’t let me save a 1-bit image as RAW, I convert them back to grayscale and save as RAW.

For the 65×65 image for the daily weather summary this comes out to 4,224 bytes. Not a lot in the modern world, but when the Pico has such small storage space, and as I was planning to do day and night versions of each thumbnail, I needed two copies of fifteen different icons, making 123kb in total.

It was a waste of space as the e-ink display only displays 1-bit data, and therefore every image was taking eight times more storage space than necessary.

There are scripts available for creating 1-bit RAW images from PNGs, but I decided to experiment with another very clever thing that somebody else (Lucky Resistor) had written: the Micropython Bitmap Tool.

Essentially, it takes an image and turns it into a Python (or Micropython) byte string, so the image of the sun (first one above) becomes this:

bytearray(
        b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
        b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
        b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x80\x80\x80\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x7c'
        b'\xfe\xff\xfe\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x80\x80\x80\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
        b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x07\x0f\x1f\x1f\x3e\x1c\x08\x00\x80\x80\xc0\xe0\xe0\xe0'
        b'\xf0\xf1\xf1\xf1\xf0\xe0\xe0\xe0\xc0\x80\x80\x00\x08\x1c\x3e\x1f\x0f\x0f\x07\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
        b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x80\xc0\xc0\xc0\xc0\xc0\xc0\xc0\x80\x00\x00\x00\xf0\xfc\xfe\xff\x1f\x07\x03\x01'
        b'\x01\x01\x00\x00\x00\x00\x01\x01\x03\x07\x1f\xff\xfe\xfc\xe0\x00\x00\x00\x80\xc0\xc0\xc0\xc0\xc0\xc0\xc0\x80\x00\x00\x00\x00\x00'
        b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x01\x03\x03\x03\x03\x03\x01\x00\x00\x00\x00\x07\x1f\x7f\xff\xf8\xf0\xe0'
        b'\xc0\x80\x80\x80\x80\x80\x80\x80\xc0\xe0\xf0\xfc\xff\x7f\x1f\x07\x00\x00\x00\x00\x01\x03\x03\x03\x03\x01\x01\x00\x00\x00\x00\x00'
        b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x40\xe0\xf0\xf8\xfc\x7c\x3c\x08\x00\x00\x01\x01'
        b'\x03\x03\x07\x07\x87\x87\x87\x07\x07\x03\x03\x01\x01\x00\x00\x08\x3c\x7c\xfc\xf8\xf0\xe0\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
        b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x01\x00\x00\x00\x00\x00\x00\x00\x00'
        b'\x00\x00\x00\x00\x1f\x7f\x7f\x7f\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
        b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
        b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
        b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
        b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
        b'\x00\x00\x00\x00\x00\x00\x00\x00\x00')

The advantage of this is that it takes a lot less space.

The disadvantage, as I found, is that loading images, exporting them, and pasting the results into an array to do image lookups was really boring. And in the end, once I’d written a bit of code to look up which bytearray to use instead of which image file to load, I had a 77kb file – i.e. I’d only saved about 40% versus the grayscale versions.

The lesson, I guess, is that maybe next time I should learn how to make 1-bit RAW images with something like ImageMagick. The saving was so negligible for the effort it took that for the tiny images on the display I just stuck with my 8-bit RAW files – all 46.8kb of them.
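For what it’s worth, packing an 8-bit grayscale RAW down to 1-bit is only a few lines of Python – a sketch, assuming a simple 50% threshold and an image width that’s a multiple of eight:

```python
# Pack 8-bit grayscale pixels (one byte each) into 1-bit data
# (eight pixels per byte, most significant bit leftmost),
# thresholding at 50% gray.

def pack_1bit(gray_bytes, threshold=128):
    packed = bytearray()
    for i in range(0, len(gray_bytes), 8):
        byte = 0
        for bit, pixel in enumerate(gray_bytes[i:i + 8]):
            if pixel >= threshold:
                byte |= 0x80 >> bit
        packed.append(byte)
    return bytes(packed)

# a 16-pixel row: 8 white pixels then 8 black ones becomes just two bytes
row = bytes([255] * 8 + [0] * 8)
print(pack_1bit(row))
```

At that rate the 4,224-byte grayscale icon would pack down to 528 bytes.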

Customising The E-ink Weather Display

With all the pieces in place to allow me to draw to the e-ink screen, I made some 8-bit RAW images and wrote values to the display. It worked, but it looked a bit crusty, not least because the Framebuffer.text method for putting text onto the screen is a fixed size, and a fixed font. Remember that I’m aiming for something like this, which uses a few different font sizes:

As with so many things, this has been solved before, with a piece of work that I think is absolute genius, going by the catchy title of “font_to_py”. Basically, it’s a Python command-line tool that takes a font file (like a TTF) and some parameters, such as pixel size, and outputs a Python file containing all the data necessary to render the font as byte data. The files are very small too, with a 20-pixel font “compiling” down to 17kb.

Then you just import the Python files, which in my case were three different sizes of Calibri, and the functions to draw the font to a Framebuffer:

import calibri_36, calibri_15, calibri_12
from writer import Writer

And to draw to the display something like:

wri = Writer(display_buffer, calibri_36)
Writer.set_textpos(display_buffer, text_top, text_left)
wri.printstring(max_daily, True)

That prints the value of the maximum daily temperature to the screen (via the Framebuffer). Somewhat strangely, the coordinates are passed in as (top, left) rather than (x, y). I also tweaked writer.py slightly to make sure it handled transparency (i.e. 0 pixels), by changing:

self.device.blit(fbc, s.text_col, s.text_row)

to

self.device.blit(fbc, s.text_col, s.text_row, 1 if invert else 0)

in the _printchar method.

The fonts still looked a little weedy, though, and rather than make a bold version of Calibri my own slightly hacky method was just to print the same characters four times over, offset by 1 pixel each time:

Writer.set_textpos(display_buffer, text_top, text_left)
wri.printstring(max_daily, True)
Writer.set_textpos(display_buffer, text_top, text_left - 1)
wri.printstring(max_daily, True)
Writer.set_textpos(display_buffer, text_top + 1, text_left)
wri.printstring(max_daily, True)
Writer.set_textpos(display_buffer, text_top + 1, text_left - 1)
wri.printstring(max_daily, True)

The results were good enough.

For degree symbols I used Framebuffer.ellipse, and to calculate where to draw the symbol, writer.py even provides a very useful “how wide is this piece of text?” function:

text_width = get_string_display_width(wri, max_daily)
display_buffer.ellipse(text_left + text_width + 4, text_top + 5, 4, 4, 0)
display_buffer.ellipse(text_left + text_width + 4, text_top + 5, 3, 3, 0)
display_buffer.ellipse(text_left + text_width + 4, text_top + 5, 2, 2, 0)

Note that the ellipse (circle) is also drawn three times to create a faux bold effect.

The last part was some trigonometry as I wanted to display wind direction as an arrow in a circle.
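The trigonometry boils down to converting a wind bearing into a line from the circle’s centre. A sketch, with invented centre and radius values; the twist is that a bearing of 0° points up (north), and screen coordinates have y increasing downwards:

```python
import math

# Convert a wind bearing (degrees clockwise from north) to the end point
# of an arrow drawn from the centre of a circle. On screen, "north" is
# straight up, i.e. (cx, cy - radius).

def arrow_end(cx, cy, radius, bearing_degrees):
    rad = math.radians(bearing_degrees)
    x = cx + radius * math.sin(rad)
    y = cy - radius * math.cos(rad)
    return round(x), round(y)

# invented centre (100, 60) and radius 20
print(arrow_end(100, 60, 20, 0))   # north: straight up
print(arrow_end(100, 60, 20, 90))  # east: straight right
```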

Putting it all together resulted in something usable, albeit with some pixel tweaking required to sharpen some of the smaller images:

An epaper display showing temperature, weather summary, wind direction and rainfall duration

The last thing was to schedule it to update every hour. I experimented with lightsleep and deepsleep, which are supposedly low-power modes, but the Pico just never seemed to wake up (and from Googling that appears to be a known issue), so I fell back on a plain sleep:

HOURS_SLEEP = 1
sleep_time = 60 * 60 * HOURS_SLEEP
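The drawback of a fixed sleep is drift: each refresh takes time, so the update creeps later every cycle. One way around it would be to compute the sleep to the top of the next hour instead – a sketch:

```python
import time

# Sleep until the top of the next hour instead of a fixed 60 minutes,
# so a twenty-second refresh doesn't make the update time drift.

def seconds_until_next_hour(now=None):
    if now is None:
        now = time.time()
    return 3600 - (int(now) % 3600)

# e.g. at 37 minutes and 5 seconds past the hour, there are 1375 seconds to go
print(seconds_until_next_hour(37 * 60 + 5))
```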

Having made something functional, I now had a list of things I wanted to improve:

  1. The size. It’s a little bit too small. I decided to order a 2.9inch e-paper display, and this time get one with the right HAT
  2. The four hour summary is useful, but the weather at 7am is irrelevant once that time of day is passed. I wanted to change it to a rolling twenty-hour period
  3. What happens if I hit an error? I might think the weather’s the same every day and not notice the program has crashed. I needed to show the date/time of the last update, and the “feels like” temperature was probably the least useful metric to sacrifice to make room for it
  4. The sleep function updated after an hour, but every update took maybe twenty seconds. If I was going to show update time, I wanted it to be on the hour, not to drift by a couple of minutes per day

I placed an order for another Pico (W, with headers, of course) and the larger e-ink display and waited.

Show Me The Weather

After connecting the e-ink display to the Pico and managing to get it to display in landscape mode I felt I’d solved the unknowns in making this e-ink weather display. Now it was time to decide what I was going to show, and how I was going to show it.

One Python example I’d found used a weather API called OpenWeather, so I went and signed up. Yet no matter what I did in terms of requesting API keys, even waiting a day for activation, I was never able to make a valid request. In the meantime, with a little more Googling, I found Open-Meteo’s API. It turned out to be very easy to work with and had everything I’d need.

I chose some data fields I wanted to display and opened Photoshop to start a basic layout.

Design for an e-ink weather display showing maximum and minimum daily temperature, overall weather, wind gusts, and a summary for every 4 hours

I wanted minimum and maximum temperatures during the day, a nice summary icon, and average wind speed, and then a bit more info about how it would feel, which is where the 9 degrees (“feels like” temperature), 20mph (wind gusts), and 2hrs (duration of rainfall during the day) came from. Below that is a little summary of what the weather looks like at four-hour intervals.

The API call is just a request to a URL, such as this:

https://api.open-meteo.com/v1/forecast?latitude=XXXXX&longitude=XXXXX&hourly=temperature_2m,windspeed_10m,winddirection_10m,windgusts_10m,weathercode&daily=temperature_2m_min,temperature_2m_max,sunrise,sunset,windspeed_10m_max,windgusts_10m_max,winddirection_10m_dominant,precipitation_hours,apparent_temperature_max,weathercode&timezone=GMT&windspeed_unit=mph

You can request more fields in either the hourly or daily summary, and the documentation was pretty clear too.

The return is JSON and easily parsed with MicroPython. I wrote a small module to do it all:

import urequests

LATITUDE = 'XXXXX'
LONGITUDE = 'XXXXX'

BASE_URL = 'https://api.open-meteo.com/v1/forecast?'
URL = BASE_URL + 'latitude=' + LATITUDE + '&longitude=' + LONGITUDE + '&hourly=temperature_2m,weathercode&daily=temperature_2m_min,temperature_2m_max,sunrise,sunset,windspeed_10m_max,windgusts_10m_max,winddirection_10m_dominant,precipitation_hours,weathercode&timezone=GMT&windspeed_unit=mph'

# API documentation at https://open-meteo.com/en/docs
class Meteo:
    def __init__(self):
        self.json = None

    def get_data(self):
        # Make the GET request and parse the JSON response
        response = urequests.get(URL)
        if response.status_code != 200:
            print("Error")
            print(response.status_code)
            print(response.content)
            response.close()
            raise Exception("Error getting weather data " + str(response.status_code))
        else:
            self.json = response.json()
            response.close()
            return self.json

I’d call it from a function, just so it was easier to test:

def getWeatherData():
    weather = meteo.Meteo()
    weather.get_data()
    return weather

And accessing the data is just like accessing any Python object, e.g. to get the maximum daily temperature:

max_daily = str(round(weather.json["daily"]["temperature_2m_max"][0]))
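The same indexing works for any field in the response. For illustration (and so it runs without a network or a Pico), here's the same lookup against a trimmed sample of the JSON structure Open-Meteo returns, standing in for weather.json:

```python
# A trimmed sample of Open-Meteo's JSON, standing in for weather.json
data = {
    "daily": {
        "temperature_2m_min": [3.1, 2.4],
        "temperature_2m_max": [9.4, 8.7],
    },
    "hourly": {
        "temperature_2m": [3.5, 3.2, 3.1, 4.0],
    },
}

# Index [0] is today; subsequent indexes are the following days
min_daily = str(round(data["daily"]["temperature_2m_min"][0]))
max_daily = str(round(data["daily"]["temperature_2m_max"][0]))
first_hours = data["hourly"]["temperature_2m"][:4]
```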

So far so simple, and I was quickly on to drawing all of this to the display, which is the subject of the next post.

Micropython E-ink Display Rotation

By default, sending pixels from a Raspberry Pico to a Waveshare e-ink display will result in a portrait image. In the case of the 2.13inch display I started with, this means a screen of 122 pixels wide by 250 pixels high.

But what if you want to display in landscape mode? As someone who’s used to the simple world of CSS, it was actually fun to get back into the world of bits and pixels again.

Just like the ZX Spectrum, where I started programming, the display is stored as a list of bytes where each byte is eight consecutive pixels, either set to 1 to display a dark pixel or 0 for white (or clear).

In Micropython, this is stored as a byte array. So the code that sets up the display looks like this:

self.buffer_black = bytearray(self.height * self.width // 8)

Thankfully, there’s a helper in the shape of the framebuf module’s FrameBuffer class, which makes changing the value of these pixels much easier. It contains methods to write text, lines, rectangles and circles and, most usefully, to draw images at arbitrary coordinates using the blit method.

Not only does blit make it easier to draw, say, a 50×50 image onto the 122×250 display, but it also makes it much easier to draw an 8-bit image onto the 1-bit display. You start by setting a Framebuffer object with its data source as the byte array, such as:

self.imageblack = framebuf.FrameBuffer(self.buffer_black, self.width, self.height, framebuf.MONO_HLSB)

Where the first argument is the byte array, width and height are self-explanatory, and framebuf.MONO_HLSB means that this framebuffer is a mono (1-bit) image with eight horizontal pixels stored in each byte.
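The MONO_HLSB layout can be pictured without any MicroPython at all: each byte covers eight horizontal pixels, most significant bit leftmost. A plain-Python sketch of the addressing (width chosen as a multiple of 8 to keep the indexing simple):

```python
# MONO_HLSB addressing, shown without framebuf so it runs anywhere:
# each byte holds 8 horizontal pixels, most significant bit leftmost.
WIDTH, HEIGHT = 16, 4
buf = bytearray(WIDTH * HEIGHT // 8)

def set_pixel_hlsb(buf, x, y, width):
    byte_index = (y * width + x) // 8   # which byte holds this pixel
    bit = 7 - (x % 8)                   # MSB is the leftmost pixel
    buf[byte_index] |= 1 << bit

set_pixel_hlsb(buf, 0, 0, WIDTH)   # top-left pixel -> MSB of byte 0
set_pixel_hlsb(buf, 9, 1, WIDTH)   # row 1, x=9 -> bit 6 of byte 3
```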

To draw an image on top, assuming you have the pixel data in a variable called image_buffer, you can do something like:

self.imageblack.blit(framebuf.FrameBuffer(image_buffer, width, height, framebuf.GS8), x, y, key)

Where width and height are the width and height of the image, framebuf.GS8 signifies an 8-bit grayscale image, x and y are the coordinates to draw to, and key tells the blit method which pixels in the image it should treat as transparent. This allows you to draw just the black pixels onto the screen without the “white” pixels (or 0 bits) overwriting anything that’s already there. Because the frame buffer takes the image_buffer bytearray as a parameter and changes its pixels by reference, any changes to self.imageblack (a FrameBuffer object) are applied to the bytearray in self.buffer_black – which holds the actual pixel data.
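The transparency behaviour of key can be mimicked in plain Python: while copying, any source pixel equal to key is skipped, so the destination shows through. A rough sketch with 8-bit (one byte per pixel) buffers, treating 0xFF as "white":

```python
# A plain-Python sketch of blit's key behaviour on 8-bit buffers:
# pixels equal to `key` are skipped, leaving the destination untouched.
def blit_with_key(dest, dest_w, src, src_w, src_h, x, y, key):
    for row in range(src_h):
        for col in range(src_w):
            pixel = src[row * src_w + col]
            if pixel != key:     # key pixels are transparent
                dest[(y + row) * dest_w + (x + col)] = pixel

dest = bytearray([0xAA] * 16)        # 4x4 background
src = bytearray([0xFF, 0x00,         # 2x2 image: 0xFF = "white"
                 0x00, 0xFF])
blit_with_key(dest, 4, src, 2, 2, 1, 1, key=0xFF)
```

After the call, only the two 0x00 pixels land on the background; the 0xFF pixels leave it untouched.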

So far so simple as far as drawing an image to the screen goes, assuming you’re still working in portrait mode. I made myself a helper function to load raw files from the on-board memory:

def drawImage(display_buffer, name, x, y, width, height, background=0xFF):
    # get_image_data returns a cached FrameBuffer for this image, if one exists
    image_data = get_image_data(name)
    if image_data:
        display_buffer.blit(image_data, x, y, background)
    else:
        image_buffer = bytearray(width * height)
        filename = 'pic/' + name + '.raw'
        with open(filename, "rb") as file:
            position = 0
            while position < (width * height):
                current_byte = file.read(1)
                # if eof
                if len(current_byte) == 0:
                    break
                # copy to buffer
                image_buffer[position] = ord(current_byte)
                position += 1
        display_buffer.blit(framebuf.FrameBuffer(image_buffer, width, height, framebuf.GS8), x, y, background)

Drawing a landscape image instead of portrait should be as simple as:

  1. Set up a FrameBuffer with a width equal to the e-ink display height, and a height equal to the e-ink display width
  2. Draw to the Framebuffer
  3. Transpose columns of pixels into rows in a new bytearray
  4. Send to the e-ink display

It seemed like exactly the sort of thing that someone would already have tackled, and so it proved. So with some Googling and adjustment of width/height variables to suit the display I was working with I came to this:

def rotateDisplay(display):
    epaper_display = bytearray(epd.height * epd.width // 8)
    x = 0; y = -1; n = 0; R = 0
    for i in range(0, epd.width // 8):
        for j in range(0, epd.height):
            R = (n - x) + (n - y) * ((epd.width // 8) - 1)
            epaper_display[R] = display[n]
            n += 1
        x = n + i + 1
        y = n - 1
    return epaper_display

And I have to confess I didn’t worry too much about the details. Not until I ran it and the output was complete garbage.

It took a while to work out what I was doing wrong, but once I found the problem the solution made complete sense: I was inputting a Framebuffer stored as MONO_HLSB but the function above transposed the position of bytes, not individual bits. In other words, chunks of 8 pixels were being moved to the right place on the screen, but forming a column rather than a row. No wonder it looked bad.

The solution was simple: start with a Framebuffer of the type MONO_VLSB, where the bytes represent a column of 8 pixels, not a row. Then the transposition naturally mapped to MONO_HLSB.
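For comparison with MONO_HLSB, a MONO_VLSB byte covers a vertical strip of eight pixels with bit 0 at the top, and successive bytes step horizontally across the display. A plain-Python sketch of the addressing:

```python
# MONO_VLSB addressing: each byte is a vertical run of 8 pixels,
# bit 0 at the top; successive bytes step right across the display.
WIDTH, HEIGHT = 16, 16
buf = bytearray(WIDTH * HEIGHT // 8)

def set_pixel_vlsb(buf, x, y, width):
    byte_index = (y // 8) * width + x   # which byte holds this pixel
    bit = y % 8                         # bit 0 is the topmost pixel
    buf[byte_index] |= 1 << bit

set_pixel_vlsb(buf, 0, 0, WIDTH)    # top-left -> bit 0 of byte 0
set_pixel_vlsb(buf, 3, 10, WIDTH)   # y=10 -> second band, bit 2 of byte 19
```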

I ended up with something like this:

display = bytearray(epd.height * epd.width // 8)
display_buffer = Screen(display, epd.height, epd.width, framebuf.MONO_VLSB)
display_buffer.fill(0xff)

# ... draw something here ...

epaper_display = rotateDisplay(display)

where Screen was an extension of the Framebuffer class, necessary because the width and height parameters weren’t accessible in a native Framebuffer object. It just felt nicer to store them “in place” rather than pass another parameter around.

class Screen(framebuf.FrameBuffer):
    def __init__(self, display, width, height, encoding):
        self.display = display
        self.width = width
        self.height = height
        self.encoding = encoding

The final wrinkle was that the e-ink display was 122 pixels “wide” (or now 122 pixels high, as I was trying to work in landscape mode), which doesn’t form a whole number of bytes. Because of the way I was treating the screen, it meant my top-left origin was at the coordinates (0, 6), which felt a bit nasty but was something I could live with. The alternative would have been to transpose the whole thing down 6 pixels using another FrameBuffer, but frankly I couldn’t be bothered.
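That 6-pixel origin falls straight out of the byte alignment: 122 pixels need 16 bytes per row, but 16 bytes cover 128 pixel positions, leaving a gap of 6. A quick check:

```python
# 122 pixels don't fill a whole number of bytes; rounding up to the
# next byte boundary gives 128 pixel positions, leaving a 6-pixel offset.
display_pixels = 122
bytes_per_row = (display_pixels + 7) // 8   # ceiling division -> 16 bytes
padded_pixels = bytes_per_row * 8           # 128 pixel positions
offset = padded_pixels - display_pixels     # the 6-pixel gap
```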