Jul 20

Google Meet Mute with Stream Deck and Companion

I recently found Bitfocus Companion, and it has done nothing but provide inspiration for new button ideas to control things I'd normally have to faff about with a mouse to do.

Google Meet is our video platform of choice at work, and I was fed up with losing the window and not being able to unmute myself quickly when someone asked a question in the middle of my presentation or meeting.

So a button on the Stream Deck XL was required to alleviate this drudgery.


I’ve used BetterTouchTool for ages to provide an interface for controlling my Mac from a phone or tablet when projecting films on the wall, and I've used its screen snapping features for years to arrange program windows quickly in multi-screen workflows.

But now it was BTT's turn to provide the tooling: a series of actions that would find the Google Meet window running the video call, switch to it, and send the ‘mute’ toggle keyboard shortcut. There is a preset for a named trigger that you can download at the end of this post.

No matter which of your browser windows is hiding the Google Meet tab amongst a million others, we’ll find it and bring it to the front, focussed… and unmuted.

Working back up the signal chain… BTT comes with a web server which can be enabled to listen for triggers on an HTTP port. You can copy the HTTP request for each trigger in the BTT admin, and paste that into the Companion config.

Companion has a generic HTTP plugin which can trigger BTT on the target machine. Simply paste your trigger URL into a new button using the HTTP instance on the key-down action, and job's a good'un.
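The request Companion sends is just a plain HTTP GET, so you can test it from any machine that can reach the Mac. A sketch of what the URL looks like (the port and trigger name here are made up, and the endpoint shape is from memory: copy the exact URL from the BTT admin rather than typing it by hand):

```shell
host="127.0.0.1"          # the Mac running BTT (use its LAN address from Companion)
port="12345"              # hypothetical port configured in BTT's webserver settings
trigger="GoogleMeetMute"  # hypothetical named trigger

url="http://${host}:${port}/trigger_named/?trigger_name=${trigger}"
echo "$url"
# curl -s "$url"   # uncomment to fire the trigger for real
```

If the curl works from your Companion machine, pasting the same URL into the HTTP instance's button config will work too.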

Download the BTT preset here.

Apr 20

Companion Pi Touch Screen

Bitfocus Companion is an awesome way to make buttons on a keyboard device control other software or hardware. My interest in this is because we run church services with very minimal staff. Usually the one person on the sound desk is the same person operating ProPresenter.

We are a small church which has to set up and tear down each week, and I’ve currently got our setup down to three wheeled units to roll out, and three cables to plug in for all sound and vision. More about that and the desk I built to hold it all in the future, because right now we are doing everything from home.

Today is all about using Companion to activate a video switcher and handle running shortcuts in Zoom and ProPresenter.

Basic overview

Installation steps.

I have linked to key documents where necessary. Some of these things are easy to find with a quick Google search, so they don’t have links, but the description should be enough to get you there.

  • Attach all the hardware together, but don’t put it in the case yet.
    • I used a separate power supply for the pi and touch screen.
  • Write the Companion Pi image to a micro SD card.
  • Insert the SD card and boot it all up.
  • Then use sudo raspi-config to change the timezone back to Europe/London (clearly this is an optional step, but I do like my time to show correctly on my Companion buttons!)
  • Add the following to the /boot/config.txt file: lcd_rotate=2. This will rotate the touch screen display 180 degrees so that it is oriented correctly for the case.
  • Then install chromium-browser and set it to run in kiosk mode. I followed the excellent instructions from desertbot.io, though my first step in those instructions was already complete as I had the Companion Pi image installed and running.
  • For your Kiosk URL use http://localhost:8000/emulator and this will launch the emulator every time the ‘kiosk’ launches.
  • And that’s it, you should have the touch screen ready to use with the companion 32 button emulator.
  • Finally I made some little feet to hold the screen at the right angle. Some T brackets bent to the right angle did the job here, and a couple of found screws that happened to be the right size to fit the holes in the case.
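The configuration steps above boil down to a handful of commands run on the Pi itself. This is a sketch (the raspi-config menu path and the rotate option are from memory, so double check against the Raspberry Pi documentation for your OS version):

```
sudo raspi-config                # Localisation Options -> Change Timezone -> Europe/London
echo "lcd_rotate=2" | sudo tee -a /boot/config.txt   # rotate the official display 180 degrees
sudo apt-get update && sudo apt-get install -y chromium-browser
sudo reboot                      # config.txt changes take effect on boot
```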

Things to do

  • Screen size. The emulator page is just a little too wide and tall for my liking, which causes scroll bars to be visible. I can’t seem to find a way to get rid of these without altering the emulator page code in Companion Pi, and I didn’t want to go faffing with the Companion code.
  • Boot up seems patchy; sometimes it’s fine and sometimes I get a flashing activity light on the Pi which suggests that it couldn’t read a file from the SD card. I suspect the SD card is flaky, so I might try another one, but I don’t have one spare at the moment…

Dec 19

Create directories for months of year

I wanted a way to quickly create directories for each month of the year. I wanted them named using a human readable name, but with a sort order that kept them in month order.

Using Bash 4 and a pretty standard date formatting command gives us the command you can see in the gist.
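The command in the gist is along these lines (a sketch, assuming GNU date; the zero-padded month number keeps the sort order while the month name keeps it human readable):

```shell
# Create 01-January ... 12-December in the current directory
year=2019
for m in $(seq -w 1 12); do                      # seq -w zero-pads: 01, 02, ... 12
    name=$(date -d "${year}-${m}-01" +"%m-%B")   # e.g. "01-January"
    mkdir -p "$name"
done
```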

Apr 19

Giving and taking criticism

With love.

That’s the high level summary, but what does that mean in practice? And how did I get to that conclusion anyway?

In my first arts degree we had regular ‘crits’ at the end of each project brief. We’d lay out our wares in the corridor, supporting work and final pieces, and our work was marked by the lecturers. After the marking we’d all congregate in the corridor, and watch as each student was either pulled to shreds or had their ego massaged. Whether this was a good or bad thing was always highly subjective and ALWAYS resulted in several students going back to their workspaces in near-tears.

As a result I spent 3 years of my life learning that there is a good way to criticise someone, and some very nasty bad ways. I would often find myself being one of the people who would go and offer support to those who had been torn to shreds. Sometimes I agreed with the criticism but not how it was delivered, and sometimes I did not agree with the criticism either, and felt it was my duty to give a different perspective. Here is what those years taught me:

  1. Everyone needs encouragement. If you cannot find anything encouraging to say then you are not qualified to give criticism. Why? Because encouragement is the thing that motivates the person being criticised to take action, and without encouragement you simply demotivate and demoralise them.
  2. Accept that everyone makes mistakes. If you are the giver, this means you have to remember that this could so easily be you and deliver it humbly and preferably with real examples of where you encountered a similar situation and how you dealt with it. If you are the taker, you need to get to a point where you realise that the person giving you the criticism is trying to help you shortcut a learning they made before.
  3. Be constructive. If something needs to change, and you can see why, but someone else can’t, then you have to help them see why the criticism you are delivering is necessary. This might be by explaining how it is having an impact and give them a way to resolve it for themselves.
  4. Let them take control. With the best will in the world, it is rare that you can help someone who does not want to be helped. Often people get to a point where they feel like they have lost control and that there is no point to what they are doing. So be clear that they are free to ignore your advice and make their own decisions, and that you will support them in that as best you can.
  5. With love. When you love someone you want the best for them. You want to see them grow, to flourish, to bloom. You want to see them achieving their dreams and being the most amazing person they can be.

If everyone gave criticism with love we’d all have a much better time taking it. We’d be able to move on more quickly and have the input we need to work stuff out for ourselves.

If everyone took criticism with love, or looked for the love in the criticism, we’d know that the other person was doing their best to use their life experiences to help us. If we can’t see the love in the criticism then we shouldn’t take it on board without evaluating it carefully, as it may be meant to hurt us or hold us back.

It’s not easy.

Oct 15

Raspberry Pi as a Door Maid

I regularly work from home, and because there has been a great deal of house renovation to be done, there are plenty of deliveries to wait in for. I work at the back of the house, and it being such a solidly built house with blockwork walls throughout – no plasterboard stud walls here – it’s hard to hear people knock at the front door. Also, the doorbell doesn’t work.

Before I hear you say “why don’t you fix the doorbell?”: we are in the middle of doing that bit, the rewiring hasn’t happened yet, and the wireless battery-powered doorbell has been as useful as a chocolate teapot.

So as I had the parts to hand here’s what happened…

The camera

Parts:

  • 1 USB webcam
  • 1 Raspberry Pi
  • 1 wireless USB dongle (or you can plug into your network using a network cable)
  • 1 USB powered hub
  • A power brick to power the hub
  • Another to power the Pi
  • Another device to use as the door maid – I have used several…


  • Plug the USB webcam and wireless dongle into the hub.
  • Plug the hub into the Raspberry Pi.
  • Plug the power leads into both and switch on.

RPi and attachments

I am going to assume that you have already got your Raspberry Pi bootable with the Raspbian operating system, and that your wireless dongle is connected to your wireless network. There are plenty of guides ‘out there’ to get you to the point of a bootable, workable Raspberry Pi that is networked.


Put the webcam somewhere where it has a view of the area outside the front door. (Or, of course, point it at anything else you want to be able to see.) You probably want to avoid getting too much sky in your shot, especially when the sun might pass through the field of view of your webcam. You also might want to avoid getting a busy road in the webcam’s field of view if you expect to use motion detection; otherwise motion will be detected every time a vehicle goes by!


I am running some software called motion, a program that monitors the video signal from cameras. It has a couple of useful features, like motion detection and the ability to take time-lapse videos, but the main one I use is the streaming of the video feed to other devices.

Full details for installing motion and basic setup can be found here: http://pingbin.com/2012/12/raspberry-pi-web-cam-server-motion/
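For reference, the handful of settings in /etc/motion/motion.conf that matter for this setup look something like the following. The option names here are from memory and have changed between motion versions (newer releases use stream_port/stream_localhost), so check the documentation for the version you install:

```
daemon on              # run motion in the background
framerate 2            # frames per second to capture from the webcam
webcam_port 8081       # port serving the MJPEG stream to other devices
webcam_localhost off   # allow machines other than the Pi to view the stream
```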

Consuming the feed:

Nov 14

Reading Lives – Beyond the book

Explore how reading has impacted people’s lives.

Over the summer and early autumn I had the opportunity to continue work on the Reading Lives project. The project has grown over the last few years from a discussion at a ‘hack-day’ about a column of data in a survey result-set that nobody knew what to do with, to a web based application that allows people to explore the answers given to the question “What role has reading played in your life?”. The fact that you are reading this now suggests that you too could answer that question, and share a short summary of the role that reading has played in your life.

About the project

The project has become what it is through some funding from the Arts and Humanities Research Council via the CATH (Collaborative Arts Triple Helix) project led by do.collaboration at the University of Birmingham in partnership with the University of Leicester. CATH consists of several teams, each of which is made up of a developer, an arts organisation and researchers. Our team consists of researchers Danielle Fuller (University of Birmingham) and DeNel Rehberg Sedo (Mount Saint Vincent University, Canada); myself, Tim Hodson (the developer); and Writing West Midlands (the arts organisation).

Reading Lives word cloud at the Birmingham Literature Festival

At the time of writing this, the project is not only allowing people to explore the existing survey answers, but to contribute their own answers. The app presents a user profile which people can fill out with their own survey answers. Recently the app featured at the Birmingham Literature Festival, and was seen on the big screen at several of the events.

About the Data

The Answers, as I call them, are the answers to the question “What role has reading played in your life?”. Each answer is analysed for its word content using a (relatively) simple algorithm called Term Frequency – Inverse Document Frequency. This algorithm allows me to decide how important a word is in a particular Answer based on its frequency in that Answer and its frequency in the corpus of all Answers. To quote Wikipedia: “The tf-idf value increases proportionally to the number of times a word appears in the document, but is offset by the frequency of the word in the corpus, which helps to control for the fact that some words are generally more common than others.” This calculated importance of words is used to build a word cloud which acts as an alternative way to explore the frequently occurring themes of people’s Answers. We also have demographic data for the Answers, collected through the original survey. The plan is to use this to allow further empathetic connections between the viewer of the app and the original Answers, but we haven’t got to build that bit yet :).
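For the curious, the weighting just described is the standard tf-idf formula. In the usual notation, for a term t in an Answer d drawn from the corpus D of all Answers:

```latex
\mathrm{tfidf}(t, d, D) = \mathrm{tf}(t, d) \times \log \frac{|D|}{|\{d' \in D : t \in d'\}|}
```

So a word scores highly when it appears often in one Answer but in few Answers overall, which is exactly what makes it a good word-cloud candidate.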

About the app (warning – gets technical!)

As developer to the project I have been exploring several different ways to ‘do something’ with the answers. This exploration has tried out a number of different technologies with a view to finding a happy medium between ease of use and flexibility. I have also deliberately challenged myself by forcing myself to learn something new.

The app is written in HTML, CSS and Javascript, and uses the AngularJS framework to provide the app structure. There is a whole heap of build streamlining provided through use of Node, Bower, Less and Grunt, and I’m using Live Reload to enable quick iterations in the browser when developing.

For the backend I am using Firebase (an event driven cloud database) which updates when data in my app updates, and — with Firebase’s Angular integration — any variables bound in Angular can also be bound to Firebase. This three way binding means that as soon as a client app updates some data, it is sent back to the cloud and on to any other clients that are viewing the same bit of data. Each client is always kept up to date as soon as the Firebase server is updated. This kind of event driven database makes it really easy to create both a realtime app and an app which is more tolerant of bad network connections.

I chose to store all my assets for the app in Amazon’s Simple Storage Service (S3), so that Firebase and S3 are the only external services the app needs. Everything else runs in your browser. Although there were some late nights to get everything ready for the key events, it has been a lot of fun to build.

Apr 14

RPi Sensor Network – Collecting the data


The realtime sensor display live!

In a previous post I talked about how I put together some temperature sensors to log temperature in the green house and lounge. The sensors use XRF radio modules to send the data back to a Raspberry Pi (also sporting an XRF module) and are running from a 3.3v button cell battery.

The XRF module on the Raspberry Pi is sending the messages from the sensors to the RPi’s serial port, and this is where we start to talk about code…

The plan was to build a realtime display of the data from the temperature sensors.  You can now see the live temperature from our green house and lounge, along with a history of readings over the last 1, 12 and 24 hours.

The code I used is available in a public Github repo, but should be considered ‘alpha’ code, in that it is not tested end to end or from build to deployment. So use it at your own risk, and with the assumption that you’ll have to tinker with it to get things running for you.

The steps below give an overview of the architecture, and each of these steps is explained in more detail in the sections which follow. However, this is not intended to be an in-depth how-to guide.

  1. Sensor sends LLAP message to RPi serial port
  2. A Python script listens on the serial port for new messages
  3. LLAP Messages from the sensors are parsed into a simple JSON document which is posted to a Firebase database
  4. Firebase is an event driven database where all additions and updates fire events.
  5. An AngularJS app (served from Amazon S3 backed storage) shows the latest readings in a live updating display by listening to those Firebase events.

Sensor Sends LLAP messages to RPi serial port.

This is the easy bit, as the hard work of programming the sensors is already done for you by the Ciseco people.  The sensors send messages using a simple 12 character message system called Lightweight Logical Application Protocol. The actual sensor setup and connection to the RPi is covered in the previous article.

Python script listens to serial port

I wrote a small python module to parse LLAP messages into a simple JSON document which could then be stored or otherwise manipulated as necessary. The module is available as part of the Github repo. The LLAP python module treats every message received as an event, and if you have registered a callback to a particular type of LLAP message, every time that message is received your callback will be notified and whatever your callback does – it will do! The LLAP module simply deals with the parsing of a string of messages captured from a serial port, and passes those messages on to your callbacks. This means that you can react to each temperature reading or battery low message as and when that message arrives.
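For a flavour of what the parsing involves (sketched in shell here for brevity, though the real module is Python): an LLAP message is 12 characters — an ‘a’, a two-character device ID, then the payload padded out with ‘-’. The message content below is made up for illustration; check the LLAP documentation for the real payload formats.

```shell
msg="aGHTMPA21.5-"        # hypothetical message: 'a' + device ID 'GH' + payload
id="${msg:1:2}"           # two-character device identifier
payload="${msg:3}"        # everything after the ID
payload="${payload%%-*}"  # strip the '-' padding (naive: assumes no '-' in the reading)
echo "sensor=$id reading=$payload"
```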

It is up to your callbacks to decide whether to make a note of the time the message was received, and what action to take based on the message. But using this method it would be simple to have a temperature change over or under some threshold trigger some action. For example, if it gets too warm in the green house, a motorised window opener could be triggered to let in some fresh air.

The code which listens to the serial port and registers the callback is up to you to write, but you can see the code in the Github repo which I am using to listen to the sensor messages.

LLAP messages sent as JSON documents to a Firebase database


The readings as seen in the Firebase database.

This is where it starts to get fun!

Firebase is an awesome cloud based database which allows you to post and get data using simple HTTP requests. Because the database is event driven, and because it already has deep integrations with AngularJS, you can quickly build a responsive, data driven site which instantly responds to many events happening in different browsers all over the web. For our purposes — showing a live updating latest temperature in a webpage — this is ideal.

The python code mentioned in the previous section simply takes the parsed LLAP message, adds a timestamp, a priority (used for ordering the result sets in Firebase) and a reading id which is just the timestamp float with the period (.) replaced with the sensor id (you can’t have periods in a Firebase key!). The resulting JSON object is then posted as a reading to the Firebase database.
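As a sketch of that step, posting a reading to Firebase’s REST API is a single HTTP POST of the JSON document. The field names and URL below are made up for illustration; the real ones live in the Python code in the repo.

```shell
# Build the JSON reading document (illustrative field names only)
reading='{"sensor_id":"GH","temperature":21.5,"timestamp":1396094400}'
echo "$reading"

# POSTing it would look like this (replace <your-app> with your own
# Firebase instance before uncommenting):
# curl -s -X POST "https://<your-app>.firebaseio.com/readings.json" -d "$reading"
```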

Firebase is event based and fires events that you can monitor

Every time a new reading is added to the database by the python script, Firebase fires off some events to any other clients which are listening for those events.

This means that we can write a web app which listens to those events and updates its display with the new readings. In effect, we get a realtime display of the sensor readings.

So the next step is to build that interface…

AngularJS app to show readings in near realtime

In the Github repo, you’ll find the code for an AngularJS app which shows the readings for the sensors in my network. Now it has to be said that the app has not been written to be generic, and if you decide to fork the repo to build your own, I suspect you’ll have to do a fair bit of ‘hacking’ to get it to work.

The app was an opportunity for me to play with the following tools, and what you see here was built in a weekend — which just goes to show how useful these tools are.

  • Yeoman – for building the initial angular and firebase app skeleton.
  • Grunt – for automating some of the build and preview.
  • Bower – for managing Javascript dependencies
  • AngularJS – for realtime binding of variables in the HTML to variables which are fed directly from Firebase data.
  • angularFire – for the abstraction of the code needed to actually talk to the Firebase database.
  • Bootstrap 3 – for responsive presentational elements to make it work on mobile and desktop.

I don’t pretend that this code is pretty – and there are no proper tests, but it works and it was fun to build!


Finally, apologies to all those whom I have bored with recounting the current temperature in the green house!



Apr 14

Analyse RIS files

Reference managers like Endnote, Refworks or Zotero often allow you to export your bibliographic citations as a RIS file. You can import these into things like Talis Aspire Reading Lists.

The script below will look in the current directory for RIS files and analyse their contents. We are looking to see what types they have and how many of them have some sort of identifier that can be used to find better bibliographic data from some other source.


while IFS= read -r -d '' file; do
	echo -n "#=== "
	printf '%q\n' "$file"
	egrep "^TY" "$file" | sort | uniq -c
	typeCount=$(egrep "^TY" "$file" | wc -l)
	snCount=$(egrep "^SN" "$file" | wc -l)
	# Skip files with no TY lines to avoid dividing by zero
	[ "$typeCount" -eq 0 ] && continue
	echo $(($snCount*100/$typeCount))"% of records have an SN ("$snCount" of "$typeCount")"
done < <(find . \( -name "*.ris" -o -name "*.txt" \) -print0 )

Sample output:

#=== ./PMUP00DNMod3.txt
  17 TY  - CHAP
   4 TY  - JOUR
80% of records have an SN ( 17 of  21)

#=== ./PMUP00DNMod4.txt
  11 TY  - CHAP
  10 TY  - JOUR
95% of records have an SN ( 20 of  21)

Mar 14

RPi sensor network

I’ve been wanting to do some Raspberry Pi tinkering for some time. Having a little computer on hand to handle the logic processing and interfacing with the outside online world, while also having input/output pins directly controllable by code running on the pi is just too tempting.

A little while later, after following an Adafruit guide to making an LED-based new-email notifier, I was hooked…


The Gmail notifier – a simple true/false statement turns a pin voltage high or low depending on the presence of new mail in your inbox.


I am no electronics guru – very much in the category of hobbyist who can comfortably fill header sockets with too much solder without realising it! (yes that’s a bad thing!) Therefore I have been looking for something that combined ease of use with great versatility in order to start exploring how I could use sensors to begin on that hobbyist’s delight – home automation.

So what project would make a good first project?

The hub of the network, a slice of pi and an XRF radio module.

In our new house we changed the boiler, and so I wanted to track temperature in at least one room to see when the temperature was comfortable and to allow us to re-programme the thermostat to be as efficient as we could get it.

The other thing I wanted to track was the green house temperature – although I must get some new glass to replace the missing panes, as it isn’t going to be that useful with a gale blowing through it.

Two immediate ideas which involved temperature measurement and tracking. Sounds like a good basis for a project.

I first looked at the 1-Wire network Dallas temperature sensors, but as these predominantly needed a wire to make the network, and I didn’t want to run wire out and down the garden, I dismissed them. Though I did find various ideas for making them wireless, this didn’t seem the simple solution I was looking for.


One of the sensors. You can see the thermistor and aerial.

I then stumbled on this blog post which pointed me in the direction of small radio modules with a potential range of hundreds of meters, combined with a very simple text based serial port message system that would make getting the readings super easy. Not only that, but I could theoretically have around 676 devices in the same sensor network, and some of them could also do things like actuate switches… needless to say this sounded terribly promising!

For the initial setup I decided to do exactly as described in the blog post above.

Parts ordered and delivered: within one evening and a Saturday morning I had a network set up and sending temperature data on a periodic basis.

Next step is to capture the data somewhere (possibly using Firebase) and render the results as a chart. For that you’ll have to see the next post (when I have written it!).

Dec 12

How to get the most from any support desk

As I used to be a support analyst, and as I still work in a customer focused team, I get to see a lot of support tickets and how they are handled. This post summarises some lessons from over 10 years working in customer facing positions. Do these things, and you’ll have the support analyst on your side.

  • Be polite
    Too often people on support desks have to put up with people who are rude and impatient.  It is too easy to take frustrations out on the person at the other end of the support line. You won’t win any favours by being rude.
  • Be patient
    Every new ticket from every customer is important to the customer who raised it. It is also likely to be in a queue, and if there is a problem that is affecting several people the queue can sometimes be large. Smart support analysts will spot patterns in the tickets coming in, and can alert systems teams to deal with potential issues. Systems teams will then need to take 10 or more minutes to investigate thoroughly, so patience is a useful attitude.
  • Raise a ticket for one issue at a time
    If you raise a ticket which rambles on about umpteen issues, you will confuse yourself and the support analyst as you won’t know which issue you are being asked questions about.
  • Don’t blame the computer system for your inadequacies
    Systems are fallible, but so are humans. Over the years I have seen several examples of customers who feel such anger toward the system, based on the feeling that they are failing because the system isn’t helping them. They then lash out at every opportunity to say the system is unworkable and doesn’t do what they were told it would. Usually however, it is not because the system is not working. There is often clear evidence that other users of the very same system are being successful due to using the system as a tool, and not expecting it to replace the strategy and planning needed to make it work for them.
  • Be helpful
    Give the support analyst as much information as is relevant to the ticket, but don’t be offended if the analyst asks for something else.
  • Demonstrate that you have used the knowledge base
    Sometimes you need to get beyond the “have you looked at the article in the knowledge base” response from the support desk. Support analysts tend to assume that you didn’t bother to look, so show them that you tried and that you didn’t find anything that helped. Also tell them when you have followed some steps to fix an issue and it didn’t work. This will all help the analyst get to a resolution more quickly.
  • Say thank you
    Tell the support analyst working on your ticket that you appreciate the time they have taken out of their day to help you.

I could probably add to these, but they are the main things to get right. So go on – make a support analyst’s day, and tell them they’ve done a great job!