This month, as I was thinking about the Aus GLAM Blog Club theme of silence, I played Tacoma, the new game from Fullbright, the studio behind 'Gone Home'. Apart from being a great game that manages to be both fun and inclusive, it uses a very interesting mechanic that you could almost skip over: ASL. American Sign Language (ASL) is used almost as an aside when your character wants to interact with the space station's computer interface, inputting passwords or running commands like 'start fix' to trigger the game's core AR mechanic. I didn't understand how big a deal this was until I watched this playthrough by Crow_Se7en:

No, you have something in your eye. DON'T LOOK AT ME!

Now, the more I thought about it, the more I realised that ASL makes sense in space because you're usually wearing thick, heavy gloves. Touch screens wouldn't work, and physical keyboards would be difficult to use (as would doing a Japanese tea ceremony). Voice commands would be an obvious choice, and the game addresses this with the crew's interactions with the station AI, Odin. However, when you want to input something sensitive, do you really want to yell out your password to the whole crew, not to mention calling out your space-Google search history? This is where sign language can come in (excuse the pun) handy.

Those familiar with the Expanse series will know that the Belters' language includes a large number of physical hand signals. These developed when the asteroids were first colonised and space suits were worn extensively: hand signals were used to emphasise and to communicate non-verbally, as it is hard to read facial expressions through tinted glass. You can see how this carried over in the TV series; watch the Belters as they talk, and many use large gestures to add emotion or emphasis:

That's cool, you might be thinking, but why are you telling me this? Well, dear reader, I'm telling you all this because I'm trying to force myself to actually do something rather than come up with an idea and instantly forget about it. I have access to a Raspberry Pi and a Leap Motion controller, and I thought I'd try my hand at writing a small program that takes sign language and turns it into keyboard input.
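To make the idea concrete, here's a minimal sketch of the "keyboard" half of the program. It assumes some upstream recogniser (say, one reading Leap Motion frames) has already classified each gesture into a label like "A" or "SPACE"; that recognition step is the hard part and isn't shown here. The `SIGN_TO_KEY` table and `signs_to_text` function are hypothetical names of my own, not from any existing library:

```python
# Hypothetical sketch: turn a stream of recognised fingerspelling
# labels into text that could be fed to a virtual keyboard.
# The recogniser that produces the labels is assumed, not implemented.

# Fingerspelled letters map directly to characters, plus a couple
# of control signs for the keyboard side of things.
SIGN_TO_KEY = {letter: letter.lower() for letter in "ABCDEFGHIJKLMNOPQRSTUVWXYZ"}
SIGN_TO_KEY["SPACE"] = " "
SIGN_TO_KEY["ENTER"] = "\n"


def signs_to_text(sign_labels):
    """Convert recognised sign labels into text, skipping unknown signs."""
    return "".join(SIGN_TO_KEY[s] for s in sign_labels if s in SIGN_TO_KEY)


# For example, fingerspelling Tacoma's 'start fix' command:
command = signs_to_text(["S", "T", "A", "R", "T", "SPACE", "F", "I", "X"])
print(command)  # start fix
```

On the Pi, the output string would then be handed to something that injects keystrokes, but that plumbing (and the gesture recognition itself) is exactly the part the project still has to solve.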

It turns out there are already several projects like this out there; however, all the ones I could find were focused on the reverse, turning signing into speech. As with most disabilities, tech devs try to overcome them rather than embrace them. So I'm writing this as a challenge to myself, lest the idea go the way of my submersible UAV, which is sitting in pieces in my shed. Plus, I've been looking for a project to kick-start my coding, and it gives me an excuse to learn sign language too! I fear the problem will be that all the programs I've found are written for American Sign Language rather than AUSLAN, but that will be the next challenge. I'll post any updates here if you're interested.