A Look at the Code – Xan

As my blog posts near their end, I thought that it might be nice to take some time to explain a bit about how the hand electronics and code work. This post should give the reader a fair bit of information about what the code actually does, and may briefly touch on how the electronics work with the code. I want this post to be pretty thorough, so if you have any questions or want anything clarified by the end, feel free to ask in the comments!

I’ve written a lot of code for this project, all of which is available in my GitHub repository (essentially a public Google Doc for code). The best way to start will probably be by looking at what certain programs in the repository do, and then at how they work with the hand. Don’t worry; I won’t be going over every little facet of every single program – that’s what code comments are for. In this post, I’ll only be focusing on the two main programs in use right now, starting with the basics of what they do and then moving up in complexity.

Our two main programs are:

multiEMGServoTest.ino (found in the “Utilities” folder of the repository) and handDebugProcessing.pde (found in the “Processing” folder of the repository).

Note the different file extensions. “.ino” is the extension on every piece of code in the repository except for handDebugProcessing.pde, which is written in a different language – Java, as opposed to C++. Part of the reason I’m focusing only on these two programs is that together they give a good overview of everything else: handDebugProcessing is unique, and multiEMGServoTest is a nice mixture of everything else in the .ino files.

multiEMGServoTest (Henceforth referred to as MEST) and handDebugProcessing (Henceforth referred to as HDP) are not just written in different languages. As you might have guessed, they have entirely different purposes. MEST is code that runs on the hand itself and controls the hand physically, while HDP is code that runs on a computer to provide the programmer (me, in this case) with valuable information on how to make MEST better. MEST, written in C++, is loaded onto the Arduino, a tiny computer on the hand. It can read the EMG sensors and control the hand motors. HDP simply graphs and records the data that is sent to it by MEST.

The EMG sensors on their own can’t do very much. MEST is like a translator – it translates the data that the sensors gather into useful information that allows for stuff to happen – like opening and closing the hand. So MEST converts the data from the sensors into something that it can use and HDP can understand, and HDP converts the data from MEST into something that we humans can easily understand. (In this case, a colored graph).

In this way, HDP is the more abstract program, telling you what the hand is doing but not actually influencing or controlling the motors and other hardware. MEST is more “hands on” (get it?), because it communicates with the hand’s hardware directly.

There are three EMG sensors attached to the Arduino, and every 50 milliseconds, MEST will send a three-line message to HDP over a USB cable, each line containing a letter followed by a number. Here are some examples of real messages sent from MEST to HDP. (The bracketed text is not actually sent by MEST, and is only there to separate the messages.)

[Screenshot: several three-line messages such as A20 / B0 / C61, separated by bracketed labels]

If MEST sends the data “A20”, it means that sensor A (Out of sensors A, B, and C) is currently reading a value of 20. If MEST sends out B0, it means that sensor B is reading a value of 0. If it sends C61, it means that sensor C is reading a value of 61. It’s really just conveying the status of each sensor in every message. Higher numbers mean that the sensor is being activated more intensely, while low numbers mean that it isn’t being activated at all.
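To make the message format concrete, here’s a sketch of the parsing step. (HDP actually does this in Java; I’m writing it in C++ here for consistency with the Arduino code, and the names are my own, not HDP’s.)

```cpp
#include <string>

// One MEST message line is a sensor letter followed by a reading,
// e.g. "A20" or "C61". Hypothetical helper, not HDP's actual code.
struct Reading {
    char sensor;  // 'A', 'B', or 'C'
    int value;    // 0..1023
};

Reading parseMessage(const std::string& line) {
    Reading r;
    r.sensor = line[0];                    // first character: sensor ID
    r.value  = std::stoi(line.substr(1));  // rest of the line: the reading
    return r;
}
```

So parseMessage("A20") would yield sensor 'A' with a value of 20.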

HDP reads these messages and graphs them. Based on the letter before each number, HDP draws a colored dot somewhere on the screen of my computer – the color representing the sensor ID and the vertical position of the dot representing its intensity. As with any graph, the further up the dot is, the stronger the signal. With a tiny bit of math, HDP can connect the dots and produce a pretty graph, as opposed to a collection of hard-to-process numbers like the ones seen earlier:
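The “tiny bit of math” that places each dot could look something like this. The function name and the fixed graph height are my own invention, not HDP’s actual code – it just shows the idea of mapping a reading to a vertical position, with stronger signals drawn higher up:

```cpp
// Map a raw reading (0..1023) to a y coordinate on a graph that is
// graphHeight pixels tall. Screen coordinates grow downward, so a
// reading of 0 lands at the bottom and 1023 at the top.
int valueToY(int value, int graphHeight) {
    return graphHeight - (value * graphHeight) / 1023;
}
```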

Screen Shot 2015-05-13 at 2.52.46 PM

HDP really is a useful program to have around, especially when you are dealing with tens of thousands of data points in any given session.

Conveying sensor data to HDP isn’t all MEST does. While MEST reads the sensors and conveys their respective values at any given moment to HDP, it’s also checking to see if a sensor value is higher than a certain number, that number being the threshold. If the threshold for sensor A is 150, and MEST detects that sensor A is reading a value of 178 (Sent to HDP as A178), it will do something – opening/closing the hand, for example.
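In its simplest form, that check is just a comparison per sensor. The 150 threshold matches the example above; everything else in this sketch is illustrative rather than MEST’s actual code:

```cpp
// One threshold per sensor (index 0 = A, 1 = B, 2 = C).
const int SENSOR_THRESHOLDS[3] = {150, 150, 150};

// True when the latest reading is over that sensor's threshold.
bool thresholdReached(int sensorIndex, int value) {
    return value > SENSOR_THRESHOLDS[sensorIndex];
}
```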

…Which is the third and final thing that MEST does – it communicates with the motors on the hand to tell them what to do. Each motor on the hand has an ID from 0 to 4, 0 being the pinky and 4 being the thumb, and 1, 2, and 3 being the fingers in between. If a threshold is activated, MEST can activate any combination of those five motors to make the hand go into a predefined position – say a peace sign, or a closed fist.
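One way such predefined positions could be stored is as five open/closed flags, one per motor ID. This is an illustrative guess at the data layout, not MEST’s actual tables (the real code drives servos on the Arduino):

```cpp
#include <array>

// Motor IDs run 0..4: 0 = pinky, 1-3 = the fingers in between,
// 4 = thumb. true = finger closed, false = finger open.
using HandPosition = std::array<bool, 5>;

const HandPosition FIST  = { true, true, true, true, true };
// Peace sign: middle (2) and index (3) extended, everything else closed.
const HandPosition PEACE = { true, true, false, false, true };

// Count how many motors a position asks to close.
int closedCount(const HandPosition& p) {
    int n = 0;
    for (bool closed : p) if (closed) ++n;
    return n;
}
```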

So that’s how the hand works: the Arduino runs MEST, which reads the sensors and controls the hand, while HDP graphs the sensor data detected by MEST.

If electromyography weren’t so tricky, I’d stop here. Unfortunately for us, it is tricky – so there’s one more piece to this puzzle. Read on if you want, but by all means, feel free to stop.

Deciding whether to open or close the hand is a surprisingly difficult problem to tackle. EMGs are finicky – abruptly moving the electrode cables or accidentally flexing a muscle may cause the sensor to reach the threshold value even when activation is not intended. Figuring out whether an activation is intentional or not (EMG filtering) is a big part of MEST’s job of checking the sensors.

There are two types of accidental activations that we must defend against: large, instantaneous spikes (which usually occur when a cable is hit accidentally), and muscle contractions that are probably accidental (short, low-power contractions).

At the moment, there are two different ways of defending against those types of accidental activations. Take a look at these values for a spike happening on sensor C (from the cable being smacked). Note that 1023 is the maximum value that any sensor can read.

[Screenshot: sensor C readings jumping from 0 straight to 1023, then falling back toward 0, all within two tenths of a second]

The sensor reads 0 (the lowest possible value), then immediately after, it reads the maximum value, and then quickly approaches zero. All of this happens in 2/10 of a second. Compare it to this example of a normal activation:

[Screenshot: sensor C readings climbing gradually to a peak of 134, then declining]

It happens much more slowly and smoothly. Rather than rocketing straight up, the numbers gradually approach a peak of 134, and then proceed to decline quickly. They’re almost opposites – a spike is a quick jump followed by a quick descent, while a deliberate activation is a slow climb followed by a quick descent. We don’t want spikes to happen, because they’re unintentional – can you imagine your hand suddenly closing if someone bumped into you?

We can build a rudimentary defense against the quick spikes by adding an upper threshold of something like 512. That way, when the sensor sends a value, it has to be between the lower threshold and the upper threshold, or else it is ignored. Consider this example of a spike:

[Screenshot: spike readings on sensor C: 0, 1023, 461, 34]

The first two values would be ignored: 0 is under the lower threshold, and 1023 is above the upper one. So the first and last parts of the spike are ignored. But what about the 461 and the 34?

34 is low enough to fall below the lower threshold, so it would be ignored too. 461, however, is between 150 and 512 – so it would trip the threshold and cause the hand to do something. It’s part of a spike, though, so we don’t want it to activate anything. Another method of filtering the data is required.
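The two-threshold check described above boils down to a band test. The 150 and 512 values come straight from the examples in this post; the function name is my own:

```cpp
// A reading only counts if it lands between the lower threshold and
// the upper threshold; anything outside the band is ignored as noise.
const int LOWER_THRESHOLD = 150;
const int UPPER_THRESHOLD = 512;

bool inActivationBand(int value) {
    return value > LOWER_THRESHOLD && value < UPPER_THRESHOLD;
}
```

Run against the spike above, this correctly rejects 0, 1023, and 34 – but still lets 461 through, which is exactly the gap the next filter has to close.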

That data filtering method is averaging. When MEST gets a sensor value between the two thresholds, it doesn’t do anything right away – instead, it starts averaging sensor values together to make sure that the activation is probably intentional. Let’s pretend that the threshold of sensor C is 100. The averaging process starts when the threshold is reached. Take a look:


[Screenshot: sensor C readings 122, 113, 122, 118, 106]

When C reads a value of 122, the threshold is reached, but the program doesn’t activate any motors yet – instead, it starts to average the values together. MEST takes the next four sensor values after 122, adds all five up, and divides by five, finding the average: (122 + 113 + 122 + 118 + 106) ÷ 5 = 116.2. Since 116.2 is greater than 100, an action takes place. (Keep in mind that we’re pretending the threshold is 100 here, not 150.)
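The averaging step from that worked example can be sketched as a single function. This is a sketch of the idea, not MEST’s actual implementation:

```cpp
#include <vector>

// The reading that crossed the threshold plus the next four are
// averaged, and motors only move if the average is still over the
// threshold.
bool confirmActivation(const std::vector<int>& readings, int threshold) {
    int sum = 0;
    for (int v : readings) sum += v;
    double average = static_cast<double>(sum) / readings.size();
    return average > threshold;
}
```

With the values above, confirmActivation({122, 113, 122, 118, 106}, 100) averages to 116.2 and confirms the activation; a single 122 followed by near-zero readings would not.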

Basically, averaging makes the computer wait to open the hand, to be sure that the user is intentionally trying to open it. The sensor has to sense a value bigger than the threshold over a certain amount of time, and cannot just activate something quickly from one unexpectedly large value.

So, that’s where the hand is now.

One final thing that I’d like to mention (an idea that I’ve been toying with, but haven’t had the time to implement) is filtering using the percent change in sensor values. Spikes shoot up quickly, and the difference between the value just before a spike and the value when it starts is huge (immediately going from 0 to 1023). If we were to put a cap on the maximum percent change that’s allowed, we could filter out any pesky spikes that still manage to get through. Just an idea. I might write something about it later if I pursue it.
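Since this filter is only an idea so far, here’s a purely speculative sketch of how it might look. The 200% cap and the zero-baseline guard are made-up numbers, not anything from the actual code:

```cpp
#include <cstdlib>

// Reject a reading that jumped too far, proportionally, from the
// previous one.
bool plausibleChange(int previous, int current, double maxPercentChange) {
    // Arbitrary guard: a jump from 0 can't be judged as a percentage.
    if (previous == 0) return current < 100;
    double change = std::abs(current - previous) * 100.0 / previous;
    return change <= maxPercentChange;
}
```

Under these made-up numbers, a 0-to-1023 jump would be rejected outright, while a gradual climb like 100 to 120 would pass.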

That’s really it! Please feel free to look through the code if you are so inclined. Occasionally I will go through and add loads of comments, so even if you don’t know the programming language you can still follow along in the process – the code is really just a recipe.
