Wheelchair Design

I've talked a little bit about the hardware bits -- how we get the wheelchair to move at the computer's request. But what else is behind the wheelchair app? Let's take a look!

Parts & Pieces

ClickOnce Installer

ClickOnce is a technology that allows you to install a Windows desktop application as simply as clicking on a link in a web page. Your browser downloads a little file which Windows then knows how to use to download, run, and update the application.

Wheelchair UI & WebCam Viewer

The application & user interface for the wheelchair is written in Windows Presentation Foundation (WPF), a technology for making Windows desktop applications. This is an 'older' type of application that can't be distributed through the Windows Store included in Windows 10. Windows Store apps are 'sandboxed' to be safer and to keep them from harming your computer, but the downside is they are less capable of doing things like talking to eye gaze sensors or Arduinos. It's not impossible to talk to these from a Windows Store app, but it's a lot more complicated, so we use a Windows desktop app to keep things simple.

The wheelchair UI is very simple -- translucent directional movement buttons overlaying full-screen video from a forward-facing webcam. We use the background video to try to help the Surface 'disappear'.

We started with a simple webcam viewer, WpfCap. We're moving to a newer library, WPF-MediaKit.

Eye Gaze Interaction

We get simple data from the eye gaze sensor -- a stream of X/Y coordinates telling us where you are looking. We turn those into interactions, aka clicks, when the gaze pauses for a period of time.

To show that you're looking at an element (e.g. button, textbox, or tab), we use WPF's VisualStateManager to change its look after a short delay (e.g. 250 milliseconds) and then perform the activation, or click, after an additional longer delay (e.g. 250 to 500 milliseconds). For some buttons, such as 'show settings' or 'exit app', we set a longer activation delay to prevent accidental clicks.

VisualStateManager.GoToState(hitTarget, "Click", true);  
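
Putting those delays together, the dwell logic looks roughly like this. This is a sketch, not our exact code: HitTest, OnGazeSample, and the specific delays are illustrative, and ActivateElement stands for the AutomationPeer activation shown in the next snippet.

private Control _hitTarget;
private DateTime _dwellStart;
private bool _activated;

// Called for each incoming gaze sample on the UI thread.
private void OnGazeSample(double x, double y)
{
    var element = HitTest(x, y); // hypothetical: find the control under the gaze point

    if (element != _hitTarget)
    {
        // Gaze moved to a different element; restart the dwell clock.
        _hitTarget = element;
        _dwellStart = DateTime.UtcNow;
        _activated = false;
        return;
    }

    if (_hitTarget == null || _activated)
        return;

    var dwell = DateTime.UtcNow - _dwellStart;

    if (dwell >= TimeSpan.FromMilliseconds(500))
    {
        // Longer delay elapsed: perform the 'click'.
        _activated = true;
        ActivateElement(_hitTarget);
    }
    else if (dwell >= TimeSpan.FromMilliseconds(250))
    {
        // Short delay elapsed: show that the element has gaze focus.
        VisualStateManager.GoToState(_hitTarget, "Click", true);
    }
}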

We activate elements using WPF's AutomationPeers; the core of that routine looks like:

private void ActivateElement(FrameworkElement hitTarget)
{
    // Buttons are 'clicked' through the UI Automation Invoke pattern.
    var button = hitTarget as Button;
    if (button != null)
    {
        var peer = new ButtonAutomationPeer(button);
        var provider = (IInvokeProvider)peer.GetPattern(PatternInterface.Invoke);
        provider.Invoke();
        return;
    }

    // ToggleButtons (checkboxes, etc.) are flipped through the Toggle pattern.
    var toggleButton = hitTarget as ToggleButton;
    if (toggleButton != null)
    {
        var peer = new ToggleButtonAutomationPeer(toggleButton);
        var provider = (IToggleProvider)peer.GetPattern(PatternInterface.Toggle);
        provider.Toggle();
        return;
    }

    // TextBoxes simply take keyboard focus.
    var textbox = hitTarget as TextBox;
    if (textbox != null)
    {
        textbox.Focus();
        return;
    }

    // TabItems are selected directly.
    var tabItem = hitTarget as TabItem;
    if (tabItem != null)
    {
        tabItem.IsSelected = true;
        return;
    }
}

For safety, it's important to know when the wheelchair driver is no longer looking at the screen or has closed her eyes. We watch the incoming data for idle timeouts (e.g. no data coming in because no eyes are detected) or for coordinates that fall outside the window bounds -- in either case we fire an EyesOff event and stop the movement of the wheelchair.
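
A rough sketch of those checks, assuming a periodic timer drives the idle test; the 500 millisecond timeout, window-size fields, and helper names here are illustrative:

// Fired when no eyes are detected or the gaze leaves the window.
public event EventHandler EyesOff;

private DateTime _lastGazeData = DateTime.UtcNow;

private void OnRawGazeData(double x, double y)
{
    _lastGazeData = DateTime.UtcNow;

    // Coordinates outside the window bounds count as eyes off.
    if (x < 0 || y < 0 || x > _windowWidth || y > _windowHeight)
        RaiseEyesOff();
}

// Called periodically, e.g. from a DispatcherTimer.
private void CheckIdleTimeout()
{
    // No incoming data for a while means no eyes are detected.
    if (DateTime.UtcNow - _lastGazeData > TimeSpan.FromMilliseconds(500))
        RaiseEyesOff();
}

private void RaiseEyesOff()
{
    StopWheelchair(); // hypothetical: lowers all drive pins
    EyesOff?.Invoke(this, EventArgs.Empty);
}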

Sensor Abstraction

We want our system to work with multiple different types of sensors. This allows us to compare technologies and buy sensors from multiple vendors so we can get the best price/performance. The market for eye gaze sensors has been disrupted over the past few years so prices have dropped from ~$8000 to ~$150 and we'd like to be able to take advantage of the latest & best.

We abstract away the eye gaze sensors behind a common interface which looks like:

public interface IGazeDataProvider
{
    event EventHandler<GazeEventArgs> GazeEvent;

    bool Initialize();
    void Terminate();
    void LaunchRecalibration();
}
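
Consuming a provider then looks roughly like this (TobiiEyeXGazeDataProvider and the X/Y members on GazeEventArgs are assumed names, not necessarily our exact ones):

// The concrete class name here is assumed.
IGazeDataProvider provider = new TobiiEyeXGazeDataProvider();

if (provider.Initialize())
{
    provider.GazeEvent += (sender, e) =>
    {
        // Device-agnostic coordinates feed the filtering, dwell,
        // and EyesOff logic sketched above.
        OnRawGazeData(e.X, e.Y);
    };
}

Swapping sensors then means swapping the concrete provider without touching the rest of the app.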

Filters

We have implemented multiple different filters to test different algorithms for dealing with the high noise coming out of sensors like the Tobii EyeX. The filter we use most commonly is called the GainFilter. This is a simple formula that looks like:

// Distance between the current filtered point and the new measurement.
double distance = Math.Sqrt(((_filteredX - measuredX) * (_filteredX - measuredX)) +
                            ((_filteredY - measuredY) * (_filteredY - measuredY)));

if (distance > SaccadeDistance)
{
    // Large jump: treat it as a saccade and snap to the new measurement.
    _filteredX = measuredX;
    _filteredY = measuredY;
}
else
{
    // Small movement: move a fraction (Gain) of the way toward the measurement.
    _filteredX = _filteredX + (Gain * (measuredX - _filteredX));
    _filteredY = _filteredY + (Gain * (measuredY - _filteredY));
}

SaccadeDistance is used to disengage the filtering when a rapid eye movement (or saccade) occurs.

The SaccadeDistance should be set to around 'one half of the average movement distance between targets' so that when a person moves her gaze more than halfway from one target to another, the filter 'releases' the gaze and allows it to travel quickly. In our testing, we generally find a value of 0.07 to be a good initial SaccadeDistance for our applications.

The Gain should be set based on the expected noise in the system. In our testing with a Tobii EyeX in indoor settings with low external IR interference, the Gain should be set between 0.04 for a precise user and 0.07 for an imprecise user. As the noise increases, either due to poor eye modeling conditions (thick glasses, dirty glasses, bifocals, older eyes, astigmatism, etc.) or external interference (e.g. halogen lights or sunlight), the Gain should be increased.

Additional filters we have implemented are AveragingFilter, StampeFilter, and SimpleKalmanFilter, but we most commonly use the GainFilter, which works well for us.
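
Plugging the GainFilter into the gaze pipeline looks roughly like this, continuing the provider sketch above (Update, FilteredX, and FilteredY are assumed member names wrapping the formula shown earlier):

var filter = new GainFilter { Gain = 0.05, SaccadeDistance = 0.07 };

provider.GazeEvent += (sender, e) =>
{
    // Smooth the raw coordinates before hit testing and dwell detection.
    filter.Update(e.X, e.Y);
    OnGazeSample(filter.FilteredX, filter.FilteredY);
};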

Tobii EyeX SDK

We use the Tobii EyeX SDK, but in order to support multiple different sensor types equally, we primarily use GazePointDataStream and translate it into our own device-agnostic GazeEvent data format.
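
In rough terms, that translation looks like the following (based on the EyeXFramework sample library that ships with the EyeX SDK; treat the exact names as approximate, and RaiseGazeEvent as a hypothetical helper):

var host = new EyeXHost();
host.Start();

// Subscribe to the lightly filtered gaze point stream.
var stream = host.CreateGazePointDataStream(GazePointDataMode.LightlyFiltered);
stream.Next += (sender, e) =>
{
    // Translate the EyeX-specific event into our device-agnostic format.
    RaiseGazeEvent(new GazeEventArgs(e.X, e.Y)); // hypothetical raise helper
};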

We do not leverage the WPF integration libraries that Tobii provides as part of the EyeX SDK as this would tightly couple our system to just the Tobii EyeX.

Chairduino Client

The Chairduino communicates with the app via a protocol called Reflecta, discussed below. Reflecta is basically a remote procedure call (or RPC) mechanism for microcontrollers like the Arduino.

The Chairduino client is essentially just ReflectaClient plus a wheelchair-specific interface named 'whel1'. 'whel1' defines a remote function called ListPins which tells the wheelchair app which digital output pin refers to which drive command on the TRACE port.

The 'whel1' interface looks like:

public class whel1 : ReflectaInterface
{
    private byte _interfaceOffset;
    private ReflectaClient _client;

    // Function ids are offsets from the interface's base offset.
    private enum functionOffset : byte
    {
        ListPins = 0x00
    }

    public void Initialize(ReflectaClient client, byte interfaceOffset)
    {
        _client = client;
        _interfaceOffset = interfaceOffset;
    }

    // Asks the Chairduino which digital output pin maps to which drive command.
    public async Task<byte[]> listPins()
    {
        var functionId = (byte)(_interfaceOffset + functionOffset.ListPins);
        var value = await _client.SendFrameAndAwaitResponse(new byte[] { functionId });

        return value;
    }
}

and listPins maps to control pins like:

pins = await _whel1.listPins();  
_forwardPin = pins[0];  
_reversePin = pins[1];  
_leftPin = pins[2];  
_rightPin = pins[3];  
_fifthSwitchPin = pins[4];  

Once these pins are mapped, driving the wheelchair works much as documented in my previous article, which used the same logic but a different Arduino remote-control library called Firmata.
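
Conceptually, driving then reduces to raising and lowering those pins. Here's an illustrative sketch where SetPin is a hypothetical helper that sends a Reflecta frame asking the Chairduino to set a digital output:

private async Task DriveForward()
{
    // Make sure the opposing direction is off before engaging.
    await SetPin(_reversePin, false);
    await SetPin(_forwardPin, true);
}

private async Task Stop()
{
    // Lower every drive pin to halt all motion.
    await SetPin(_forwardPin, false);
    await SetPin(_reversePin, false);
    await SetPin(_leftPin, false);
    await SetPin(_rightPin, false);
}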

Reflecta

We switched from Firmata to Reflecta to enable us to add additional safety intelligence to the Chairduino, such as keepalive packets, auto-stop if the Chairduino loses connection to the wheelchair app, and a watchdog timer to detect firmware crashes in the Chairduino itself.

Both Firmata and Reflecta offer a simple RPC mechanism for setting output pins on an Arduino, but Firmata is difficult to integrate with additional microcontroller-side logic. Reflecta has been structured to make it much easier to integrate additional functionality with the RPC calls.
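
For example, the app side of the keepalive can be as simple as a timer that sends a small frame several times a second (SendFrame, KeepAliveFunctionId, and the 100 millisecond interval are all assumed names and values here):

// If these frames stop arriving, the firmware stops the wheelchair.
var keepAliveTimer = new System.Timers.Timer(100);
keepAliveTimer.Elapsed += (sender, e) =>
    _client.SendFrame(new byte[] { KeepAliveFunctionId }); // hypothetical id
keepAliveTimer.Start();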

Chairduino Firmware

The Chairduino Firmware supports a few key functions:

  • Setting status LEDs that confirm communications between the Chairduino and the wheelchair, the Chairduino and the wheelchair app, and the heartbeat of the firmware itself (not hung).

  • A watchdog timer that resets the Chairduino if it crashes or hangs and a triple-blink on startup so we can visually detect a crash/restart.

  • A keepalive process that detects when the wheelchair app stops talking to the Chairduino for whatever reason (app crash, Surface reboot, loss of power, etc.) and stops wheelchair motion.

  • Implementation of the ListPins call that tells the wheelchair app which pin to use to drive forward, turn right, etc.

  • A heartbeat packet that sends data from the Chairduino to the wheelchair app such as 'is the wheelchair powered on and connected to the Chairduino' to help diagnose cable/installation problems.

Summary

Hopefully this gives you a detailed overview of the wheelchair software architecture, the different pieces involved, and how they all work together.

If you have further questions, please feel free to email me at jay@hikinghomeschoolers.org.

-- jcb