Sunday, 11 March 2012

CPSC 543 Project - Second Iteration

Authors: Aidin Mirsaeidi & Huan Li

Background & motivation:

At the moment, a cyclist's best option for navigational aid is 'audio navigation', delivered through a GPS-equipped smartphone. This can be an overwhelming experience, since the cyclist's attention is divided between the road/environment and the audio instructions. Step-by-step directions, combined with the rider's inability to give them full attention, make matters worse, leading to frustration and poor use of GPS navigation. Visual navigation tends to be a less intrusive alternative for drivers, but it is not a real option for a cyclist, since it conflicts with the riding experience.

Instead, we are trying to establish an unused, independent sensory channel between our envisioned system and the cyclist to provide navigational aid. We realize this through haptic navigation via vibrotactile feedback, which should yield low cognitive load and a seamless mapping between haptic input and direction.


The main question we want to answer is the following:

What are the most appropriate haptic signals/cues to establish a meaningful communication channel?

Below is a block diagram of our currently envisioned system:

Basically, we receive GPS data from an Android phone and send the relevant data to an on-board microcontroller over a Bluetooth channel. The microcontroller renders those data and decides when, and in what fashion, to drive the vibrotactile actuators mounted on the bicycle's handlebar.

To-do list:

Analyze GPS data and see what we can do with it
Assess the benefits of a visual compass that helps the user interpret the haptic feedback and acts as a visual aid
And most importantly, center our work on haptic cue design, in the sense that we want to establish an accurate mapping between haptic cues and navigational guidance. There is some related literature on this, which we are currently reviewing.