Tangible User Interface
This project provides the user with a tangible interface for gesture-based interaction across a variety of scenarios. The physical object is a cube offering six distinct faces of interaction: the user provides input through the cube's orientation and receives feedback through embedded bi-colour LEDs. The project was undertaken over a summer vacation and was intended to expand my knowledge and understanding of wireless communication methods and the use of microcontrollers. The aim was a tangible interface that would become a ubiquitous part of general computer use, allowing common tasks to be associated with natural gestural equivalents. The project was completed using the Picaxe microcontroller and has since been partially rebuilt on the Arduino platform in order to expand it and allow task-specific visual feedback.
The initial purpose of the device was to provide shortcuts to common activities, such as locking the machine, so that a single gesture could replace navigating a hierarchy of menu options. The hope was that by associating gestures with previously overcomplicated interface options, the device would be simpler to use and would, in turn, come to be relied upon ubiquitously within daily tasks. Creating this tactile interface also pushed my own skills, providing a suitable project for working with a variety of new techniques, from designing with CAD packages to etching PCB circuitry.
The device is split into two separate units: the physical cube, which takes user input, and the charging base station, which acts as the wireless receiver. The cube is formed from laser-cut acrylic and contains a custom PCB that drives the 12 LEDs positioned over the surface of the device. At the centre of the PCB, a Picaxe microcontroller takes analogue input from an accelerometer, establishing the context of use by interpreting the gestures being performed. The controller acts on this data by adjusting each LED's colour (either red or blue) and its intensity via pulse-width modulation. The orientation data is then fed to an AM RF transmitter and received by the device's base station, where the computer reads it as serial data for use within a variety of applications. The current implementation allows each face of the cube to be customised for a specific action, such as launching a web browser or adjusting the volume of the machine. Each top-level selection also adjusts the context of subsequent actions: once a web browser has been selected, turning the cube onto one face adds the current URL to your favourites, while turning it onto another navigates the browser back to the previous page. The purpose of these simple commands is not to replace traditional interaction but to provide a quick and effective way to perform common tasks through physical counterparts.
The current implementation allows only one-way communication from the device to the base station, where the data is interpreted as a serial stream by the machine. Only the computer can therefore react to the interaction, with the cube interface giving predefined feedback based purely on its orientation. The project is currently being re-implemented on the Arduino development platform, using a transceiver unit so that orientation data can be sent and processed and appropriate data returned for the associated LEDs. This will ensure that gesture-related feedback is provided dependent on the application in use and other external factors. Further development of the input techniques will add touch-sensitive elements to the faces of the cube, allowing how the device is gripped and held to affect the behaviour of the interface. For example, gripping the cube on four surfaces or cupping it with two hands could be mapped to closing applications, and increasing the pressure could close all applications and shut the system down.