The Human Body as a Computer Interface


Interfaces enter our lives in the form of the various devices, analog or digital, with which we normally establish some kind of interaction. In this sense, interfaces are "tools" that extend our bodies, such as computers, cell phones, and elevators. The concept of an interface applies to any situation or process in which information is exchanged or transferred. One way of thinking about an interface is as "the area or place of interaction between two different systems, not necessarily a technological system." Traditional computer input devices leverage the dexterity of our limbs through physical transducers such as keys, buttons, and touch screens. While these controls make great use of our abilities in common scenarios, many everyday situations demand the use of our body for purposes other than manipulating an input device (Saponas, 2010, p. 8). Humans are intimately familiar with their own bodies. By nature, humans gesture with their body parts to express themselves or communicate ideas. Therefore, body parts naturally lend themselves to interface metaphors that can serve as interaction tools for computerized systems.
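To make the idea of body parts as interaction tools concrete, the sketch below shows the general software pattern such systems share: sensed body events are translated into application commands. It is a minimal, hypothetical illustration only; the gesture labels, function names, and the mapping to music-player actions are assumptions for the example, not the method of the cited authors.

# Minimal sketch (hypothetical names throughout): turn recognized body
# gestures, e.g. labels a forearm-EMG classifier might emit, into
# application commands such as music-player controls.

from typing import Callable, Dict, Iterable

GestureHandler = Callable[[], None]

def make_music_controls() -> Dict[str, GestureHandler]:
    """Map gesture labels to player actions (all labels are illustrative)."""
    return {
        "pinch_index": lambda: print("play/pause"),
        "pinch_middle": lambda: print("skip to next song"),
        "wrist_flex": lambda: print("volume up"),
        "wrist_extend": lambda: print("volume down"),
    }

def dispatch(gestures: Iterable[str], handlers: Dict[str, GestureHandler]) -> None:
    """Route each recognized gesture to its command, ignoring unknown labels."""
    for label in gestures:
        action = handlers.get(label)
        if action is not None:
            action()

if __name__ == "__main__":
    # Stand-in for the output stream of an EMG-based recognizer
    # (cf. Mastnik, 2008; Saponas, 2009).
    recognized = ["pinch_index", "wrist_flex", "unknown", "pinch_middle"]
    dispatch(recognized, make_music_controls())

Whatever the sensing technology, the design reduces to this same mapping layer, which is what makes the body usable as an input device even when the hands are busy or the eyes are occupied.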

For example, imagine rushing to class on a very cold morning while wearing gloves when, all of a sudden, you have to place a phone call to a classmate to remind him to print out a homework assignment; dialing even a simple call on a mobile phone’s interface in this situation can be difficult or even impossible. Similarly, when someone is jogging and listening to music on a music player, their arms are typically swinging freely and their eyes are focused on what is in front of them, making it awkward to reach for the controls to skip songs or change the volume. In these situati...


...ace. In Proceedings ACM CHI 2010.

Hui, M. (2010). Human Computer Interaction, A Portal to the Future. Microsoft Research.

Karen, J. (2008). Interaction Design for Public Spaces. In Proceedings ACM MM ’08, October 26–31, 2008, Vancouver, British Columbia, Canada.

Mastnik, S. (2008). EMG-based Hand Gesture Recognition for Realtime Biosignal Interfacing. In Proceedings ACM IUI ’08, 30–39.

Musilek, P. (2007). A Keystroke and Pointer Control Input Interface for Wearable Computers. In Proceedings IEEE PERCOM ’07.

Saponas, T. (2009). Enabling Always-available Input with Muscle-Computer Interfaces. In Proceedings ACM UIST ’09.

Saponas, T. (2010). Making Muscle-Computer Interfaces More Practical. In Proceedings ACM CHI 2010.

Saponas, T. (2009). Demonstrating the Feasibility of Using Forearm Electromyography for Muscle-Computer Interfaces. In Proceedings ACM CHI ’09.
