Human Computer Interaction - Introduction, Novel Body-Gesture Driven Interface for Visual Art Creation


Hau San Wong and Horace H. S. Ip
Centre for Innovative Applications of Internet and Multimedia Technologies (AIMtech Centre)
Image Computing Group, Department of Computer Science
City University of Hong Kong, Hong Kong

Introduction

Recent advances in human-computer interaction techniques have significantly enhanced the user experience in different domains. Rather than being restricted to the keyboard and mouse, which require the learning of unnatural actions such as typing and clicking and are particularly awkward to use in a 3D interactive environment, current HCI techniques make use of our natural capabilities such as speaking, gesturing, head turning and walking to facilitate the user-machine communication process. For example, speech recognition, which directly transforms spoken commands or dialogs into their textual forms, minimizes the amount of learning required of novice users and also allows us to bypass the restriction of operating a miniature keyboard on mobile devices. Speech input is also invaluable to users with different levels of dexterity impairment. On the other hand, gesture- and motion-based interfaces, which automatically detect and associate various types of spatial movements with different intentions of the users, are the natural choice for interaction in a 3D virtual environment.

In addition to these interaction modalities, there is recent research on automatic facial expression recognition in interfaces, through which the current emotional state of the user can be inferred. The success of these efforts will enable promising new applications such as emotionally responsive virtual agents, and the evaluation of actors' and actresses' performances based on their ability to mimic various facial expressions. Haptic feedback has also recently been adopted in user interfaces: it conveys important physical cues in a virtual reality interface and allows for effective manipulation of the objects in the environment, thus reinforcing the user's sense of presence. The tactile modality is equally important in providing feedback for visually impaired users in advanced interfaces, since the user can then "feel" and physically "press" the control buttons of the interface through haptic input devices, with different buttons distinguished by different levels of tactile feedback. Haptic devices are also important for interfaces that train surgeons, by associating the correct level of tactile feedback with particular incision and suturing actions, and for enhancing realism in computer games by providing force feedback matched to different scenarios.

Achieving human-computer interaction based on these new modalities, in particular gesture and motion recognition, requires either the introduction of new devices or the adoption of advanced pattern classification and computer vision techniques. For example, gesture recognition can be achieved with a data glove that conveys the user's finger configuration to the interface, or through vision-based interaction in which gesture information is extracted directly from a video sequence. The successful adoption of these techniques allows users to express their intentions freely, without being required to wear special devices or to be tethered to particular locations.

While commands issued through special input devices can be efficiently and unambiguously decoded through dedicated circuitry, pattern classification and computer vision techniques are computationally intensive and usually result in a nonzero recognition error. Specifically, pattern classification requires the determination of a set of suitable features for effective characterization of different types of inputs, the design of an optimal classifier for categorizing feature representations into their correct classes based on an underlying feature model, and the evaluation of the model in terms of classification errors. For example, in speech recognition, speech signals are partitioned into segments corresponding to short time frames, and features from the time and frequency domains are extracted and concatenated to form a feature vector. These feature vectors are then modeled as realizations of random variables within a probabilistic framework based on the Hidden Markov Model (HMM), and feature vector sequences associated with different phonemes are modeled using their corresponding HMMs. HMMs are also useful for modeling the space-time trajectories associated with different gestures in gesture recognition, and the time-varying motion vector fields associated with different facial expressions in affective computing. The computational complexity and recognition error thus depend heavily on the selection of suitable features and the design of an optimal model for feature representation and classification. These problems have recently been alleviated to a considerable extent by the identification of suitable hardware implementations for pattern classification algorithms, and by the inclusion of feedback mechanisms that let users correct the errors.
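
As a rough illustration of the classify-by-likelihood scheme described above, the following sketch trains one Gaussian HMM per gesture class and assigns a new feature sequence to the class whose model scores it highest. The gesture classes, the feature layout and the use of the hmmlearn library are our own illustrative assumptions, not components of any particular system discussed here.

```python
# Minimal sketch of HMM-based gesture classification. The gesture classes,
# feature layout and library choice are illustrative assumptions.
import numpy as np
from hmmlearn import hmm  # third-party HMM library, assumed available

def train_models(training_data, n_states=4):
    """Fit one Gaussian HMM per gesture class.

    training_data maps a class name to a list of (T_i, D) arrays, e.g.
    space-time trajectory features extracted per video frame.
    """
    models = {}
    for name, sequences in training_data.items():
        X = np.vstack(sequences)               # stack all training sequences
        lengths = [len(s) for s in sequences]  # per-sequence frame counts
        m = hmm.GaussianHMM(n_components=n_states, covariance_type="diag",
                            n_iter=50)
        m.fit(X, lengths)
        models[name] = m
    return models

def classify(models, sequence):
    """Assign the gesture whose HMM gives the highest log-likelihood."""
    return max(models, key=lambda name: models[name].score(sequence))
```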

In addition, a major challenge is to effectively combine information from different modalities into a multimodal interface based on a multiple classifier framework, which can reduce the recognition error below what is achievable with a single modality and thereby considerably enrich the user's experience. For example, errors in speech recognition in a noisy environment can be compensated for to a large extent by including the gesture or facial expression modalities to disambiguate the user's intentions.
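
One common realization of such a multiple classifier framework is late fusion, in which each modality produces a distribution over the candidate intentions and the distributions are combined with per-modality reliability weights. The sketch below illustrates this idea; the probability values and weights are placeholders, not figures from any system described here.

```python
# Late fusion of per-modality posteriors with a weighted log-linear rule.
# All numbers below are illustrative placeholders.
import numpy as np

def fuse_posteriors(posteriors, weights):
    """Combine per-modality class distributions into one fused distribution."""
    log_p = sum(w * np.log(p + 1e-12) for p, w in zip(posteriors, weights))
    p = np.exp(log_p - log_p.max())  # stabilize before normalizing
    return p / p.sum()

speech  = np.array([0.40, 0.35, 0.25])  # ambiguous under acoustic noise
gesture = np.array([0.10, 0.80, 0.10])  # gesture disambiguates the intent
fused = fuse_posteriors([speech, gesture], weights=[0.4, 0.6])
print(fused.argmax())  # class 1 wins once the modalities are combined
```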

We next focus on the application of these enhanced interaction modalities to artistic creation. This emphasis reflects our viewpoint that, while the new interaction modalities currently play only a supportive role in conventional applications, where users can resort to keyboards or mice for most operations at the expense of a less enriched interactive process, the expression of artistic intentions cannot be easily supplanted by a set of typed commands or click sequences. The adoption of these enhanced modalities is therefore important, and even essential, for effective artistic expression. Based on this emphasis, we introduce a number of innovative human-computer interaction environments for visual art creation and music composition.

Novel Body-Gesture Driven Interface for Visual Art Creation

The concept of generating creative works of art through body movement has been an important area explored by a number of artists and researchers in the fields of both art and technology. These approaches share the common conceptual ground that human body movements and gestures are significant human expressions which communicate inner emotions and intentions when interacting with the outer environment. To further explore this relationship between body movement and users' intentions, the City University of Hong Kong has developed a real-time body-driven human-computer interface, the Body-Brush, which captures human motion and transforms the motion data into vibrant visual forms. The interface preserves the 3-D information of body motion, enabling users to interact intuitively with the machine and control rich visual simulation in synchronization with their body motion.

With a systematic study of the relations between human body movement and the visual art language, Body-Brush turns the human body as a whole into a dynamic brush. This is achieved with an immersive computer-vision-based motion analysis system with frontal infrared illumination, and innovative graphic rendering software that maps the body motion gesture-path-energy to the colour-form-space visual attributes. Since users are neither required to wear any specific sensor devices to be recognized by the computer, nor to receive any prior training to use the man-machine interface, Body-Brush enables them to express themselves freely and interact intuitively with the machine. The Body-Brush interactive environment is illustrated in Figure 1(a).

In this system, human motion is captured under frontal infra-red illumination by a pair of infra-red cameras. In this way, the extraction of the body silhouette is insensitive to the environmental lighting conditions, and users are not required to wear clothing of a specific color or any sensor devices. The user can thus move unobstructed and express himself or herself freely in the space.

The video images of the moving body are taken at orthogonal angles by the two infra-red-sensitive cameras mounted outside the 3-D canvas. From the two streams of video images, the user's position within the 3-D canvas space is calculated, and the body gestures of the user are extracted and analyzed at video frame rate. A number of numerical measurements relating to the user's gesture are computed, and these measurements are translated into various drawing attributes such as color (hue, value, saturation, and opacity), stroke path and stroke cross-section (for 3D modeling). Finally, these drawing attributes are painted onto a pair of stereo images and projected back onto the 3-D canvas in synchronization with the user's movement, enabling real-time interaction between the user and the virtual painting and sculpture. The setup of the Body-Brush interface is illustrated in Figure 1(b).
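
Under idealized assumptions, one axis-aligned camera per orthogonal view and a linear mapping from image to canvas coordinates, the user's 3-D position could be recovered from the two silhouettes roughly as sketched below. The actual Body-Brush calibration is not described here, so treat this purely as an illustration.

```python
# Idealized recovery of the user's position in the 3-D canvas from two
# orthogonal infra-red views. Camera geometry and canvas extents are assumed.
import numpy as np

def silhouette_centroid(mask):
    """Centroid (row, col) of a binary silhouette mask."""
    ys, xs = np.nonzero(mask)
    return ys.mean(), xs.mean()

def position_3d(front_mask, side_mask, canvas=(3.0, 3.0, 3.0)):
    """Front camera sees the x-y plane; side camera sees the z-y plane."""
    fy, fx = silhouette_centroid(front_mask)
    _, sx = silhouette_centroid(side_mask)
    h, w = front_mask.shape
    x = fx / w * canvas[0]                   # left-right from front view
    y = (1 - fy / h) * canvas[1]             # height (image rows grow downward)
    z = sx / side_mask.shape[1] * canvas[2]  # depth from side view
    return np.array([x, y, z])
```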

Based on the assumption that uplifted moods are associated with more vibrant color combinations, we design the mapping rules between motion and the color attributes of the virtual brush strokes as follows: different hues are associated with different points in the 3D space, while the color intensity is related to the motion speed, such that higher speed results in a brighter color, and vice versa. The color saturation is related to motion acceleration, such that acceleration generates a more saturated color. In addition, the body dimension determines the size of the brushstroke. Examples of virtual paintings created through the Body-Brush interface are shown in Figure 2.
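
These mapping rules can be condensed into a short sketch, given below; the value ranges and scaling constants are illustrative assumptions rather than the system's actual parameters.

```python
# The motion-to-color rules as a sketch: hue from position, value
# (brightness) from speed, saturation from acceleration, stroke size from
# body dimension. All ranges and constants are assumptions.
import colorsys

def brush_attributes(pos, speed, accel, body_extent,
                     max_speed=3.0, max_accel=5.0, canvas=3.0):
    hue = (pos[0] / canvas) % 1.0                  # hue varies over space
    value = min(speed / max_speed, 1.0)            # faster -> brighter
    saturation = min(abs(accel) / max_accel, 1.0)  # acceleration -> saturated
    r, g, b = colorsys.hsv_to_rgb(hue, saturation, value)
    stroke_width = 0.05 + 0.2 * body_extent        # wider body, wider stroke
    return (r, g, b), stroke_width
```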

Hand-Gesture Interface for Music Composition

One of the major functions of music is to express and communicate one's feelings, emotions and thoughts to an audience. Through an expressive musical piece, the audience can perceive and interpret the emotions and messages delivered by the composer and performer. However, before artists can genuinely express themselves through music, they usually need to master music theory or the techniques of playing a musical instrument. This learning curve is the major barrier that prevents artists from expressing themselves freely through the musical medium. In view of these difficulties, the AIMtech Centre at City University of Hong Kong has developed two innovative interfaces, a glove-based system called Cyber Composer and a vision-based system known as Body Baton, to eliminate this barrier by enabling users to dynamically control musical sound generation and variation through their gestures and body motions.

Cyber Composer is designed such that no musical instrument is required in the music generation process. It is composed of the music interface, the CyberGlove interface, the background music generation module and the melody generation module. Figure 3 shows an overview of the architecture of Cyber Composer.

The CyberGlove interface sends gesture information, in the form of joint-angle measurements, together with 3D position information to the system. These are then translated into parameters for real-time music generation. The background music generation module generates music according to the processed gesture information, constraints imposed by music theory, and user-defined parameters such as tempo and key. The music interface then converts these musical expressions into MIDI signals and outputs them to a MIDI device.

Specifically, the height of the user is used to adjust the pitch of the sound. The pitch changes when the user raises his/her hands, jumps up or crouches down. We chose this height-to-pitch mapping because people are used to associating "taller" with notes of higher frequency and vice versa, so users should be able to learn how to control the pitch easily and intuitively. On the other hand, the body dimension is mapped to the volume of the musical sound, given the usual association of a wider body extent with a greater sound amplitude. In addition, the position of the user controls the stereo placement of the musical sound: for example, if the user moves closer to one of the two speakers, that speaker will sound louder. Figure 4 shows a Body Baton user displaying a typical gesture when composing a musical piece.
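
A minimal sketch of these mappings is given below, producing raw MIDI note, velocity and pan values from body measurements. The C-major quantization and all numeric ranges are illustrative assumptions, not the actual Cyber Composer or Body Baton parameters.

```python
# Height -> pitch, body extent -> volume, position -> stereo pan, emitted as
# raw MIDI values. Scale choice and ranges are illustrative assumptions.
C_MAJOR = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets within one octave

def gesture_to_midi(height, body_extent, x_pos,
                    min_h=0.5, max_h=2.2, room_width=4.0):
    """height and body_extent in metres; x_pos across the room in metres."""
    t = max(0.0, min(1.0, (height - min_h) / (max_h - min_h)))
    degree = int(t * (3 * len(C_MAJOR) - 1))              # three-octave range
    note = 48 + 12 * (degree // 7) + C_MAJOR[degree % 7]  # taller -> higher
    velocity = int(max(0.0, min(1.0, body_extent / 2.0)) * 127)  # wider -> louder
    pan = int(max(0.0, min(1.0, x_pos / room_width)) * 127)      # MIDI CC 10
    return note, velocity, pan
```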

Conclusion

Human-computer interaction has moved from communicating with machines in ways that are natural to them, e.g. textual commands and data entered via the keyboard or mouse, to ways that are natural to humans, e.g. speech, vision, gesture and tactile feedback. We have shown here several examples of applying novel human-computer interaction principles to the realm of artistic creation.
