Brain Activity Associated with Graphic Emoticons. The Effect of Abstract Faces in Communication over a Computer Network

Electrical Engineering in Japan, Vol. 177, No. 3, 2011
Translated from Denki Gakkai Ronbunshi, Vol. 129-C, No. 2, February 2009, pp. 328–335

MASAHIDE YUASA,¹ KEIICHI SAITO,² and NAOKI MUKAWA¹
¹School of Information Environment, Tokyo Denki University, Japan
²Research Center for Advanced Technologies, Tokyo Denki University, Japan
SUMMARY

In this paper, we describe the brain activities associated with graphic emoticons by using functional MRI (fMRI). Various types of faces, from abstract to photorealistic, are used in computer network applications; a graphic emoticon is an abstract face used in communication over a computer network. In this research, we created various graphic emoticons for the fMRI study, and the graphic emoticons were classified according to friendliness and level of arousal. We investigated the brain activities of participants who were required to evaluate the emotional valence of the graphic emoticons (happy or sad). The experimental results showed that not only the right inferior frontal gyrus and the cingulate gyrus, but also the inferior and middle temporal gyri and the fusiform gyrus, were activated during the experiment. Furthermore, it is possible that the activation of the right inferior frontal gyrus and the cingulate gyrus is related to the type of abstract face. Since the inferior and middle temporal gyri were activated even though the graphic emoticons are static, we may perceive graphic emoticons as dynamic, living agents. Moreover, it is believed that text and graphic emoticons play an important role in enriching communication among users. © 2011 Wiley Periodicals, Inc. Electr Eng Jpn, 177(3): 36–45, 2011; Published online in Wiley Online Library (wileyonlinelibrary.com). DOI 10.1002/eej.21162

Key words: face; fMRI; emoticon; nonverbal communication; human computer interaction.

1. Introduction

Various forms of communication have become possible, anytime and anywhere, with the development of information networks. For example, one can communicate easily using images and sound via a video phone or mobile phone, and text messages can be exchanged instantly by e-mail and chat. In the latter case, the means of communication are restricted to characters, which may hinder the conveying of feelings and emotions. In order to enhance communication, a variety of face images are available. For example, emoticons are placed at the end of a sentence in e-mails and chat entries to convey emotions that cannot be expressed by characters alone [1]. Another means is graphic emoticons (face icons) created by computer graphics (facial pictograms, smilies, etc.).

Facial means of communication are compared in terms of "level of abstraction" and "expressiveness" in Table 1 [2–4]. This comparison pertains not only to electronic communications but also to paper media using portraits and cartoons (manga) with specific facial expressions.* By expressiveness we mean the ability to better convey emotions and flavor by using deformation, emphasizing or omitting certain parts, and so on.

Usually, the amount of nonverbal information used in communication grows with a lower level of abstraction, and vice versa. For example, in the case of a teleconference using cameras [Table 1, (a)], a great deal of nonverbal information can be involved in communication so as to enhance exchange among the participants. In the case of emoticons (g) and graphic emoticons (f) with a high level of abstraction, faces are composed of few parts, such as the eyes and mouth, and the expressive means are limited. However, emotional expressivity may, on the contrary, be lost when a face icon becomes less abstract and more

Contract grant sponsor: MEXT Grant-in-Aid for Young Scientists (B) 19700119, as well as Tokyo Denki University (Q06J-14).
*See Refs. 2 and 3 for level of abstraction, and Refs. 5–7 for the expressiveness of portraits and cartoons.
Table 1. Abstract faces used in communications over computer networks

realistic, as in the case of sophisticated avatars (b) [2]. On the other hand, portraits (d) and cartoons (e), with a medium level of abstraction and considerable freedom of exaggeration, offer diverse and eye-catching ways to represent emotions.

Thus, abstract faces in portraits and cartoons, or deformed faces, have a number of advantages, such as richness of facial expression, friendliness, and liveliness [2]. Although the advantages of abstract faces are apparent, no detailed evaluation and analysis is available. Such an analysis would explain the mechanisms underlying the expressivity of abstract faces.

In this study, we examine the properties of abstract faces using brain activities measured by fMRI. Such measurements of brain activities were performed previously for emoticons, the most abstract face images used in electronic communications [4, 8]. In this paper, we describe fMRI experiments with graphic emoticons. The difference between emoticons and graphic emoticons is not intuitively evident; the purpose of this study is to investigate this difference by measurement of brain activities rather than by subjective methods such as statements and questionnaires. We believe that fMRI observations at different levels of abstraction will be helpful in clarifying the features of abstract faces as well as communications using abstract faces.

2. Previous Research

As regards face recognition, Kanwisher and colleagues reported that the right fusiform gyrus activates when face images are presented to experimental subjects [9, 10]. In addition, prosopagnosic patients, who can no longer recognize human faces, are reported to have pathologies in the fusiform gyrus, lingual gyrus, and other areas of the right hemisphere [11]. Thus, it appears that the right fusiform gyrus is related to face recognition.

As regards the discrimination of facial expressions, comprehension of facial expressions is known to be impaired by injuries to the area from the right lateral aspect of the frontal cortex to the inferior frontal gyrus [11]. Kawashima performed experiments on emotion discrimination using facial expressions and speech, and reported significant changes in the right inferior frontal gyrus in both cases. Nakamura too points out the possibility that nonverbal information is processed by the right inferior frontal gyrus [12]. The left inferior frontal gyrus belongs to Broca's area, which is involved in verbal communication, and Kawashima assumed a functional differentiation between the left and right inferior frontal gyri in verbal and nonverbal processing [11].

In addition, activation of the cingulate gyrus during emotional attention is reported by Phan and colleagues [13]. In experiments conducted by Gusnard and colleagues and Lane and colleagues [13, 14], the cingulate gyrus is activated in discrimination tasks (happy/sad) using emotion-evoking videos and images. Takehara and Nomura examined brain activities in the discrimination of ambiguous and clear facial expressions, and reported that the anterior cingulate gyrus and certain other areas became more active when ambiguous facial expressions were presented [16].

There are also other studies on brain activities, such as the experiments of Chao and colleagues using still images of human faces and animals; in that study, areas of the temporal gyrus (superior, middle, inferior) were activated [17]. As known from previous research, the temporal gyrus is activated by biological motions involving the eyes and mouth, or by animals [18]. Wicker and colleagues performed brain measurements while a subject tracked the gaze of a person on screen, and observed the activation of the inferior and middle temporal gyri as well as some other areas [19]. According to Hoffman and Haxby, the area around the superior temporal sulcus is involved in gaze tracking [20]. Puce and colleagues used face photographs and drawings to ascertain that the area around the superior temporal sulcus was activated when the mouth moved [21]. In addition to other studies dealing with eye and mouth movements [22–24], some studies suggest that the temporal gyrus area is also activated when seeing human motions expressed by light spots and other complex biological motions [18, 25, 26]. Thus, it appears that the temporal gyrus responds to lively motion patterns specific to living things [18]. In this context, Chao and colleagues assume that facial expressions and other basic biological motion patterns, even though rendered by still images, may recall past memories, thus activating the area around the temporal gyrus [17].

There are reports of brain activities induced not only by seeing biological motions but also by inferring emotions from biological motions, or by inferring another person's feelings from his or her gaze. Inference experiments based on the "theory of mind" [27, 28], gaze tracking and joint
attention [20, 29], as well as other social interactions involving inference of another person's thoughts and feelings, are reported to activate the areas of the medial prefrontal cortex, cingulate gyri, temporal poles, and superior temporal sulci [18].

Brain measurements by Yuasa and colleagues in discrimination tasks using emoticons show that the right fusiform gyrus is not activated but the right inferior frontal gyrus and cingulate gyrus are activated [4, 8, 30, 31]; hence we may expect similar results with graphic emoticons.

3. Brain Measurement Using fMRI

3.1 Outline of experiments

As in the previous experiments on discrimination of facial expressions using photographs and emoticons [4, 8, 11, 12], we again measured brain activity in the case of graphic emoticons. In preliminary tests, we found that it was difficult to discriminate between the "happy" and "sad" expressions in the case of some graphic emoticons; thus, we classified the graphic emoticons as explained below.

3.2 Creation of graphic emoticons

We gathered graphic emoticons as well as other facial marks, logos, and other images from magazines, the Web, and other sources. These were used as a base for several students to compile 60 new graphic emoticons by combining and deforming eyes, noses, mouths, and other face parts. Although graphic emoticons are essentially less abstract than regular emoticons, we aimed at a higher level of abstraction by removing face contours.

In order to select the best experimental stimuli among the graphic emoticons thus created, we classified them by facial expression [16, 34] using studies by Schlosberg [32] and Russell [33]. In the classification experiments, the subjects arranged individual clipped graphic emoticons on worksheets with two preprinted orthogonal axes, Arousal and Friendliness. Here arousal represents the degree of excitement or tension: "high arousal" means feeling exhilarated, strongly agitated, or tense, while "low arousal" means feeling depressed, languid, or at ease. Arousal and friendliness are utilized as emotional dimensions by Schlosberg [32], Russell [33], Reeves and Nass [35], and other researchers; thus, we adopted these axes in this study.

Graphic emoticons were arranged on the worksheets by 10 students majoring in science, after which the average positions were calculated. An example is given in Fig. 1.* Just as in the previous research by Schlosberg and Russell, "happy" faces corresponded to high arousal and friendliness, "angry" faces to high arousal and unfriendliness, "sad" faces to low arousal and unfriendliness, and "cool" faces to moderate arousal and affective valence.

Thus, we decided to use "happy" and "sad" graphic emoticons (areas shown by circles in Fig. 1) as the stimuli for brain measurement.

Fig. 1. Examples of classified graphic emoticons.

3.3 Experimental setup

In our experiments, we used a 1.5-T superconducting MRI scanner (Stratis II by Hitachi Medical Corp.). The experimental subjects lay inside the fMRI scanner wearing prism glasses, and viewed visual stimuli projected on a screen near their feet (Fig. 2). The stimuli were presented in a task-and-rest block design (Fig. 3). The task and rest stimuli were alternated every 50 seconds. Multiple visual stimulus images were prepared and presented 10 times for 5 seconds each during task and rest.

Fig. 2. fMRI and visual stimuli.

*The diagram shows the most typical graphic emoticons among the 60 used in classification.
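The worksheet classification described in Section 3.2 amounts to averaging each emoticon's placements on the two axes and reading off the quadrant of the mean position. The sketch below illustrates this arithmetic; the emoticon names and coordinates are invented for illustration (they are not the study's data), and the moderate-arousal "cool" category is not modeled.

```python
import numpy as np

# Hypothetical worksheet data: each emoticon gets one (friendliness, arousal)
# placement per rater, on axes scaled to [-1, 1]. Invented values, not the
# study's measurements.
ratings = {
    "happy": [(0.8, 0.7), (0.6, 0.9), (0.7, 0.6)],
    "angry": [(-0.7, 0.8), (-0.6, 0.7), (-0.8, 0.9)],
    "sad":   [(-0.5, -0.6), (-0.7, -0.8), (-0.6, -0.5)],
}

def classify(ratings):
    """Average each emoticon's placements and label the resulting quadrant."""
    labels = {}
    for name, points in ratings.items():
        friendliness, arousal = np.mean(points, axis=0)
        if arousal >= 0:
            label = "high arousal, " + ("friendly" if friendliness >= 0 else "unfriendly")
        else:
            label = "low arousal, " + ("friendly" if friendliness >= 0 else "unfriendly")
        labels[name] = label
    return labels

print(classify(ratings))
# {'happy': 'high arousal, friendly', 'angry': 'high arousal, unfriendly',
#  'sad': 'low arousal, unfriendly'}
```

Under this reading, the "happy" and "sad" stimuli chosen for the fMRI task sit in diagonally opposite quadrants of the arousal/friendliness plane.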
The experimental subjects compared the preselected graphic emoticons (task) and their scrambled images (rest), as shown in Fig. 4. The subjects were asked to push a button when the graphic emoticon conveyed a sad facial expression. This was done to confirm whether the same tasks as in previous research [4, 11, 12] resulted in activation of the same brain sites in the case of graphic emoticons. In addition, the subjects were instructed to continue without interruption even if they pressed the button by mistake. Measured data for such misjudgment cases were also used in the analysis, because we thought that recognition of emotions, even though erroneous, was helpful in comparing activated brain areas.

Fig. 3. Task-and-rest block design.

Fig. 4. Examples of graphic emoticons and scrambled images.

3.4 Scanning method

The contents and procedures of the experiments as well as important issues (risks, personal information protection, etc.) were explained to the subjects using a text approved by the Ethics Committee, and their consent was obtained.

In these experiments, a statistical significance test was applied to the BOLD (Blood Oxygenation Level Dependent) signals of the task and rest blocks, and brain images were obtained by mapping significant signals onto standard brain templates. The scans were implemented using EPI-GE sequences under the following conditions.

• Scan width: 240 mm
• TR/TE: 4600/50.5 ms
• Flip angle: 90°
• Slice thickness: 4.0 mm
• Slice interval: 1.0 mm

The voxel dimensions in the EPI images were 3.75 × 3.75 × 5.0 mm. The full width at half maximum (FWHM) of the smoothing kernel was set to 10 mm, and the analysis was performed as follows.

• Realignment: correction of position shifts caused by body movements of the subjects during the experiments.
• Normalization: conversion of an individual subject's brain configuration to a standard brain template (Talairach).
• Smoothing: suppression of noise included in the observed images to improve the S/N ratio.
• Statistics: t-test for each voxel.

The above processing was implemented using the SPM99 (Statistical Parametric Mapping) medical image analysis software [36]. Using SPM99, sites were detected where the signal strength was statistically significant between the task and rest blocks for every subject. In addition, variance tests were applied to the data of multiple experimental subjects, and activated sites were determined by a significance test (t-test). Correction for multiple comparisons was employed to deal with correlation between neighboring voxels (see Ref. 37 for details). Significance was defined as p < 0.05.

The estimated activation patterns were represented by an SPM (Z) map [36, 38–40], and 3D brain images were obtained.

3.5 Experimental results

The experimental subjects were 11 right-handed male university students majoring in science. In statistical processing, the t-test was applied to every voxel, and the site coordinates estimated from the brain activities of each subject were compared. Two subjects with activity patterns different from those of the other subjects were analyzed further. These two subjects exhibited significant activity near the temporal gyrus and cingulate gyrus, but the fusiform gyrus and inferior frontal gyrus did not show any significant activation. Thus, variance tests were applied to the nine other subjects, and an activity map was built as shown in the lower part of Fig. 5. Here the significantly activated parts are shown in red (after multiple comparison correction).
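The smoothing step in Section 3.4 specifies its Gaussian kernel by FWHM rather than standard deviation; the two are related by FWHM = 2·√(2·ln 2)·σ. A minimal sketch of the conversion, using the 10-mm kernel and the voxel dimensions quoted above (this only illustrates the arithmetic, not SPM99's actual implementation):

```python
import numpy as np

# Gaussian FWHM -> standard deviation: FWHM = 2 * sqrt(2 * ln 2) * sigma,
# i.e. FWHM is roughly 2.355 * sigma.
def fwhm_to_sigma(fwhm):
    return fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))

fwhm_mm = 10.0                           # smoothing kernel width from the text
voxel_mm = np.array([3.75, 3.75, 5.0])   # EPI voxel dimensions from the text

sigma_mm = fwhm_to_sigma(fwhm_mm)        # about 4.25 mm
sigma_vox = sigma_mm / voxel_mm          # per-axis sigma in voxel units

print(round(sigma_mm, 2), np.round(sigma_vox, 2))
```

A per-axis sigma of this kind is what a generic image-processing routine such as `scipy.ndimage.gaussian_filter` would take when smoothing the anisotropic EPI volume.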
Fig. 5. Brain activities in experiments: activated areas are shown in black (previous studies) and red (this study). [Color figure can be viewed in the online issue, which is available at wileyonlinelibrary.com.]

Two examples (from two of the nine subjects) of the response observed near the temporal gyrus are given in Fig. 6. The blue line shows the observed signal (BOLD signal), and the red line represents the signal variation (hemodynamic) model accounting for the time delay of the cerebral blood flow increase. Some time is required for the cerebral blood flow to increase after seeing a stimulus. This time delay must be taken into account in order to correlate the variation of the blood flow signal with the experimental task/rest blocks [37, 41]. The red line shows such a hemodynamic model created using SPM. In these experiments, the task and rest blocks were presented three times, and hence there are three peaks and valleys in the signal variation model. The brain sites involved in the experiments can be identified statistically by calculation between the model and the actual response. As can be seen from Fig. 6, the signal variation pertaining to the task and rest stimuli (the red line) agrees well with the observed response (the blue line).

Significantly activated brain areas are shown in the lower part of Fig. 5. For comparison, the upper part shows the significantly activated areas (black) observed in the case of emoticons and face photographs [4]. Comparison results for major brain areas are presented in Table 2. In particular, the table describes the activities of the right fusiform gyrus, the right inferior frontal gyrus, and the right middle/inferior temporal gyrus in the case of face photographs, emoticons, and graphic emoticons.

In our experiments, activation was observed near the right fusiform gyrus [Fig. 5 (A)]. In addition, significant activities were also registered at the right inferior frontal gyrus [Fig. 5 (B)], with standard brain coordinates (X, Y, Z) = (54, 26, –2), the right inferior temporal gyrus, (X, Y, Z) = (54, –24, –12), and the right middle temporal gyrus, (X, Y, Z) = (58, –36, 2) [Fig. 5 (C)]. Activation was also observed around the cingulate gyrus, (X, Y, Z) = (2, –2, 34).

Table 2. Comparison between previous studies and present study
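The model-versus-response calculation behind Fig. 6 can be illustrated with a toy version of the block-design statistics: a boxcar regressor for three 50-s task / 50-s rest cycles, fit to a single voxel time course by least squares, with a t-statistic for the task effect. Only the TR and block lengths come from the text; the voxel data below are simulated, and the boxcar omits the hemodynamic-delay model that SPM convolves in.

```python
import numpy as np

# Toy block-design analysis for one voxel. The boxcar is an idealized task
# regressor; SPM would additionally convolve it with a hemodynamic response.
TR = 4.6                                     # seconds per scan (from the text)
n_scans = int(round(3 * (50 + 50) / TR))     # three task+rest cycles -> 65 scans
t = np.arange(n_scans) * TR
boxcar = ((t % 100.0) < 50.0).astype(float)  # 1 during task, 0 during rest

rng = np.random.default_rng(0)
voxel = 2.0 * boxcar + rng.normal(0.0, 1.0, n_scans)  # synthetic "active" voxel

# GLM: voxel = b0 + b1 * boxcar + noise; t-statistic for the task effect b1.
X = np.column_stack([np.ones(n_scans), boxcar])
beta, residuals, _, _ = np.linalg.lstsq(X, voxel, rcond=None)
dof = n_scans - X.shape[1]
sigma2 = residuals[0] / dof                  # residual variance estimate
c = np.array([0.0, 1.0])                     # contrast selecting b1
se = np.sqrt(sigma2 * c @ np.linalg.inv(X.T @ X) @ c)
t_stat = (c @ beta) / se
print(f"t = {t_stat:.2f} (dof = {dof})")
```

An SPM-style analysis repeats this test for every voxel and then thresholds the resulting statistic map with a multiple-comparison correction, as described in Section 3.4.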
cons are used as a standard tool for emotional communica-
                                                                      tion, like voice and facial expression. Confirmation of this
                                                                      assumption by measured brain activities seems very impor-
                                                                      tant.
                                                                             As can be seen from Table 2, the temporal gyrus area
                                                                      shows significant activation with graphic emoticons, which
                                                                      is not the case with emoticons and face photographs. This
                                                                      area is known to respond to complex biological motions
                                                                      rather than to simple ones [18, 25]. According to Chao and
                                                                      colleagues, facial expressions and other basic biological
                                                                      motion patterns, even though rendered by still images, may
                                                                      recall past memories, thus activating the temporal gyrus
                                                                      [17]. Similarly, graphic emoticons are also still images but
                                                                      we may assume the same recollective mechanism of tem-
                                                                      poral gyrus activation with exaggerated emotional expres-
                                                                      sion by means of graphic emoticons; for example,
                                                                      expression of joy by “winking eyes,” or expression of
                                                                      sadness by “downward mouth,” in Figs. 1 and 4. The brain
                                                                      activities indicate that, in contrast to emoticons and face
                                                                      photographs, graphic emoticons offer a unique effect of
                                                                      exaggerated expression, which seems an important conclu-
                                                                      sion. For example, it is possible that graphic emoticons
                                                                      remind the viewer of vivid dynamics offered by cartoons
                                                                      and comics.
                                                                             Experiments based on the “theory of mind” [27, 28],
                                                                      gaze tracking and joint attention [20, 29], as well as other
                                                                      social interactions, are reported to activate the areas of the
                                                                      medial prefrontal cortex (cingulate gyrus area), the tempo-
                                                                      ral poles, and the superior temporal sulci (temporal gyrus
                                                                      area) [18]. Very similar activation patterns were also ob-
Fig. 6. Brain activities in experiments: responses at activated areas. [Color figure can be viewed in the online issue, which is available at wileyonlinelibrary.com.]

activate and the right fusiform gyrus would not. Actually, however, activation around the right fusiform gyrus was observed. This can be explained by the fact that graphic emoticons, while highly abstract, contain more specific face parts (eyes, nose, etc.) than emoticons. Emoticons, on the other hand, contain some basic face parts such as the eyes and mouth, but these features are not distinct enough to activate the right fusiform gyrus.

Considering the activation of the right inferior frontal gyrus by both graphic emoticons and emoticons, we may conclude that the two types are not merely highly abstract images but are similar to nonverbal information such as the human voice and facial expressions, and are therefore processed by the brain in a similar way. There is a possibility that emoti- […] served in our experiments with graphic emoticons. Thus, we may assume a strong relation to social interactions that involve inferring another person's thoughts and feelings. For example, comic characters have faces quite different from normal human faces, but viewers feel touched and empathic because of the exaggerated expressions. Portraits, too, are often very different from the real people, but viewers are touched ("You know who it is immediately!" or "Well, it is an interesting perspective!"). Although they are just still images, comics and portraits evoke social interactions as viewers infer the characters' emotions and the artist's message, thus entertaining the viewer. In the future, we plan to continue research on abstract faces from the standpoint of such interactions.

Significant activation was also observed at the right inferior parietal lobule and the right and left middle frontal gyri. In addition, weak activity was also detected around the parahippocampal gyrus and cerebellum. Previous research [42, 43] suggests that the right inferior parietal lobule is related to visual attention and stimulus location, and that the middle frontal gyri are related to working memory. Since the same sites were activated in preceding experiments with face images, we may assume that visual attention and memory are involved in emotion discrimination
tasks. The parahippocampal gyrus is known to relate to the formation of episodic memory and to the recall of semantic memory [44]. In addition, Epstein and Kanwisher report strong activation near the hippocampus and parahippocampal gyrus when viewing landscapes and scenes, and discuss the relation of these sites to facial stimuli [45]. Thus, it appears that when graphic emoticons are presented, the parahippocampal gyrus is activated via reminiscences of previous scenes, which could be clarified by additional tests. As regards the cerebellum, Allen and colleagues point out its relation to action prediction as well as to attention and other cognitive processes [46]. In addition, Kudo and colleagues report a relation between cerebellar injuries and cognitive and emotional disorders [47], and Saito draws conclusions about the cerebellum's involvement in communication from the results of autism studies [48]. Therefore, in our experiments, the activation of the cerebellum may be related to emotions and communication.

In this study, the experimental subjects were only male students. Additional experiments are needed to confirm these results for people of different sexes and age groups. Takehara examined whether results obtained for the emotional recognition of emoticons by young people also apply to communication among older people [49]. He found that, just as with young people (university students), emoticons can convey emotions in e-mails exchanged among older people (the students' parents). Therefore, we may expect similar brain activation patterns across different sexes and age groups.

Furthermore, in two of nine experimental subjects, significant activity was observed near the temporal gyrus and cingulate gyrus, while the fusiform gyrus and inferior frontal gyrus did not show any significant activation. This indicates individual differences in the comprehension of graphic emoticons. On the other hand, strong activity in the cingulate gyrus area was detected in two subjects (shown in red in Fig. 7). Thus, we may assume that the preclassified graphic emoticons were perceived by these two subjects as ambiguous. Further experiments should be carried out to develop a classification of facial stimuli and to examine individual differences for sorted abstract faces.

5. Conclusions

We considered abstract faces used in communication and attempted to measure brain activity related to graphic emoticons. After a preliminary classification of graphic emoticons, we measured brain activity in emotional discrimination tasks. We found that the right fusiform gyrus, right inferior frontal gyrus, and right temporal gyrus area were activated by graphic emoticons, in contrast to regular emoticons. Graphic emoticons and regular emoticons are not merely face images with a high level of abstraction: they seem to be processed in a way similar to nonverbal information such as the human voice and facial expressions. We also concluded that the exaggerated expression of graphic emoticons using the eyes, brows, mouth, and other face parts can recall biological motion, thus activating the temporal gyrus area.

In this study, emoticons were not presented to the subjects who participated in the experiments with graphic emoticons. In the future, we are planning experiments with the same subjects to compare face images with different levels of abstraction.

We plan to examine the properties of abstract faces in further detailed experiments with drawings, avatars, and other images. Such continued studies will contribute to fields such as dialog-based interfaces and robot design, which involve the issue of the optimal level of abstraction of face images for particular types of communication. For example, an intuitive interface with clear emotional expression can be implemented by using face images that produce significant activation in emotional discrimination tasks, or face images related to social interactions.

Fig. 7. Brain activities of two subjects (sagittal): activated areas are shown in red (anterior cingulate cortex, corrected). [Color figure can be viewed in the online issue, which is available at wileyonlinelibrary.com.]

Acknowledgments

We are grateful to Messrs. H. Hoshi and H. Nakatani (Tokyo Denki University) for their assistance in the fMRI experiments, and to all who participated in the experiments. We also express our deep gratitude to the researchers of the Research Center for Advanced Technologies (Tokyo Denki University) for their valuable advice regarding fMRI measurement. This study was supported in part by a MEXT Grant-in-Aid for Young Scientists (B), 19700119, as well as by Tokyo Denki University (Q06J-14).
REFERENCES

1. Takehara T, Sato N. Promotion of emotional communication by means of "happy" emoticons. Kaogaku (J Jpn Acad Facial Stud) 2004;4:9–17.
2. Koda T, Maes P. Agents with faces: The effects of personification of agents. 1996.
3. McCloud S. Understanding comics. Harper Perennial; 1993.
4. Yuasa M et al. Brain activity associated with emoticons: An fMRI study. Effects of facial expressions in personal communications over a computer network. Trans IEEJ 2007;127-C:1865–1870.
5. Suzuki K et al. What makes a portrait attractive. 12th Forum of Japan Academy of Facial Studies, 2007.
6. Masuda E et al. Relationship between caricature face shape and impression: Understanding the author's message. 12th Forum of Japan Academy of Facial Studies, 2007.
7. Yokota M. Clinical psychology of animation. Seishin-Shobo; 2006.
8. Yuasa M, Saito K, Mukawa N. Emoticons convey emotions without cognition of faces: An fMRI study. Proceedings of CHI 2006, Montreal, Canada.
9. Kanwisher N, McDermott J, Chun MM. The fusiform face area: A module in human extrastriate cortex specialized for face perception. J Neurosci 1997;17:4302–4311.
10. Tong F, Nakayama K, Moscovitch M, Weinrib O, Kanwisher N. Response properties of the human fusiform face area. Cogn Neuropsychol 2000;17:257–279.
11. Kawashima R. Brain imaging of higher functions. Igaku-Shoin; 2002.
12. Nakamura K. Activation of the right inferior frontal cortex during assessment of facial emotion. Brain Nerve 1999;43:519–527.
13. Phan KL, Wager T, Taylor SF, Liberzon I. Functional neuroanatomy of emotion: A meta-analysis of emotion activation studies in PET and fMRI. NeuroImage 2002;16:331–348.
14. Gusnard DA, Akbudak E, Shulman GL, Raichle ME. Medial prefrontal cortex and self-referential mental activity: Relation to a default mode of brain function. Proc Natl Acad Sci USA 2001;98:4259–4264.
15. Lane RD, Fink GR, Chau PM, Dolan RJ. Neural activation during selective attention to subjective emotional responses. Neuroreport 1997;8:3969–3972.
16. Takehara T, Nomura M. Forefront studies on the human face. Kitaoji Press; 2004.
17. Chao LL, Haxby JV, Martin A. Attribute-based neural substrates in temporal cortex for perceiving and knowing about objects. Nature Neurosci 1999;2(10).
18. Frith U, Frith CD. Mentalizing in the brain. In: The neuroscience of social interaction: Decoding, imitating, and influencing the actions of others. 2004. p 54–75.
19. Wicker B, Michel F, Henaff MA, Decety J. Brain regions involved in the perception of gaze: A PET study. NeuroImage 1998;8:221–227.
20. Hoffman EA, Haxby JV. Distinct representations of eye gaze and identity in the distributed human neural system for face perception. Nature Neurosci 2000;3:80–84.
21. Puce A, Syngeniotis A, Thompson JC, Abbott DF, Wheaton KJ, Castiello U. The human temporal lobe integrates facial form and motion: Evidence from fMRI and ERP studies. NeuroImage 2003;19:861–869.
22. Calvert GA, Bullmore ET, Brammer MJ, Campbell R, Williams SCR, McGuire PK, Woodruff PWR, Iversen SD, David AS. Activation of auditory cortex during silent lipreading. Science 1997;276:593–596.
23. Calvert G, Campbell R, Brammer M. Evidence from functional magnetic resonance imaging of crossmodal binding in the human heteromodal cortex. Curr Biol 2000;10:649–657.
24. Puce A, Allison T, Bentin S, Gore JC, McCarthy G. Temporal cortex activation in humans viewing eye and mouth movements. J Neurosci 1998;18:2188–2199.
25. Bonda E, Petrides M, Ostry D, Evans A. Specific involvement of human parietal systems and the amygdala in the perception of biological motion. J Neurosci 1996;16:3737–3744.
26. Puce A, Perrett D. Electrophysiology and brain imaging of biological motion. Philos Trans R Soc London Ser B 2003;358:435–445.
27. Castelli F, Happe F, Frith U, Frith C. Movement and mind: A functional imaging study of perception and interpretation of complex intentional movement patterns. NeuroImage 2000;12:314–325.
28. Happe FGE. An advanced test of theory of mind: Understanding of story characters' thoughts and feelings by able autistic, mentally handicapped, and normal children and adults. J Autism Dev Disord 1994;24:129–154.
29. Kampe KKW, Frith CD, Frith U. "Hey John": Signals conveying communicative intention toward the self activate brain regions associated with "mentalizing," regardless of modality. J Neurosci 2003;23:5258–5263.
30. Yuasa M et al. Brain activity associated with abstract faces: Effects of facial expressions in personal communications over a computer network. IEEJ Medical and Biological Engineering Technical Committee, 2005. p 19–23.
31. Yuasa M et al. Brain activation by emoticons. Kaogaku (J Jpn Acad Facial Stud) 2005;5:134.
32. Schlosberg H. The description of facial expressions in terms of two dimensions. J Exp Psychol 1952;44:229–237.
33. Russell J. Reading emotions from and into faces: Resurrecting a dimensional-contextual perspective. Cambridge University Press; 1997.
34. Yoshikawa S et al. Face and mind: Introduction to facial psychology. Saiensu Press; 1993.
35. Reeves B, Nass C. The media equation (in Japanese translation).
36. SPM: http://www.fil.ion.ucl.ac.uk/spm/
37. Tsukimoto H et al. Introduction to brain function image analysis. Ishiyaku Publ.; 2004.
38. McRobbie DW, Moore EA, Graves MJ, Prince MR. MRI: From picture to proton. Ohm Press; 2004.
39. Ishii I. Statistical analysis of brain function images: SPM and 3D-SSP. Saishin Igaku 2005;60:980–987.
40. Kawamura S, Ueno T. Statistical image analysis in nuclear neurology and fMRI. JSRT J 2003;59:594–603.
41. Takeda T. Brain engineering. Corona Publ.; 2003.
42. Myers PS. Right hemisphere damage: Disorders of communication and cognition. 1996 (in Japanese translation).
43. Kranczioch C, Debener S, Schwarzbach J, Goebel R, Engel AK. Neural correlates of conscious perception in the attentional blink. NeuroImage 2005;24:704–714.
44. Suzuki K et al. What makes a portrait attractive. Tech Rep IEICE 2008;HIP2007-164:31–36.
45. Epstein R, Kanwisher N. A cortical representation of the local visual environment. Nature 1998;392:598–601.
46. Allen G, Buxton RB, Wong EC, Courchesne E. Attentional activation of the cerebellum independent of motor involvement. Science 1997;275:1940–1943.
47. Kudo Y et al. Cognitive, affective and behavioral disturbances due to cerebellar hemorrhage: Inhibitory factors for rehabilitation. Jpn J Rehabil Med 2005;42:463–468.
48. Saito O. Cerebellar abnormality in autism. Tech Rep IEICE 1998;TL98-11.
49. Takehara T. Generational difference in the emotion recognition effect of emoticons. Kaogaku (J Jpn Acad Facial Stud) 2007;7:37–45.

                                             AUTHORS (from left to right)

      Masahide Yuasa (member) received a bachelor’s degree from Tokyo University of Science in 1998, completed the doctoral
program at Tokyo Institute of Technology (Intelligent Systems) in 2004, and joined the faculty of Tokyo Denki University as a
research associate. His research interests are human interfaces, interaction of embodied agents, and brain science. He holds a
D.Eng. degree, and is a member of IEEE, ACM, JSAI, IPSJ, and HIS.

      Keiichi Saito (member) received a bachelor's degree from Waseda University in 1983 and joined the faculty as a research associate. He is now an associate professor at the Research Center for Advanced Technologies, Tokyo Denki University. His research interests are psychophysiological detection of deception using higher brain function analysis, and the relationship between humans and the information environment. He is Vice President of BMFSA. He holds a D.Eng. degree, and is a member of JSPPP, JSWSAT, and other societies.


      Naoki Mukawa (nonmember) completed the doctoral program at Waseda University (Graduate School of Science and Engineering) in 1976 and joined NTT Electrical Communication Laboratory. Since 2003 he has been a professor at Tokyo Denki University. His fields of expertise are image processing, image analysis, and human interfaces, with particular interest in face and gaze communication. His research fields are gaze analysis, visual communication, and brain activity. He holds a D.Eng. degree, and is a member of IEICE (Fellow), IPSJ, JSAI, the Japan Academy of Facial Studies, IEEE, and ACM.
