Research Article
Emotional Expression Processing and Depressive
Symptomatology: Eye-Tracking Reveals Differential Importance
of Lower and Middle Facial Areas of Interest
Laurie Hunter, Laralin Roland, and Ayesha Ferozpuri
Christopher Newport University, Newport News, VA, USA
Correspondence should be addressed to Laurie Hunter; laurie.hunter@cnu.edu
Received 2 July 2019; Revised 23 September 2019; Accepted 5 October 2019; Published 6 January 2020
Academic Editor: Bettina F. Piko
Depression Research and Treatment, Volume 2020, Article ID 1049851. https://doi.org/10.1155/2020/1049851
Copyright © 2020 Laurie Hunter et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
The current study explored the eye-tracking patterns of individuals with nonclinical levels of depressive symptomatology when processing emotional expressions. Fifty-three college undergraduates were asked to label 80 facial expressions of five emotions (anger, fear, happiness, neutral, and sadness) while an eye-tracker measured visit duration. We argue visit duration provides more detailed information for evaluating which features of the face are used more often when processing emotional faces. Our findings indicated individuals with nonclinical levels of depressive symptomatology process emotional expressions very similarly to individuals with little to no depressive symptoms, with one noteworthy exception. In general, individuals in our study visited the "T" region, the lower and middle AOIs (areas of interest), more often than the upper and noncore areas, but the distinction between the lower and middle AOIs appears for all emotions except happiness when individuals are higher in depressive symptoms.
1. Introduction
Emotion recognition and facial processing are crucial abilities for successful daily social interactions. Depression influences how individuals process their social environment, thus impacting social interactions. Studies have explored how depressed individuals process various emotional stimuli by examining a preference for, or focus on, one of multiple images. For instance, Sears et al. [1] measured eye movements of depressed participants who were asked to view four different images and reported that depressed individuals attended longer to depressive images. Social interactions, however, often involve processing an individual facial expression of emotion, rather than a preference for or focus on one of several images, an important issue addressed in the current study.
Previous research has relied on varied, inconsistent tasks and processing measures to examine the relationship between depression and emotion processing. Few studies [2–4] have assessed areas of interest (AOIs) on the face and what features depressed individuals attend to when processing facial expressions of emotion. To our knowledge, only two studies have explored how depressed individuals process the various areas of an emotional expression using eye-tracking technology [3, 4], but these researchers did not specifically address whether any particular facial feature (AOI) yielded greater processing time when judging emotional expressions. Both studies indicated emotion recognition accuracy is enhanced for depressed individuals when they focus on multiple features of the face. To better understand the processing of facial expressions of emotion, we assessed visit duration as our measure of processing time in an effort to better explain where individuals gather information from emotional stimuli. We also explored AOIs identified as important in emotion recognition [5] to determine whether depressed individuals use AOIs differently than nondepressed individuals.
Research utilizing faces as stimuli and employing a free-viewing task is limited, and conclusions are contradictory regarding a negative, unpleasant image bias among depressed individuals. Duque and Vazquez [6] examined visual attention to emotional faces in depressed and nondepressed participants. Individuals with depression spent more time on sad faces and less time on happy faces, and showed longer first fixations toward sad faces only. Similarly, Caseras et al. [7] supported a negative image bias, finding longer first fixation durations and total fixation times on unpleasant emotional faces for depressed individuals.
Mogg et al. [8] assessed eye movements of individuals with generalized anxiety disorder and individuals with depression while they viewed happy, sad, angry, and neutral faces. There were no group differences in initial fixations, revealing no negative attentional bias. These researchers did not utilize fixation duration, however, limiting conclusions regarding a negative image bias for emotional faces. These inconsistent findings may be explained by the task itself: in the aforementioned studies, participants simply viewed the faces without any instruction to judge emotional content. Explicit instructions focused on emotion processing may provide a more representative task, one closer to social interactions, i.e., an emotion judgment task paradigm.
An emotion judgment task paradigm involves viewing a stimulus with the assigned task of judging the emotion being displayed. In eye-tracking studies, participants typically view an image or a face on a screen and make an emotional judgment while the researcher tracks the participants' eye movements. We argue this judgment task provides a better understanding of how individuals process emotional interactions.
Only three studies, to our knowledge, have combined a task-oriented paradigm and eye-tracking. Matthews and Antes [2] explored dysphoric and nondepressed participants' eye fixations on emotional images (but not facial expressions). Dysphoric individuals fixated on sad images more frequently than nondepressed individuals. Additionally, Schmid et al. [3] explored mood priming effects on emotion judgment and eye-tracking, as well as AOIs on the face. The interfeatural saccade ratio (the number of saccades jumping from feature to feature across the face divided by the total number of saccades; illustrated in the sketch below) reflected a multiple-feature processing approach, whereas feature gaze duration was used to index a single-feature processing approach. Schmid et al. [3] concluded that when participants were in a sad mood, these two processing styles influenced emotion recognition abilities. Specifically, the multiple-feature processing approach was positively related to emotion recognition only when participants were in a state of sadness, and the single-feature processing approach was negatively related to emotion recognition only when participants were in a state of sadness. There were, however, no findings for happy mood. Furthermore, emotion recognition accuracy was enhanced when depressed individuals took a multiple-AOI approach. Finally, Wu et al. [4] further addressed the issue of facial features by measuring fixation duration in an emotional expression judgment task with participants who were depressed and not depressed. Their findings indicated that, although spending less time on AOIs, depressed individuals were equally as accurate as nondepressed individuals when judging emotional expressions. Of importance to the current study, time spent on the AOIs (the eyebrows, eyes, nose, and mouth) did not differ between the two groups (depressed and nondepressed). The nose AOI yielded the greatest fixation duration, followed by the eyes, the mouth, and then the eyebrows. The researchers noted this finding was likely due to the placement of a fixation cross in the middle of the screen on each trial, an issue eliminated in the current study. Furthermore, the current study also addressed noncore features of the face (those other than the eyebrows, eyes, nose, and mouth) as an AOI, to explore whether those features are utilized in emotion processing.
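To make the interfeatural saccade ratio concrete, the following minimal sketch shows one plausible way to compute it from a list of which facial feature was fixated at each successive step; the function name and data layout are illustrative assumptions of ours, not Schmid et al.'s [3] actual implementation.

```python
def interfeatural_saccade_ratio(fixated_features):
    """Saccades crossing between facial features divided by total saccades.

    `fixated_features` lists the feature fixated at each step, in viewing
    order. Each consecutive pair of fixations defines one saccade; a
    crossing is a saccade whose two endpoints lie on different features.
    """
    saccades = list(zip(fixated_features, fixated_features[1:]))
    if not saccades:
        return 0.0
    crossings = sum(a != b for a, b in saccades)
    return crossings / len(saccades)

# Two of the four saccades cross between features, so the ratio is 0.5,
# reflecting a moderately multiple-feature processing style.
print(interfeatural_saccade_ratio(["eyes", "eyes", "mouth", "nose", "nose"]))
```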
2. Eye-Tracking Metrics
As noted, eye-tracking technology offers a unique way to assess how individuals process emotional stimuli. Typical data processing in eye-tracking studies utilizes fixation metrics, which indicate initial interest in facial features but not attentional maintenance on those features, an important construct in emotion recognition [9]. As noted above, some studies focus on initial fixation [1, 2, 6, 8, 10]; another commonly used dependent variable is fixation duration [2, 4, 6, 7, 10–12]. Initial fixation and fixation duration seem to be the variables of choice, but visit duration may provide more information regarding attentional maintenance because it reflects whether an individual visits and revisits a particular feature of the face. Salvucci and Goldberg [13] explain that a visit has a starting point and an ending point, similar to a fixation. Fixations are specific points where the individual looks at a given moment in time; each individual eye-movement point serves as a fixation. Fixations, however, do not fully capture how AOIs are processed. A visit occurs when an individual fixates a point on the face, leaves that point for a different area of the face, and then returns to the area initially fixated. Visit data would better identify the significant areas of interest for emotions because they capture the areas used to gather more information, representing "a higher-level collection of fixations organized about visual targets and areas" ([13], p. 5).
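As a concrete illustration of the distinction drawn above, the sketch below derives per-AOI visit durations from an ordered fixation log; the (AOI, duration) tuple layout is a simplifying assumption of ours, not the Tobii software's actual export format.

```python
from itertools import groupby

def visit_durations(fixations):
    """Total visit duration per AOI from an ordered fixation log.

    `fixations` is a list of (aoi, duration_ms) tuples in viewing order.
    A visit is a run of consecutive fixations inside one AOI: leaving the
    AOI ends the visit, and returning later begins a new one. An AOI's
    visit duration sums the fixations across all of its visits.
    """
    totals = {}
    for aoi, run in groupby(fixations, key=lambda f: f[0]):
        totals[aoi] = totals.get(aoi, 0) + sum(d for _, d in run)
    return totals

# lower -> middle -> lower counts as two separate visits to "lower",
# which is exactly the revisiting that fixation counts alone obscure.
seq = [("lower", 200), ("lower", 150), ("middle", 180), ("lower", 220)]
print(visit_durations(seq))  # {'lower': 570, 'middle': 180}
```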
3. Areas of Interest in Emotion Recognition
In the emotion recognition literature, research has suggested distinct facial areas (AOIs) express each emotion in a variety of different ways and intensities. In a study conducted by Hasegawa and Unuma [14], participants were shown images which varied in eyebrow slant, eye opening, and mouth opening to depict anger and sadness. Anger was more easily recognized when the eyebrows were slanted, the mouth was open, and the eyes were closed. Sadness was perceived most often when the eyebrows were lowered and the eyes were more closed. The study by Hasegawa and Unuma [14] supports Sullivan's [15] claim that specific facial areas distinguish specific emotions. Eisenbarth and Alpers [16] also found individuals focused on specific facial AOIs for each emotion, suggesting different parts of the face distinguish specific emotions. For instance, the AOI which is ideal for recognizing happiness may not be the same AOI which is ideal for recognizing another emotion, such as anger. Ekman and Friesen [5] identified three AOIs which assist in emotion recognition: the upper, middle, and lower parts of the face. Research exploring AOIs should consider these three areas important for the recognition of facial expressions of emotion [5]. The upper AOI includes the eyebrows and forehead; the middle AOI consists of the eyes and cheekbones; and the lower AOI encompasses the nose, mouth, and chin (Ekman & Friesen [5], as cited in [15]). Previous research indicated depressive symptomatology impacts how individuals attend to and process facial expressions, thus consideration of specific AOIs when processing emotions warrants further exploration. Wu et al. [4] and Schmid et al. [3] are the only researchers who addressed the AOI regions, but as mentioned earlier, they did not definitively evaluate the role of particular parts of the face in an individual's interpretation of facial expressions of emotion.
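For analysis purposes, the upper, middle, lower, and noncore regions described above can be encoded as a simple mapping from AOI to facial features. The sketch below is a hypothetical encoding of ours, combining Ekman and Friesen's [5] three areas with the noncore region examined in the current study; it is not a definition taken from either source.

```python
# Hypothetical AOI-to-feature encoding for scoring eye-tracking data.
AOI_FEATURES = {
    "upper":   ("eyebrows", "forehead"),
    "middle":  ("eyes", "cheekbones"),
    "lower":   ("nose", "mouth", "chin"),
    "noncore": ("hair", "ears", "neck"),  # extraneous features outside the core AOIs
}

# Inverted lookup: which AOI a fixated feature belongs to.
FEATURE_TO_AOI = {feat: aoi for aoi, feats in AOI_FEATURES.items() for feat in feats}
print(FEATURE_TO_AOI["eyes"])  # middle
```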
In summary, evidence suggests depressed individuals may be drawn to, and attend longer to, depressive images (either scenes or faces) when asked to view different images. However, in real-world social interactions, individuals are not choosing among facial images but rather processing a single facial expression to which a response may be necessary. Our limited knowledge of the impact of AOIs on facial emotional processing highlights differences according to various emotions as well as depressive symptoms, but further exploration is necessary. The emotion recognition literature has suggested particular areas of the face are unique to particular emotions; hence, focusing on those areas may enhance processing abilities, and thus it makes intuitive sense to explore how these areas are perceptually and cognitively processed. This process, we argue, is most effectively measured using a less utilized eye-tracking metric, namely visit duration. By definition, visit duration comprises all fixations within a given AOI for a particular visit. Arguably, visit duration indicates the amount of time an individual spends visiting and returning to an AOI, providing detailed information about how an individual is actively interpreting an emotional expression. The present study expanded on previous literature by further exploring how features of the face (AOIs) may influence emotion processing as a function of depressive symptomatology. We hypothesized individuals with nonclinical levels of depression would utilize AOIs differently than nondepressed individuals and that this difference would be impacted by the emotion portrayed.
4. Method
4.1. Participants. Fifty-three college-aged students (40 women, 13 men; M_age = 20.75; 72% White) were recruited from a small, liberal arts university in the Mid-Atlantic region through the SONA psychology research participant system.
4.2. Procedure and Materials. The procedures were approved by the university's Institutional Review Board. Participants were instructed they would be completing an emotion recognition task while having their eye movements recorded with an eye tracker. Following informed consent and demographic collection, participants were positioned in front of a 24-inch monitor with the Tobii X3-120 eye-tracking system (Tobii Technology, Danderyd, Sweden) fixed to the bottom of the monitor. The Tobii X3-120 eye tracker samples eye movements at 120 Hz using corneal reflection techniques. A standard 9-point calibration procedure was used, with participants seated 67 cm from the computer screen. After calibration was completed, participants were shown the emotion label key presses they would use to judge each face. Participants were instructed to judge and label the emotion they thought best represented the emotion present on the displayed face. The label response paradigm involved a keypress, with −2 representing anger, −1 fear, 0 happiness, 1 neutral, and 2 sadness. Four faces were employed for labeling practice. Each face was presented for a total of 4 seconds. Before each face was presented, a fixation cross was shown, in different positions, to direct gaze. The order in which the faces and fixation cross positions were presented was pseudorandomized. Once participants finished viewing and rating the faces, they were asked to complete the Beck Depression Inventory (BDI), a measure of the severity of depression over a two-week period in which participants respond to 21 items on a 4-point severity scale [17]. This widely used assessment of depression symptoms has reported high reliability, α = 0.93 [17], high content validity [18], and moderate to strong convergent validity (ranging from 0.58 to 0.79; Richter et al. [18]). None of our participants met the clinical criteria for depression, but we were able to create a median split, with half of our participants reporting little to no depressive symptoms (a score of less than or equal to six on the BDI).
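As an illustration of the response coding and grouping just described, here is a minimal sketch; the helper names are ours, and the BDI totals in the example are invented for demonstration only.

```python
import pandas as pd

# Keypress-to-emotion coding used in the labeling task.
KEY_TO_EMOTION = {-2: "anger", -1: "fear", 0: "happiness", 1: "neutral", 2: "sadness"}

def median_split(bdi_totals):
    """Assign participants to low/high depressive-symptom groups.

    Per the text, scores at or below the sample median (here, six)
    form the low-symptom group; `bdi_totals` is any iterable of BDI sums.
    """
    s = pd.Series(list(bdi_totals))
    return s.le(s.median()).map({True: "BDI low", False: "BDI high"})

print(median_split([2, 5, 6, 8, 11]).tolist())
# ['BDI low', 'BDI low', 'BDI low', 'BDI high', 'BDI high']
```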
4.3. Facial Expression Stimuli. Eighty facial expression images of five emotions (happiness, sadness, anger, fear, and neutral) were taken from the Child Affective Facial Expression (CAFE) set, a validated set of child facial expressions [19, 20], and the NimStim set of facial expressions, a valid and reliable set of adult facial expressions [21]. Images were equally distributed across sex, age of face, and emotion of face. Images were selected for the highest accuracy ratings within each emotion, and all were free from extraneous factors such as facial scars, blemishes, or distinctive hair. All images were presented on a solid background, with a white drape covering anything (such as clothing) other than the face.
5. Results
As we argued earlier, the particular eye-tracking metric analyzed is critical when exploring visual processing during an emotion recognition task. The most commonly used metrics for eye-tracking data are fixation count and fixation duration [22], but we have argued visit duration provides more meaningful data for emotional processing. Primary interest is marked by fixation metrics, which provide information regarding initial processing. Revisiting a particular AOI indicates its usefulness or perceived importance for emotion processing, which is represented by the Tobii software variable of visit duration, not fixation duration [9].
A 2 (depression) × 2 (age of face) × 5 (emotion) × 4 (AOI) mixed-model ANOVA was conducted on visit duration. Several main effects and interactions were significant; the results of these analyses are discussed separately in the sections below. All significant findings are reported, but only those relevant to our hypotheses are discussed in terms of follow-up analyses, tables, and/or figures. For each significant effect, appropriate follow-up analyses (main effects: least significant difference (LSD) tests; interaction effects: pairwise t-tests with Bonferroni corrections) were conducted, with the findings discussed below and presented in tables and figures. It is important to note that the main effects and two-way interactions are overshadowed by the three-way interaction discussed below; we mention their significance and follow-up analyses here but conduct further exploratory analyses of the three-way interaction only.

A main effect for emotion was found, F(4, 204) = 12.184, p < 0.05. Pairwise comparisons using the LSD procedure revealed the greatest time was spent on fearful faces (M = 5.543, SE = 0.195), followed by anger (M = 5.389), neutral (M = 5.276, SE = 0.188), happiness (M = 5.182, SE = 0.187), and lastly sadness (M = 5.123, SE = 0.187). Second, a main effect for AOI was significant, F(3, 153) = 98.176, p < 0.05. Pairwise comparisons of the means using the LSD procedure indicated the greatest amount of time was spent on the lower portion of the face (M = 11.010, SE = 0.543), followed by the middle (M = 6.256, SE = 0.514), and lastly the noncore (M = 2.338, SE = 0.303) and upper (M = 1.605, SE = 0.235) areas. No other main effects were significant.

An age of face and emotion interaction was significant, F(4, 204) = 9.015, p < 0.05. With adult faces, fear (M = 5.854) yielded significantly greater visit duration than anger (M = 5.439), t(52) = 4.230; neutral (M = 5.322), t(52) = 4.691; sadness (M = 5.268, SE = 0.236), t(52) = 6.523; and happiness (M = 5.124, SE = 0.226), t(52) = 6.883. These four emotions did not differ significantly from one another, with one exception: anger compared to happiness, t(52) = 3.430. For child faces, only two differences emerged as significant: anger (M = 5.360) compared to sadness (M = 4.992, SE = 0.261), t(52) = 4.406, and neutral (M = 5.255, SE = 0.274) compared to sadness, t(52) = 3.411.

An age of face and AOI interaction approached significance, F(3, 153) = 2.447, p = 0.066, indicating, for both adult and child faces, greater visit duration for the lower and middle AOIs than for the noncore and upper AOIs. An emotion and AOI interaction was significant, F(12, 612) = 19.165, p < 0.05. Follow-up pairwise comparisons among the AOIs for each emotion used a Bonferroni-corrected alpha of 0.00167 (0.05 divided by 30: six AOI pairs for each of five emotions) and indicated several significant differences (see Table 1). For anger, visit duration differed significantly between all AOI pairs except noncore versus upper. For fear, significant differences emerged for all AOI pairs. For happiness, visit duration differed significantly for all AOI pairs except noncore versus upper. For neutral, significant differences were indicated for all AOI pairs. Finally, for sadness, significant differences emerged for all AOI pairs except noncore versus upper. No other two-way interactions were significant.
Table 1: Follow-up t-tests for the two-way interaction between emotion and AOI. Entries are t values for each pairwise AOI comparison; comparisons are significant at the Bonferroni-corrected alpha of 0.00167.

Emotion     Lower vs. Middle   Lower vs. Noncore   Lower vs. Upper   Middle vs. Noncore   Middle vs. Upper   Noncore vs. Upper
Anger       8.655              14.932              11.326            4.789                4.831              −0.398
Fear        5.418              14.075              16.269            6.311                11.894             4.012
Happiness   4.454              15.135              14.271            6.317                8.200              0.467
Neutral     3.464              11.090              15.288            5.366                10.324             3.940
Sadness     5.623              13.264              10.651            5.016                6.287              0.816
Figure 1: Three-way interaction among depressive symptomatology, emotion, and area of interest. (Two panels, BDI low and BDI high, plot visit duration for the lower, middle, noncore, and upper AOIs across the five emotions.)

Table 2: Follow-up t-tests for the three-way interaction. Entries are t values for each pairwise AOI comparison within each BDI group; comparisons are significant at the Bonferroni-corrected alpha of 0.0008.

(a) BDI low

Emotion     Lower vs. Middle   Lower vs. Noncore   Lower vs. Upper   Middle vs. Noncore   Middle vs. Upper   Noncore vs. Upper
Anger       7.932              14.355              9.630             3.439                3.672              −0.230
Fear        3.197              10.622              14.406            4.460                9.873              3.667
Happiness   2.678              11.743              10.466            4.321                6.784              0.854
Neutral     1.382              7.809               14.149            4.142                8.734              3.128
Sadness     3.890              11.543              8.923             3.253                4.365              0.803

(b) BDI high

Emotion     Lower vs. Middle   Lower vs. Noncore   Lower vs. Upper   Middle vs. Noncore   Middle vs. Upper   Noncore vs. Upper
Anger       5.007              8.322               6.698             3.290                3.075              −0.334
Fear        4.591              10.119              9.899             4.531                7.060              1.878
Happiness   3.664              9.888               9.549             4.742                4.739              −0.456
Neutral     3.483              8.341               9.364             3.383                5.943              2.367
Sadness     4.096              8.441               6.631             4.014                4.572              0.035
Consistent with our hypothesis, and of utmost importance for our study, the three-way interaction among emotion, AOI, and BDI was significant, F(12, 612) = 2.011, p = 0.021 (see Figure 1). For each BDI group, low and high, follow-up pairwise comparisons among the AOIs for each emotion used a Bonferroni-corrected alpha of 0.0008 (0.05 divided by 60: six AOI pairs for each of five emotions in each of two groups) and indicated several significant differences (Tables 2(a) and 2(b)). In addition, an age of face, emotion, and AOI interaction was significant, F(12, 612) = 15.000, p < 0.05, and an age of face, AOI, and BDI interaction approached significance, F(3, 153) = 2.248, p = 0.085.
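For readers wishing to reproduce this style of follow-up analysis, the sketch below runs all pairwise paired t-tests among the four AOIs for each emotion against a Bonferroni-corrected threshold (0.05/30 ≈ 0.00167 for the two-way follow-ups; the same logic over two BDI groups gives the 0.0008 used for Tables 2(a) and 2(b)). The data here are randomly generated stand-ins, not our actual visit durations.

```python
from itertools import combinations

import numpy as np
from scipy import stats

aois = ["lower", "middle", "noncore", "upper"]
emotions = ["anger", "fear", "happiness", "neutral", "sadness"]

# Stand-in data: one mean visit duration per participant (n = 53)
# for every emotion-by-AOI cell, in place of the real eye-tracking data.
rng = np.random.default_rng(0)
visit = {e: {a: rng.gamma(2.0, 2.5, size=53) for a in aois} for e in emotions}

pairs = list(combinations(aois, 2))
threshold = 0.05 / (len(emotions) * len(pairs))  # 0.05 / 30 ≈ 0.00167

for emo in emotions:
    for a, b in pairs:
        t, p = stats.ttest_rel(visit[emo][a], visit[emo][b])
        mark = "*" if p < threshold else ""
        print(f"{emo:9s} {a:>7s} vs. {b:<7s} t(52) = {t:6.3f} {mark}")
```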
5.1. BDI Low. For anger, the lower AOI visit duration was significantly greater than the remaining three AOIs: middle, noncore, and upper. No other differences emerged as significant for anger. For fear, happiness, and neutral, similar patterns emerged, with significant differences in visit duration for all AOI pairs except lower versus middle and noncore versus upper. For sadness, the lower AOI did not differ from the middle AOI, the middle AOI did not differ from the noncore AOI, and the noncore AOI did not differ from the upper AOI; the lower AOI had significantly greater visit duration than the noncore and upper AOIs, and the middle AOI had significantly greater visit duration than the upper AOI.
5.2. BDI High. For anger, the lower AOI visit duration was significantly greater than the remaining three AOIs: middle, noncore, and upper. No other difference emerged as significant for anger. For fear, the lower AOI had significantly greater visit duration than the middle, noncore, and upper AOIs; the middle AOI had significantly greater visit duration than the noncore and upper AOIs; and the noncore and upper AOIs did not significantly differ. For happiness, significant differences in visit duration resulted for all AOI pairs except lower versus middle and noncore versus upper. For neutral, significant differences in visit duration emerged between the lower AOI and the noncore and upper AOIs, as well as between the middle AOI and the upper AOI; all remaining comparisons for neutral were not significant. For sadness, the lower AOI visit duration was significantly greater than the remaining three AOIs (middle, noncore, and upper), and the middle AOI yielded significantly greater visit duration than the noncore and upper AOIs, which did not differ from each other.
6. Discussion
The current study offers a new perspective for assessing emotion recognition abilities in individuals with depressive symptomatology by focusing on visit metrics, rather than fixation metrics, from an eye-tracking protocol. We argue visit data constitute a more representative eye-tracking metric for emotion recognition because they capture both initial and follow-up interest. Our findings indicate individuals with nonclinical levels of depressive symptoms process emotional expressions very similarly to individuals with little to no depressive symptoms, with one noteworthy exception. In general, individuals in our study visited the "T" region, the lower and middle AOIs, more often than the upper and noncore areas, but the distinction between the lower and middle AOIs appears for all emotions except happiness when individuals are higher in depressive symptoms. In our study, facial expressions were presented as color photographs rather than black and white, and as such are more analogous to real-world social interactions, thus providing more definitive information regarding emotion recognition processing.

Generally speaking, our findings support the importance of the middle and lower AOIs when processing the emotions of anger, fear, happiness, neutral, and sadness, and also highlight the importance of using multiple facial features to process emotional expressions, as noted in previous literature [3, 4]. This use of multiple features, specifically the lower and middle AOIs, reveals a particular pattern we refer to as the "T" (see Figure 1), which appears to be of utmost importance when processing emotions. In most cases, these patterns showed very few transitions outside the "T," supporting our findings of shorter visit durations for the noncore and upper AOIs. Thus, when making emotion judgments, participants focus on the eyes, nose, and mouth (the lower and middle AOIs), not the eyebrows, forehead, or extraneous features. When examining the patterns, no significant distinctions between the upper AOI and the noncore AOI (extraneous features such as hair) were noted, indicating extraneous features do not attract greater visit duration. A unique contribution of the current study is the separation of the eye/middle AOI from the eyebrow/upper AOI, suggesting it is the eyes which contain more emotional information.

The general finding of the "T" pattern is further differentiated when considering depressive symptomatology in a nonclinical population. Specifically, for individuals who have little to no depressive symptoms, eye-tracking patterns for fear, happiness, and neutral follow the aforementioned "T" pattern, with lower and middle AOIs showing similarly high visit durations and upper and noncore AOIs showing similarly low visit durations. Individuals who have higher, yet nonclinical, levels of depressive symptoms display the "T" pattern for happiness only. Thus, all participants, regardless of depressive symptomatology, show the "T" pattern when processing happiness. Perhaps the uniqueness of happiness explains this phenomenon. First, happiness is uniquely characterized by the smile, and after processing the smile, the focus can shift to the next important feature, the eyes, with continued transitioning between those two AOIs. Second, previous literature acknowledges happiness is rather quickly and usually accurately recognized, thus group differences may not be detected for this emotion. In addition, happiness is the only pleasant emotion, so more general exploration of important emotional features is warranted. For fear and neutral, nondepressed individuals demonstrate the "T" pattern as well, supporting a more generalized exploration of important facial features for these two emotions. Fear and neutral are more complex expressions than anger and sadness, so one may need to disperse attention across multiple features to garner the information necessary to process the emotion. In anger and sadness, by contrast, the lower AOI is rather unique and dramatic: sadness displays an extremely pouty mouth, and anger displays exposed teeth, so focus is less likely toward the middle AOI. When depressive symptomatology is higher, however, individuals focus on the lower AOI more than the middle AOI for fear, neutral, anger, and sadness. Perhaps depressed individuals draw their gaze to the singular feature of the mouth [3] for more unpleasant emotions to avoid emotional displays which may match depressive symptoms, ignoring the importance of the middle AOI. Thus, these individuals may miss important emotional cues, hindering social interactions and perhaps exacerbating depression [23]. Future studies should expand on this finding in samples with clinical levels of depression to explore this interpretation. In particular, our findings can provide the foundation for therapeutic interventions designed to assist populations with mild levels of depression. Individuals with emotional processing deficits may benefit from understanding how to utilize the eyes and mouth regions of the face.
It must be noted the current study evaluated eye-tracking patterns without regard to accuracy; thus, we cannot definitively state whether these processing strategies support accurate recognition. The stimuli chosen for this study had the highest accuracy ratings in the stimulus sets, which may limit our ability to explore how accuracy is impacted by these patterns. Future studies should not only explore the relationship between areas of interest and accuracy of emotional expression judgments, but also utilize stimuli with lower accuracy ratings to eliminate any potential ceiling effects. Given the universality of emotional expression recognition and the importance of the lower and middle AOIs when visually processing emotional images, we would expect the same pattern of results with stimuli with low separability. Furthermore, each image was presented for four seconds, so comparisons of gaze time across particular emotions are not possible. As such, we cannot address previous literature reporting longer gaze periods for sad images in individuals with depressive symptoms. Future studies could address this issue by removing the set time of image exposure.
The interaction among age of face, AOI, and BDI group only approached significance, but the trend is worth noting and is particularly interesting given the own-age/other-age bias emotion recognition literature (e.g., [16]). When young adults explore child emotional expression images, a more exploratory technique of using the lower and middle AOIs seems indicative of an other-age bias. This pattern is also noted across depressive symptomatology, but is not statistically meaningful. When judging the emotional expressions of children, college-aged individuals with little to no depressive symptoms utilize the features of the "T" equally to gather critical information to understand the emotion. College-aged individuals with nonclinical levels of depressive symptoms, however, utilize the "T" pattern but with a clear preference for the lower AOI, for reasons noted above. Further exploration of this other-age bias with child faces and depressive symptoms appears to be a fruitful area of research, worthy of future study.
In conclusion, we have presented findings which clarify the important areas of interest (facial features) when processing emotional expressions, reflecting previous literature describing the roles of various facial features (e.g., [5]). This clarification aligns with a general "T" pattern indicating the importance of the eyes and mouth areas. Further differentiation of these important areas is noted, however, when we consider depressive symptomatology in a nonclinical sample.
Data Availability
The eye-tracking and demographic data used to support the findings of this study are available from the corresponding author upon request.
Conflicts of Interest
The authors declare that they have no conflicts of interest.
References
[1] C. R. Sears, K. R. Newman, J. D. Ference, and C. L. Thomas, "Attention to emotional images in previously depressed individuals: an eye-tracking study," Cognitive Therapy and Research, vol. 35, pp. 517–528, 2011.
[2] G. R. Matthews and J. R. Antes, "Visual attention and depression: cognitive biases in the eye fixations of the dysphoric and the non-depressed," Cognitive Therapy and Research, vol. 16, no. 3, pp. 359–371, 1992.
[3] P. C. Schmid, M. S. Mast, D. Bombari, F. W. Mast, and J. S. Lobmaier, "How mood states affect information processing during facial emotion recognition: an eye tracking study," Swiss Journal of Psychology, vol. 70, no. 4, pp. 223–231, 2011.
[4] L. Wu, J. Pu, J. B. Allen, and P. Pauli, "Recognition of facial expressions in individuals with elevated levels of depressive symptoms: an eye-movement study," Depression Research and Treatment, vol. 2012, pp. 1–7, 2012.
[5] P. Ekman and W. Friesen, Unmasking the Face, Prentice Hall, Englewood Cliffs, NJ, 1975.
[6] A. Duque and C. Vazquez, "Double attention bias for positive and negative emotional faces in clinical depression: evidence from an eye-tracking study," Journal of Behavior Therapy and Experimental Psychiatry, vol. 46, pp. 107–114, 2015.
[7] X. Caseras, M. Garner, B. P. Bradley, and K. Mogg, "Biases in visual orienting to negative and positive scenes in dysphoria: an eye movement study," Journal of Abnormal Psychology, vol. 116, no. 3, pp. 491–497, 2007.
[8] K. Mogg, N. Millar, and B. P. Bradley, "Biases in eye movements to threatening facial expressions in generalized anxiety disorder and depressive disorder," Journal of Abnormal Psychology, vol. 109, no. 4, pp. 695–704, 2000.
[9] S. Kim, Z. Dong, H. Xian, B. Upatising, and J. Yi, "Does an eye tracker tell the truth about visualizations? Findings while investigating visualizations for decision making," IEEE Transactions on Visualization and Computer Graphics, vol. 18, no. 12, pp. 2421–2430, 2012.
[10] C. R. Sears, C. L. Thomas, J. M. LeHuquet, and J. C. S. Johnson, "Attentional biases in dysphoria: an eye-tracking study of the allocation and disengagement of attention," Cognition and Emotion, vol. 24, no. 8, pp. 1349–1368, 2010.
[11] J. L. Kellough, C. G. Beevers, A. J. Ellis, and T. T. Wells, "Time course of selective attention in clinically depressed young adults: an eye tracking study," Behaviour Research and Therapy, vol. 46, no. 11, pp. 1238–1243, 2008.
[12] T. T. Wells, C. G. Beevers, A. E. Robison, and A. J. Ellis, "Gaze behavior predicts memory bias for angry facial expressions in stable dysphoria," Emotion, vol. 10, no. 6, pp. 894–902, 2010.
[13] D. D. Salvucci and J. H. Goldberg, "Identifying fixations and saccades in eye-tracking protocols," in Proceedings of the Eye Tracking Research and Applications Symposium, pp. 71–78, ACM, Palm Beach Gardens, FL, USA, 2000.
[14] H. Hasegawa and H. Unuma, "Facial features in perceived intensity of schematic facial expressions," Perceptual and Motor Skills, vol. 110, no. 1, pp. 129–149, 2010.
[15] L. A. Sullivan, Recognition of Facial Expressions of Emotion by Children and Adults (Doctoral dissertation, The University of Alabama at Birmingham, 1996), Dissertation Abstracts International, ProQuest, Ann Arbor, MI, USA, p. 7253, 1997.
[16] H. Eisenbarth and G. W. Alpers, "Happy mouth and sad eyes: scanning emotional facial expressions," Emotion, vol. 11, no. 4, pp. 860–865, 2011.
[17] A. T. Beck, R. A. Steer, and G. K. Brown, Manual for the Beck Depression Inventory-II, Psychological Corporation, San Antonio, TX, 1996.
[18] P. Richter, J. Werner, A. Heerlein, A. Kraus, and H. Sauer, "On the validity of the Beck Depression Inventory," Psychopathology, vol. 31, pp. 160–168, 1998.
[19] V. LoBue and C. Thrasher, "The Child Affective Facial Expression (CAFE) set: validity and reliability from untrained adults," Frontiers in Psychology, vol. 5, pp. 1–8, 2015.
[20] V. LoBue, L. Baker, and C. Thrasher, "Through the eyes of a child: preschoolers' identification of emotional expressions from the Child Affective Facial Expression (CAFE) set," Cognition & Emotion, vol. 32, no. 5, pp. 1122–1130, 2018.
[21] N. Tottenham, J. W. Tanaka, A. C. Leon et al., "The NimStim set of facial expressions: judgments from untrained research participants," Psychiatry Research, vol. 168, no. 3, pp. 242–249, 2009.
[22] K. A. Pelphrey, N. J. Sasson, J. S. Reznick, G. Paul, B. D. Goldman, and J. Piven, "Visual scanning of faces in autism," Journal of Autism and Developmental Disorders, vol. 32, no. 4, pp. 249–261, 2002.
[23] A. K. Wittenborn, H. Rahmandad, J. Rick, and N. Hosseinichimeh, "Depression as a systemic syndrome: mapping the feedback loops of major depressive disorder," Psychological Medicine, vol. 46, no. 3, pp. 551–562, 2016.