Psichologija ISSN 1392-0359 eISSN 2345-0061
2022, vol. 67, pp. 47–69 DOI: https://doi.org/10.15388/Psichol.2022.59

The Unique Cost of Human Eye Gaze in Cognitive Control: Being Human-Specific and Body-Related?

Kexin Li
South China Normal University, China
359550131@qq.com

Aitao Lu
South China Normal University, China
lu_yoyo@yeah.net

Ruchen Deng
South China Normal University, China
276463294@qq.com

Hui Yi
South China Normal University, China
912909203@qq.com

Abstract. This study investigated the eye-gaze cost in cognitive control and whether it is human-specific and body-related. In Experiment 1, we tested whether human eye gaze incurs a cost in cognitive control and examined the role of emotion in that cost. The Stroop effect was larger in the eye-gaze condition than in the vertical-grating condition, and was comparable across positive, negative, and neutral trials. In Experiment 2, we examined whether the eye-gaze cost in cognitive control is limited to human eyes. No enlarged Stroop effect was found in the feline eye-gaze condition, nor any modulating role of emotion. In Experiment 3, we explored whether the mouth could elicit a similar cost. The Stroop effect was not significantly larger in the mouth condition than in the vertical-grating condition, nor did it differ across positive, negative, and neutral conditions. The results suggest that: (1) there is a robust cost of eye gaze in cognitive control; (2) this cost is specific to human eyes and does not extend to animal eyes; (3) the cost is elicited by human eyes but not by the human mouth. This study supports the notion that the presentation of social cues, such as human eyes, can influence attentional processing, and provides preliminary evidence that human eyes play an important role in cognitive processing.

Keywords: Stroop task; Cognitive control; Eye gaze effect; Human eyes.

Received: 31/07/2022. Accepted: 10/11/2022.
Copyright © 2022 Kexin Li, Aitao Lu, Ruchen Deng, Hui Yi. Published by Vilnius University Press. This is an Open Access article distributed under the terms of the Creative Commons Attribution Licence (CC BY), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Introduction

Observing another individual’s focus of attention has been shown to play a significant role in social interaction and communication (Kampe et al., 2003), as the visual information in the eyes is an important source for understanding the communicative goals and emotional states of others (Senju & Johnson, 2009; Bouw et al., 2022). Previous studies have shown a well-established bias to detect faces with direct gaze (Doi & Shinohara, 2013). For example, observers paid more attention to faces with direct gaze than to those with averted gaze, which resulted in more responses to the expressions of an individual with a direct gaze (Kesner et al., 2018). This bias is also thought to reflect an advantage for innately recognizing predator cues in the environment (Brown & Chivers, 2005; Kobayashi & Hashiya, 2011). Due to their unique physical, photometric, and motion characteristics, human eyes may also have evolved to facilitate interactions with conspecifics (Bradbury & Vehrencamp, 2012).

Studies on brain imaging have suggested that eye contact can activate brain areas associated with the social brain network (Adolphs, 2008). There is evidence from developmental studies that even newborns preferred looking at a face with a direct gaze rather than one with an averted gaze (Farroni et al., 2002). According to the “attentional capture” account, direct gaze is a salient social cue due to its importance in communication and nonverbal social interaction (Böckler et al., 2014; Senju & Hasegawa, 2005). During social interactions, humans not only exchange communicative intentions with one another via eye contact, but also experience being watched by another (e.g., Engelmann et al., 2012; Kano et al., 2022). Consistently, other studies have found that direct gaze automatically attracts visual attention (Senju & Johnson, 2009), and is even hard to inhibit (Thompson et al., 2019; Wang et al., 2022).

The fact that humans are excellent at detecting direct gaze and following the direction of another’s gaze also reflects an essential function of selective attention, namely, the preferential processing of high-priority stimuli in the environment (Rothkirch et al., 2015). Thus, what happens when both processes coincide and eye gaze becomes the focus of attention? Because eye gaze plays a significant role in survival, reproduction, and procreation, explicit attention effects might be amplified for the processing of such stimuli, which could result in facilitation of face-focused processing but interference with other concurrent cognitive tasks (Burra & Kerzel, 2021). Studies have shown that eye contact with another human is distracting in a number of visual working memory-based tasks, including simple visual target detection (Senju & Hasegawa, 2005) and spatial cognition (Buchanan et al., 2014). It was found that being watched may occupy cognitive resources, thereby increasing risk-seeking behaviors and Stroop interference (Conty et al., 2010). Thus, eye gaze could incur costs for cognitive processing.

Additionally, due to its important role in evolution and individual survival, emotional information is also prioritized to a large extent (Öhman & Mineka, 2001). For example, the selection of emotionally salient environmental stimuli is crucial to flexible and adaptive behavior (Yamaguchi & Onoda, 2012). It has been demonstrated that emotional stimuli efficiently orient attention (e.g., Vuilleumier, 2005), and that emotional stimuli are detected faster in visual search tasks (e.g., Eastwood et al., 2001). Moreover, when presented subliminally, emotional faces reach awareness faster and more efficiently than neutral faces (Amting et al., 2010). Accordingly, neural evidence has demonstrated that, as a consequence of the activation of subcortical neural structures, emotional events can attract attention rapidly and automatically (Morris et al., 1999; Mendez-Bertolo et al., 2016). This explains why human emotional expressions play a vital role in communication.

There are universally identifiable emotions that can be displayed through eye, mouth, and brow movements (Ekman et al., 1988; Kheirkhah et al., 2020). It was found that categorizing emotions from the eye region involves an automatic process triggered in a bottom-up fashion by the visibility of the eye whites (Whalen et al., 2004). All humans, including infants, use the eyes to gauge the emotional state of their partner when interacting (e.g., Farroni et al., 2002). Previous studies have demonstrated that emotion and eye gaze act in the same way to capture attention (Niedźwiecka, 2020). In addition, to our knowledge, one previous study has investigated the cost of eye gaze in cognitive control (Conty et al., 2010). By evaluating the distracting effect of eye contact on concurrent conflict processing in a Stroop task, Conty et al. (2010) found that direct-gaze eyes alone could automatically capture cognitive resources and thus disrupt cognitive control. Thus, direct eye gaze, emotion, and cognitive control may share the same mechanism. Cognitive control is one of the most important cognitive functions humans have for adjusting to their environments (Hasegawa & Takahashi, 2014), so it is particularly important to examine its various influencing factors and how they interact to affect cognitive control. Previous studies have independently examined the effects of eye gaze and emotion on cognitive control (e.g., Conty et al., 2010; Niedźwiecka, 2020), but little is known about how the combination of eye gaze and emotion relates to cognitive control. Specifically, though emotion can be expressed by eye gaze, whether and how emotion affects the cost of eye gaze in cognitive inhibition is unclear.
Therefore, the first aim of this study is to investigate whether the cost of eye gaze in Stroop effect (i.e., the Stroop effect in eye-gaze condition is larger than that in non-eye-gaze condition) varies across different emotional eye conditions.

Another limitation of the extant research is that it is not yet known whether the cost of eye gaze in cognitive control is specific to humans; there is a dearth of literature on the nature of the eye-gaze cost in cognitive control. A crucial extension of this question is whether such an eye-gaze cost exists for animal eyes. Eye contact also occurs during most animal interactions, indicating either threat or interest (Baron-Cohen, 1997; Perrett & Emery, 1994). Neural studies using single-cell recordings in primates have shown that the gaze direction of other animals is encoded by a dedicated neural circuitry (Perrett et al., 1985). Given the importance of eye contact in nonhuman social interactions and the fact that the cat is a distinct mammalian species whose genome is relatively close to that of humans (Murphy et al., 1999), the second aim of this study is to examine whether a cost in the Stroop effect occurs when a cat’s direct eye gaze is presented simultaneously with the Stroop task.

Besides, although considerable research has been done on the cost of eye gaze, it remains unclear whether this cost is limited to eye gaze or extends to other body parts. The mouth is an important body part for basic life functions such as chewing, swallowing, breathing, and speaking. Additionally, the mouth region plays a crucial role in the perception of emotion in faces (Wegrzyn et al., 2017). Also, like the eyes, the mouth is part of the body representation and is involved in conscious body-related perception and cognition. Previous studies have shown that the initial stage of face processing encodes first-order relational information (two eyes above a nose, a nose above a mouth; Maurer et al., 2002). Although researchers have examined the early implicit processing of distinct facial features such as the mouth (Pesciarelli et al., 2016), the findings do not provide a detailed mechanism for its effect. Therefore, building on the second aim of this study, we further consider whether the effects of eye gaze extend to other body parts such as the mouth. That is, could the mouth affect the Stroop effect?

The Stroop color-naming task (Stroop, 1935) is widely used to study the mechanisms of cognitive control. Participants in the classic Stroop task are instructed to name the ink color of color words. In congruent trials, the meaning of the color word and the ink color are the same (e.g., the word “red” printed in red ink), while in incongruent trials they differ (e.g., the word “blue” printed in red ink). Compared with the congruent condition, people are slower to name the ink color in the incongruent condition. This increase in response time is known as the Stroop effect. Because cognitive control occurs when input stimuli elicit conflicting responses, Stroop tasks are often used in cognitive neuroscience and clinical psychology to support theories of cognitive control (Hu et al., 2012; Laurenson et al., 2015).
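The Stroop effect described above reduces to simple arithmetic over per-trial reaction times. As a minimal illustrative sketch (not the authors' analysis code; the trial records and field names are hypothetical):

```python
# Sketch: the Stroop effect is the mean incongruent RT minus the mean
# congruent RT. Trial dictionaries below are hypothetical examples.

def stroop_effect(trials):
    """Return mean incongruent RT minus mean congruent RT, in ms."""
    congruent = [t["rt"] for t in trials if t["congruent"]]
    incongruent = [t["rt"] for t in trials if not t["congruent"]]
    return sum(incongruent) / len(incongruent) - sum(congruent) / len(congruent)

trials = [
    {"congruent": True, "rt": 745},   # e.g., "red" printed in red ink
    {"congruent": True, "rt": 775},
    {"congruent": False, "rt": 812},  # e.g., "blue" printed in red ink
    {"congruent": False, "rt": 808},
]
print(stroop_effect(trials))  # prints 50.0
```

A positive value indicates slower responding under color–word conflict, which is the interference measure compared across conditions in the experiments below.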

The present study aims to replicate and extend the eye-gaze finding in three respects. First, we confirm the effect of eye gaze on the Stroop task and investigate the potential influence of emotion on this effect; the comparisons of the positive and negative gaze conditions with the neutral condition were intended to shed light on the effect of emotion on the eye-gaze cost in the Stroop effect. Second, we examine whether the eye-gaze effect is specific to humans and thus absent for animals, by testing whether the Stroop effect differs between the feline eye-gaze and vertical-grating conditions. Finally, we explore whether the cost effect is restricted to the eyes, that is, whether other body parts such as the mouth cannot elicit a cost effect, by comparing the Stroop effect between the mouth and vertical-grating conditions. Accordingly, this study consisted of three experiments examining the unique mechanism of eye gaze in cognitive control. Experiment 1 used vertical gratings and neutral faces as baselines to reveal the eye-gaze cost and to test whether emotion modulates it. Experiment 2 used feline eyes with direct gaze to investigate whether the eye-gaze effect is also present for animal eyes. In Experiment 3, based on the results of Experiments 1 and 2, we investigated whether the eye-gaze cost in cognitive control is merely a body-related effect by assessing the potential effect of the mouth in the Stroop task.

Experiment 1

Methods

Participants. Thirty-seven college students (mean age: 21.0±2.02 years; 15 males) took part in this experiment. All participants had normal or corrected-to-normal vision and were native Chinese speakers. They provided informed written consent and received payment for participation. The Institutional Review Board of the South China Normal University (Guangzhou, China) approved this study.

Stimuli. Four Chinese color words (i.e., 红 (red), 黄 (yellow), 蓝 (blue), and 绿 (green)) were used as stimuli in the Stroop task. The Stroop task consisted of congruent trials, in which the color words were presented in the ink color indicated by the word, and incongruent trials, in which the color words were presented in a nonmatching ink color. The four words, each 72×35 pixels in size, were displayed at the center of the screen.

Eighteen pictures of the eye region of different people (half male, half female) were selected from the Chinese Affective Face Picture System (CAFPS; Bai et al., 2005), with 6 eye pictures expressing positive emotion, 6 expressing neutral emotion, and 6 expressing negative emotion (see Figure 1). The eye pictures were 260×63 pixels in size. Eighteen additional subjects rated all the pictures on two criteria: (a) emotional valence, from extremely negative to extremely positive (1–7 scale; 1 = very strong negative emotion, 7 = very strong positive emotion); and (b) clarity, from very unclear to very clear (1–7 scale).

Figure 1
Examples of eye and grating stimuli in Experiment 1


The three types of pictures were balanced in subjective clarity ratings (means of 5.82, 5.73, and 5.35; p > .05). Additionally, positive eye pictures had a positive valence rating (6.22), negative eye pictures had a negative valence rating (2.07), and neutral eye pictures had a medium valence rating (3.97). The mean emotional valence of positive eye pictures was significantly higher than that of negative and neutral eye pictures, and negative eye pictures were rated lower than neutral ones (F(2,15)=443.905, p< .001).

Six vertical grating pictures, 260×63 pixels in size, were used as a baseline. They subtended horizontal and vertical visual angles of 6–7° and 1.6°, respectively. The eye and grating pictures were randomly paired with the color words and presented above them.

Procedure. Participants were seated 0.5 meters in front of the screen and were asked to keep their heads as still as possible throughout the experiment. E-Prime 2.0 was used for stimulus presentation and behavioral response collection (Psychology Software Tools, Pittsburgh, PA, USA). The experiment was divided into a practice stage and a formal experiment stage. The practice consisted of 16 trials to familiarize the participants with the procedure, and participants received feedback during these trials. Only participants whose accuracy exceeded 90% in the practice stage proceeded to the formal experiment.

Figure 2
The flow chart of Experiment 1


The formal experiment consisted of 4 blocks of 120 trials each. The procedure of a trial is illustrated in Figure 2. Each trial started with a white fixation cross (angular size 0.5°×0.5°) at the center of the screen for 500 ms, followed by the centered color word with a grating or eye picture directly above it. The color word and the grating or eye picture remained on the screen until the participant responded or 3000 ms had elapsed. After a 300 ms inter-trial interval, the next trial started. Participants were instructed to judge the color of the word as quickly and accurately as possible by pressing keys; the key mapping was counterbalanced across participants. Trials on which a participant made an incorrect response or no response within 3000 ms were recorded as errors. Participants received no feedback during the formal experiment stage, which lasted 15–20 minutes.

Results and Discussion

On the Stroop task, incorrect trials or trials that had reaction times > 1500 ms were excluded from the calculation of the mean reaction time scores (on average, 5.0% trials were excluded). The results of the percent accuracy, the correct mean reaction time (RT), and the Stroop effect are summarized in Table 1, Figures 3 and 4. Due to near-ceiling effects on accuracy on the Stroop task (mean performance = 95.0%), all subsequent analyses were carried out using reaction times (RTs) as the dependent measure.
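The exclusion rule above (incorrect responses and RTs over 1500 ms dropped before averaging) can be sketched as follows. This is an illustrative reconstruction, not the authors' code, and the field names are hypothetical:

```python
# Sketch of the preprocessing step: keep only correct trials with
# RT <= 1500 ms, then compute the mean RT and the excluded proportion.

def mean_correct_rt(trials, rt_cutoff=1500):
    """Return (mean RT of kept trials in ms, proportion of trials excluded)."""
    kept = [t["rt"] for t in trials if t["correct"] and t["rt"] <= rt_cutoff]
    excluded = 1 - len(kept) / len(trials)
    return sum(kept) / len(kept), excluded

trials = [
    {"correct": True, "rt": 700},
    {"correct": True, "rt": 800},
    {"correct": False, "rt": 650},   # error: excluded
    {"correct": True, "rt": 1600},   # too slow: excluded
]
mean_rt, prop_excluded = mean_correct_rt(trials)
print(mean_rt, prop_excluded)  # prints 750.0 0.5
```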

Table 1
Accuracy, correct mean RTs and Stroop effect for Stroop task in Experiment 1

Trial Type          Congruency     Accuracy (%)   RTs (ms)   Stroop effect (ms)
Positive gaze       Congruent      95.2 (0.8)     745 (14)   67 (10)
                    Incongruent    94.4 (0.6)     812 (15)
Neutral gaze        Congruent      96.1 (0.7)     758 (15)   64 (11)
                    Incongruent    94.6 (0.5)     822 (17)
Negative gaze       Congruent      96.0 (0.7)     733 (15)   64 (10)
                    Incongruent    94.2 (0.6)     797 (15)
Vertical grating    Congruent      94.7 (0.8)     760 (18)   39 (12)
                    Incongruent    94.5 (0.5)     799 (14)

Note. (1) Values in parentheses are standard errors of the mean (SEM). (2) The Stroop effect was computed as the difference in mean response time between incongruent and congruent trials (i.e., RT in incongruent trials – RT in congruent trials).

Figure 3
Correct mean RTs for Stroop task in Experiment 1


Figure 4
Accuracy for Stroop task in Experiment 1


The effect of gaze on the Stroop effect. To confirm the cost of eye gaze in the Stroop effect, the RT data in the neutral-gaze and vertical-grating conditions were submitted to a 2 (Trial Type: Neutral gaze vs. Vertical grating) × 2 (Color–word Congruency: Congruent vs. Incongruent) repeated-measures analysis of variance (ANOVA) with both factors within-subject. The results showed a significant main effect of congruency, F(1,36)=31.83, p< .001, η2 = .47, with longer RTs in incongruent trials (810 ms) than in congruent trials (759 ms), revealing a classical Stroop effect. More importantly, the interaction between Trial Type and Color–word Congruency was marginally significant, F(2,72)=3.09, p= .087, η2 = .079. Simple-effect tests showed that RTs in incongruent trials were significantly longer than those in congruent trials in both the neutral-gaze condition (822 ms vs. 758 ms; F(1,36)=32.79, p< .001, η2 = .48) and the vertical-grating condition (799 ms vs. 760 ms; F(1,36)=11.29, p= .002, η2 = .24). Moreover, the Stroop effect was greater in the neutral-gaze condition (64 ms) than in the vertical-grating condition (39 ms). No other effect was observed (F<1).
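One intuitive way to read a Trial Type × Congruency interaction in a 2×2 within-subject design is as a per-participant difference between the two Stroop effects. Below is a sketch under assumed data (per-participant condition means in ms, with hypothetical field names), not the authors' analysis code:

```python
# Sketch: for each participant, compute (gaze Stroop effect) minus
# (grating Stroop effect). A one-sample test on these scores mirrors
# the 2x2 within-subject interaction test.

def interaction_scores(subjects):
    """subjects: list of dicts holding mean RTs (ms) per condition."""
    return [
        (s["gaze_incong"] - s["gaze_cong"]) - (s["grating_incong"] - s["grating_cong"])
        for s in subjects
    ]

subjects = [
    {"gaze_cong": 758, "gaze_incong": 822, "grating_cong": 760, "grating_incong": 799},
    {"gaze_cong": 740, "gaze_incong": 810, "grating_cong": 750, "grating_incong": 780},
]
print(interaction_scores(subjects))  # prints [25, 40]
```

Positive scores indicate a larger Stroop effect under eye gaze than under the grating, which is the direction of the cost reported here.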

The effect of emotional gaze on the Stroop effect. To test whether emotion affects the cost of eye gaze in the Stroop effect, the RT data in the positive, negative, and neutral conditions were submitted to a 3 (Emotional Gaze Condition: Positive vs. Neutral vs. Negative) × 2 (Color–word Congruency: Congruent vs. Incongruent) repeated-measures ANOVA with both factors within-subject. The results showed a significant main effect of congruency, F(1,36)= 76.33, p< .001, η2 = .68, with longer RTs in incongruent trials (810 ms) than in congruent trials (745 ms). Additionally, the main effect of emotional gaze was significant, F(2,72)=7.08, p= .003, η2 = .29: RTs in the positive and negative gaze conditions (778 ms and 765 ms) were significantly shorter than in the neutral condition (790 ms). However, the interaction between Emotional Gaze and Color–word Congruency was not significant, F(2,72)=0.02, p= .98, η2 = .001. Further analysis showed a comparable classical Stroop effect in the positive gaze (67 ms; F(1,36)=10.56, p= .002, η2 = .53), negative gaze (64 ms; F(1,36)=9.19, p= .003, η2 = .54), and neutral gaze (64 ms; F(1,36)=7.97, p= .006, η2 = .48) conditions.

As expected, eye gaze elicited a larger Stroop effect than the vertical grating in Experiment 1, showing that eye gaze can cause cognitive interference and confirming the gaze cost in cognitive processing found in previous studies (e.g., Conty et al., 2010; Hazem et al., 2017, 2018). This may be because gaze automatically and unconsciously occupies cognitive resources (Rothkirch et al., 2015), thus interfering with the current task. Moreover, Experiment 1 showed a reliable and comparable gaze effect in the positive, negative, and neutral gaze conditions, suggesting that the gaze cost was not modulated by emotion. There are two possible explanations for this result. First, the emotion effect may have been too weak to detect in the Stroop task; that is, emotion may play a role but is not as salient as eye gaze when the emotional eye pictures lie outside the focus of attention. Second, a similar pattern of undifferentiated emotion effects was observed in other basic perceptual matching tasks, in which the amygdala was activated regardless of emotion type (Arce et al., 2008; Paulus et al., 2005). Together with other evidence, this lack of differential effects suggests that more complex cognitive processing may be required for emotional signals to exert an influence. Additionally, it remains unclear whether such a gaze cost extends to animal eyes or is specific to human eyes; Experiment 2 was conducted to answer this question.

Experiment 2

Methods

Participants, stimuli, and procedure. A new sample of 33 participants (mean age: 20.8±1.70 years; 11 males) was recruited from the same subject pool as in Experiment 1, using the same inclusion criteria. The procedure was the same as in Experiment 1 except that feline eyes were used. The formal experiment consisted of 2 blocks of 120 trials each.

Six different pairs of feline eyes with direct gaze were collected from Google and Baidu image libraries (see Figure 5). Eighteen college students rated the clarity of the feline eyes (1–7 scale) and identified their emotion categories (positive, negative, or neutral). The mean clarity rating was 5.48±1.16, with each feline eye pair rated higher than 5, and all feline eye gazes were categorized as neutral. The feline eye and vertical grating pictures were 260×63 pixels in size, subtending a horizontal visual angle of 6–7° and a vertical visual angle of 1.6°. In the experiment, the feline eye picture or vertical grating picture appeared randomly directly above the word.

Figure 5
Examples of feline eye and grating stimuli in Experiment 2


Results and Discussion

Incorrect trials or trials that had RTs > 1500 ms were excluded from the calculation of the correct mean RT scores (on average, 6.1% trials were excluded). The results of the accuracy, mean RT and Stroop effect are shown in Table 2, Figures 6 and 7. Due to near-ceiling effects on accuracy on the Stroop task (mean performance = 93.7%), all subsequent analyses were carried out using RTs as the dependent measure.

Table 2
Accuracy, correct mean RTs and Stroop effect for Stroop task in Experiment 2

Gaze type           Congruency     Accuracy (%)   RTs (ms)   Stroop effect (ms)
Feline eye gaze     Congruent      94.6 (1.1)     752 (22)   38 (12)
                    Incongruent    93.5 (1.2)     790 (24)
Vertical grating    Congruent      93.7 (1.4)     722 (20)   79 (11)
                    Incongruent    93.2 (1.5)     801 (23)

Note. (1) Values in parentheses are standard errors of the mean (SEM). (2) The Stroop effect was computed as the difference in mean response time between incongruent and congruent trials (i.e., RT in incongruent trials – RT in congruent trials).

The RT data were submitted to a 2 (Trial Type: Feline eye gaze vs. Vertical grating) × 2 (Color–word Congruency: Congruent vs. Incongruent) repeated-measures ANOVA. The results showed a significant main effect of congruency, F(1,32)=40.60, p< .001, η2 = .56, with longer RTs in the incongruent condition (796 ms) than in the congruent condition (737 ms). The main effect of trial type was marginally significant, F(1,32)= 4.06, p= .052, η2 = .11: RTs in feline-gaze trials (771 ms) were longer than in vertical-grating trials (762 ms).

More importantly, there was a significant interaction between Trial Type and Congruency, F(2,64)=8.38, p= .007, η2 = .207. Further analysis showed that RTs in incongruent trials were significantly longer than those in congruent trials in both the feline-gaze condition (790 ms vs. 752 ms; F(1,32)=10.28, p= .003, η2 = .24) and the vertical-grating condition (801 ms vs. 722 ms; F(1,32)=49.77, p< .001, η2 = .61). However, unlike in Experiment 1, the Stroop effect was greater in the vertical-grating condition (79 ms) than in the feline-gaze condition (38 ms).

Figure 6
Correct mean RTs for Stroop task in Experiment 2


Figure 7
Accuracy for Stroop task in Experiment 2


Experiment 2 showed that classical Stroop effects were evident in both the feline eye-gaze and vertical-grating conditions, with the feline eye gaze producing a smaller Stroop effect; thus, the eye-gaze cost in the Stroop effect was not present for feline eyes. Dogs and cats are often regarded as faithful friends and close companions of humans (Paul et al., 2010), and the bond between humans and cats can have significant benefits for emotional development and socialization. In other words, the presentation of feline eye gaze may be associated with positive and/or arousing emotional states such as joy and excitement, and such high arousal could recruit a higher level of cognitive control, making the Stroop effect smaller. These results thus suggest a potentially distinctive cost mechanism underlying human eye gaze; more discussion is given in the General Discussion section. In Experiment 3, the effect of the mouth on cognitive control was investigated to ascertain whether the eye-gaze cost in cognitive control is merely a body-related effect.

Experiment 3

Methods

Participants, stimuli, and procedure. A new sample of 36 participants (mean age: 21.2±1.77 years; 17 males) was recruited from the same subject pool as in Experiment 1, using the same inclusion criteria. The procedure was the same as in Experiment 1 except that pictures of human mouths were used. The formal experiment consisted of 4 blocks of 120 trials each.

Eighteen pictures of the mouth region of different people (half male, half female) were selected from the Chinese Affective Face Picture System (CAFPS; Bai et al., 2005), with 6 mouth pictures expressing positive emotion, 6 expressing neutral emotion, and 6 expressing negative emotion (see Figure 8). The mouth pictures were 260×63 pixels in size. As in Experiment 1, eighteen additional subjects rated all the pictures on two criteria: (a) emotional valence, from extremely negative to extremely positive (1–7 scale; 1 = very strong negative emotion, 7 = very strong positive emotion); and (b) clarity, from very unclear to very clear (1–7 scale).

The three types of pictures were balanced in subjective clarity ratings (means of 5.73, 5.22, and 5.18; p > .05). Additionally, positive mouth pictures had a positive valence rating (6.28), negative mouth pictures had a negative valence rating (2.14), and neutral mouth pictures had a medium valence rating (3.97). The mean emotional valence of positive mouth pictures was significantly higher than that of negative and neutral mouth pictures, and negative mouth pictures were rated lower than neutral ones (F(2,15)=526.276, p< .001).

Figure 8
Examples of mouth and grating stimuli in Experiment 3


Results and Discussion

On the Stroop task, incorrect trials or trials that had reaction times > 1500 ms were excluded from the calculation of the mean reaction time scores (on average, 4.5% trials were excluded). The results of the percent accuracy, the correct mean RT, and the Stroop effect are summarized in Table 3, Figures 9 and 10. Due to near-ceiling effects on accuracy on the Stroop task (mean performance = 95.4%), all subsequent analyses were carried out using RTs as the dependent measure.

Table 3
Accuracy, correct mean RTs and Stroop effect in Experiment 3

Image Type          Congruency     Accuracy (%)   RTs (ms)   Stroop effect (ms)
Positive            Congruent      95.1 (0.8)     746 (18)   73 (9)
                    Incongruent    95.1 (0.5)     819 (18)
Neutral             Congruent      96.3 (0.8)     751 (19)   58 (12)
                    Incongruent    94.5 (0.6)     809 (18)
Negative            Congruent      95.6 (0.7)     756 (18)   54 (9)
                    Incongruent    95.4 (0.6)     810 (18)
Vertical grating    Congruent      96.5 (0.6)     755 (19)   56 (9)
                    Incongruent    94.5 (0.6)     811 (18)

Note. (1) Values in parentheses are standard errors of the mean (SEM). (2) The Stroop effect was computed as the difference in mean response time between incongruent and congruent trials (i.e., RT in incongruent trials – RT in congruent trials).

Figure 9
Correct mean RTs for Stroop task in Experiment 3


Figure 10
Accuracy for Stroop task in Experiment 3


The role of the mouth in the Stroop effect. To test for a cost of the mouth in the Stroop effect, the RT data in the neutral-mouth and vertical-grating conditions were submitted to a 2 (Trial Type: Neutral mouth vs. Vertical grating) × 2 (Color–word Congruency: Congruent vs. Incongruent) repeated-measures ANOVA with both factors within-subject. The results showed a significant main effect of congruency, F(1,35)=50.95, p< .001, η2 = .59, with longer RTs in incongruent trials (810 ms) than in congruent trials (753 ms), revealing a classical Stroop effect. However, the interaction between Trial Type and Color–word Congruency was not significant, F(2,70)=0.02, p= .88, η2 = .001. Further analysis showed a significant Stroop effect in both the neutral-mouth (809 ms vs. 751 ms; F(1,35)=4.99, p= .029, η2 = .42) and vertical-grating (811 ms vs. 755 ms; F(1,35)=4.55, p= .036, η2 = .53) conditions. No other effect was observed (F < 1).

The role of the emotional mouth in the Stroop effect. To examine whether emotion affects the cost of the mouth in the Stroop effect, the RT data were submitted to a 3 (Emotional Condition: Positive vs. Neutral vs. Negative) × 2 (Color–word Congruency: Congruent vs. Incongruent) repeated-measures ANOVA with both factors within-subject. The results showed a significant main effect of congruency (F(1,35)=99.38, p< .001, η2 = .74), with longer RTs in incongruent trials (812 ms) than in congruent trials (751 ms), indicating a classical Stroop effect. However, the interaction between Emotional Condition and Color–word Congruency was not significant, F(2,70)=0.99, p= .38, η2 = .055. Further analysis showed a comparable classical Stroop effect in the positive-mouth (73 ms; F(1,35)=8.39, p= .005, η2 = .65), negative-mouth (54 ms; F(1,35)=4.50, p= .037, η2 = .51), and neutral-mouth (58 ms; F(1,35)=4.99, p= .029, η2 = .42) conditions. No other effect was observed (F<1). Therefore, this experiment found neither a cost related to the mouth nor a modulating effect of emotion, which is consistent with evidence that the eyes are distinctive and easily attract attention (Baron-Cohen, 1997; Roberson et al., 2012; Henderson et al., 2005). The mouth is regarded as an important body part because of its life-sustaining functions such as speech, eating, swallowing, and breathing, which explains why speech sounds are usually localized to a moving mouth (Callan et al., 2015). Thus, it is possible that the mouth is primarily associated with auditory attentional resources, which would have a weak or no effect on a visual task such as the Stroop task. Additionally, because recognizing a mouth may demand no more cognitive resources than the vertical gratings, no significant difference in the Stroop effect was found between the mouth and vertical-grating conditions.

General Discussion

This study investigated the cost of eye gaze in the Stroop effect and whether it is human-specific and body-related. We observed that the Stroop effect increased when human eyes with direct gaze were presented. However, this cost was present neither in the feline eye condition nor in the human mouth condition. These results demonstrate a robust human eye-gaze cost in cognitive control and indicate that the cost is human-specific rather than simply body-related. This study supports the notion that prioritization for attentional processing can depend on the presentation of social cues such as human eyes, and provides preliminary evidence for the unique importance of the human eye in cognitive processing.

There are a number of potential explanations for the importance of eye gaze. For example, the eyes are the focal point of the face. Because of their central position, expressivity, and color contrast, they easily and immediately attract our attention. Additionally, the white sclera and dark iris of human eyes are unique among primates, which makes it easy to see where people are looking and promotes social communication (Kano et al., 2022). Thus, it is not surprising that 60%–70% of our attention is focused on the eyes whether we are looking at a familiar or an unfamiliar face (e.g., Kingstone, 2009; Vernetti et al., 2018). Other experimental research also shows that direct gaze easily captures attention and receives priority in visual processing even in the earliest stages of life (Böckler et al., 2014; Conty et al., 2007; Senju & Hasegawa, 2005; Farroni et al., 2002). Such attentional capture by direct gaze appears to be functional from birth, suggesting that it is triggered by low-level visual features (Kobayashi & Kohshima, 2001; Langton et al., 2000).

Another explanation is that perceiving the eyes plays an important role in detecting the presence of other minds (Grossmann, 2017; Desideri et al., 2021), with brain regions such as the amygdala, the posterior temporal sulcus, and the medial prefrontal cortex engaged both when viewing eyes and when thinking about other people's mental states (Amodio & Frith, 2006; Pelphrey & Morris, 2006). This explains why interactions become more relevant when one notices other group members' gaze. An eye-gaze interaction might also increase the saliency of the interaction because 'eye contact' can create a sense of 'being watched' by an agent with intentions, thus creating 'mind contact' (Conty et al., 2010; Colombatto et al., 2019; Foulsham et al., 2010).

Meanwhile, humans are believed to constantly scan the environment for danger signals and to detect them automatically (e.g., Carretié, 2014; Conty et al., 2010; LoBue et al., 2010). This effect is thought to be mediated by an encapsulated fear module in the amygdala, which may act preattentively and hence independently of higher cognitive processing. Thus, our finding that task-irrelevant human eyes influence the Stroop effect agrees with previous work showing that eyes are unique visual stimuli essential to human survival and communication.

Moreover, the unique role of human eye gaze in cognitive control has also been confirmed by previous studies. Eye contact is a powerful signal with many different effects on both social cognition and autonomic responses. For example, people tend to remember faces with direct gaze more readily than faces with averted gaze (Mason et al., 2004; Myllyneva & Hietanen, 2016). Some studies have also demonstrated that direct gaze can facilitate face-related processes, social fluency, and memory for speech (e.g., Adams et al., 2010; Conty & Grèzes, 2012). A direct gaze also signals an upcoming interpersonal interaction, orienting the perceiver towards the interactant, their face, and sometimes also towards the self (Carr et al., 2021). However, other studies have shown that maintaining eye contact, or merely observing direct gaze, impedes performance in cognitive tasks (e.g., Markson & Paterson, 2009; Riby et al., 2012). Conty et al. (2010) found that direct eye gaze was associated with a stronger Stroop effect than closed eyes. Some researchers have also found that being watched by others interferes with the executive control required for performance on difficult tasks in humans and nonhuman primates (e.g., Belletier & Camos, 2018; Belletier et al., 2015; Huguet et al., 2014).

There are multiple explanations for such a negative effect of eye gaze. First, direct gaze holds attention more strongly, which in turn impairs performance in concurrent cognitive tasks (Senju & Hasegawa, 2005; Conty et al., 2010). On the one hand, as direct and averted gaze (or closed eyes) influence attention differently (e.g., George & Conty, 2008; Myllyneva & Hietanen, 2015; Nummenmaa & Calder, 2009), direct gaze (eye contact) may act as a strong distractor. Under Resource Capacity Theory (RCT), eye-gaze stimuli draw on the limited pool of attentional resources, thereby reducing the resources available for processing a concurrent Stroop task (Bower, 1992). On the other hand, social cues of being observed can affect participants' decision-making criteria in nonsocial situations (e.g., the Stroop task: Conty et al., 2010; food intake: Herman et al., 2003). Following the belief–desire–intention (BDI) model, a widely used model of agent decision making for constructing reasoning systems for complex tasks in dynamic environments (Boss et al., 2010), it is necessary to incorporate communication among agents into human decision-making processes (Liang et al., 2016). Eye gaze not only shifts the observer's attention but also shapes their affective evaluations of the gazed-at objects, which explains why it plays an important role in communication with others and in decision making.

Second, as measured by skin conductance, seeing a face with direct rather than averted gaze elicits stronger autonomic responses (Helminen et al., 2011) and heart rate deceleration (Akechi et al., 2013). Thus, it has been argued that eye contact impedes cognitive performance because it may increase cognitive load (e.g., Phelps et al., 2021). Third, direct eye gaze can increase bodily self-awareness and self-focus (Conty et al., 2016), for example by activating mind-reading abilities (Senju & Johnson, 2009). From birth, humans are biologically hard-wired to orient toward faces (Johnson et al., 1991), especially the eye region (Farroni et al., 2002). Thus, eyes play an enormous role in shaping our social behavior by providing us with social cues. There is converging evidence clearly demonstrating that prosocial behavior, decision making, and task performance can be mediated by increased self-awareness, a state that can be easily evoked by implicit observability cues (Wong & Stephen, 2019). In our setting, under direct eye gaze, participants were likely in a state of public rather than private self-focus, so that attention to the social cue (human eye gaze) was particularly heightened while their capacity to perform the concurrent Stroop task decreased, likely through increased cognitive load or reduced cognitive functions such as cognitive control.

Notably, the feline eyes and human mouths used in Experiments 2 and 3 failed to produce a stronger Stroop effect relative to vertical gratings, suggesting that the cost of eye gaze is human-specific rather than simply body-related. Human beings are constantly in contact with other people around them and are particularly alert to the gaze of strangers; being observed by strangers can cause anxiety, tension, and fear. Animals, by contrast, especially cats and dogs, frequently play a visible role in human society, both as companions and as working animals, so a feline gaze may even be an indicator of positive emotion. Therefore, human and animal eyes with direct gaze would influence attention differently. Although cat eyes are similar in morphology, physiology, and function to human eyes, there are still various differences between them (Huang et al., 2015). When cat eyes are presented together with part of a cat face, humans readily recognize them as cat eyes rather than human eyes. Thus, directly gazing feline eyes failed to capture or hold more attention than vertical gratings, to recruit more cognitive processing resources, or to engage the neural systems and functions that Senju and Johnson (2009) described for the eye-contact effect. Additionally, as animal pictures are regarded as nonsocial scenes (Flechsenhar et al., 2018), our results agree with the finding that background posters depicting nonsocial scenes attract less attention than those depicting social scenes (De Lillo et al., 2021).

However, there is evidence, obtained using rectangles overlaid with human eyes, that eye contact can affect object-based attentional allocation (Colombatto et al., 2020). That is, even when face perception was weakened or absent, the human eye-contact effect could still generalize to objects. Other studies also report that inverted faces with direct gaze can affect individuals' cognition and attention (e.g., Stein et al., 2011). These findings suggest that, regardless of the background, once a human eye is identified, direct-gaze eyes alone can automatically capture cognitive resources (e.g., Conty et al., 2010). However, neither feline eyes nor human mouths meet this prerequisite (i.e., being recognized as a human eye), which explains the absence of a cost in these two conditions. Additionally, in contrast to the significant eye-gaze cost in inverted faces found in previous studies (i.e., Chen & Yeh, 2012), Böckler et al. (2015) showed a breakdown of direct-gaze prioritization for inverted faces. This result suggests that even for human eyes the cost of eye gaze can be weakened, with the degree of weakening depending on the type of inverted faces used.

Although the present study represents the first direct investigation of the unique cost of human eye gaze in cognitive control, it has several limitations. First, the study used only feline eyes; testing more animal eyes with direct gaze would help generalize the current results. Second, our data came from a sample of Chinese college students whose experience with pets was not controlled (Martens et al., 2019). Research shows that people with pets are generally healthier than those without, as pets provide companionship and expanded social networks (Okin, 2017). Thus, our findings in Experiment 2 may not generalize to populations with pet-owning experience, and further study with a more diverse sample, separating participants with and without pet experience, is warranted. Finally, previous studies showed that different neural mechanisms are engaged when the tasks (target detection, localization, or discrimination) and timings (target SOA, target duration) vary (Fichtenholtz et al., 2009). This should be examined in future studies by presenting the word and the eye-gaze picture overlapping each other.

Conclusion

In conclusion, this study confirms the cost of human eye gaze in cognitive control. Our results extend prior work by demonstrating for the first time that this cost is unique to human eye gaze and cannot be attributed merely to being body-related. They provide evidence for the priority of eye-gaze processing and shed light on the mechanism by which eye gaze affects cognitive control.

Acknowledgements

This work was supported by the joint project of “14th Five-Year Plan” for Guangzhou Philosophy and Social Sciences in 2022 (No. 2022GZGJ133), and the 13th Five-Year Research Plan Project of Guangdong Province in 2020 (Moral Education Special Sciences of Guangdong Province, No. 2020JKDY018).

References

Adams Jr, R. B., Pauker, K., & Weisbuch, M. (2010). Looking the other way: The role of gaze direction in the cross-race memory effect. Journal of Experimental Social Psychology, 46(2), 478–481. https://doi.org/10.1016%2Fj.jesp.2009.12.016

Adolphs, R. (2008). The social brain: neural basis of social knowledge. Annual Review of Psychology, 60(1), 693–716. https://doi.org/10.1146%2Fannurev.psych.60.110707.163514

Akechi, H., Senju, A., Uibo, H., Kikuchi, Y., Hasegawa, T., & Hietanen, J. K. (2013). Attention to eye contact in the West and East: Autonomic responses and evaluative ratings. PloS one, 8(3), Article e59312. https://doi.org/10.1371/journal.pone.0059312

Amodio, D. M., & Frith, C. D. (2006). Meeting of minds: the medial frontal cortex and social cognition. Discovering the social mind, 7, 183–207.

Amting, J. M., Greening, S. G., & Mitchell, D. G. (2010). Multiple mechanisms of consciousness: the neural correlates of emotional awareness. Journal of Neuroscience, 30(30), 10039–10047. https://doi.org/10.1523%2FJNEUROSCI.6434-09.2010

Arce, E., Simmons, A. N., Lovero, K. L., Stein, M. B., & Paulus, M. P. (2008). Escitalopram effects on insula and amygdala BOLD activation during emotional processing. Psychopharmacology, 196(4), 661–672. https://doi.org/10.1007/s00213-007-1004-8

Bai, L., Ma, H., Huang, Y. X., & Luo, Y. J. (2005). The development of native Chinese affective picture system-a pretest in 46 college students. Chinese Mental Health Journal, 19(11), 719–722.

Baron-Cohen, S. (1997). Mindblindness: An essay on autism and theory of mind. MIT press.

Belletier, C., & Camos, V. (2018). Does the experimenter presence affect working memory? Annals of the New York Academy of Sciences, 1424(1), 212–220. https://doi.org/10.1111/nyas.13627

Belletier, C., Davranche, K., Tellier, I. S., Dumas, F., Vidal, F., Hasbroucq, T., & Huguet, P. (2015). Choking under monitoring pressure: being watched by the experimenter reduces executive attention. Psychonomic bulletin & review, 22(5), 1410–1416. https://doi.org/10.3758/s13423-015-0804-9

Böckler, A., van der Wel, R. P., & Welsh, T. N. (2014). Catching eyes: Effects of social and nonsocial cues on attention capture. Psychological Science, 25(3), 720–727. https://doi.org/10.1177/0956797613516147

Böckler, A., van der Wel, R. P., & Welsh, T. N. (2015). Eyes only? Perceiving eye contact is neither sufficient nor necessary for attentional capture by face direction. Acta Psychologica, 160, 134–140. https://doi.org/10.1016/j.actpsy.2015.07.009

Bouw, N., Swaab, H., Tartaglia, N., Jansen, A. C., & Van Rijn, S. (2022). Early impact of X- and Y-chromosome variations (XXX, XXY, XYY) on social communication and social emotional development in 1-2-year-old children. American journal of medical genetics. Part A, 188(7), 1943–1953. https://doi.org/10.1002/ajmg.a.62720

Bower, G. H. (1992). How might emotions affect learning? In S.-Å. Christianson (Ed.), The handbook of emotion and memory: Research and theory (pp. 3–32). Erlbaum.

Boss, N. S., Jensen, A. S., & Villadsen, J. (2010). Building multi-agent systems using Jason. Annals of Mathematics and Artificial Intelligence, 59(3), 373–388. https://doi.org/10.1007/s10472-010-9181-2

Bradbury, J. W., & Vehrencamp, S. L. (2012). Principles of animal communication. Animal Behaviour, 83(3), 865–866.

Brown, G. E., & Chivers, D. P. (2005). Learning as an adaptive response to predation. In P. Barbosa & I. Castellanos (Eds.), Ecology of predator–prey interactions (pp. 34–54). Oxford University Press.

Buchanan, H., Markson, L., Bertrand, E., Greaves, S., Parmar, R., & Paterson, K. B. (2014). Effects of social gaze on visual-spatial imagination. Frontiers in psychology, 5, 1–7. http://dx.doi.org/10.3389/fpsyg.2014.00671

Burra, N., & Kerzel, D. (2021). Meeting another’s gaze shortens subjective time by capturing attention. Cognition, 212, 1–11. https://doi.org/10.1016/j.cognition.2021.104734

Callan, A., Callan, D., & Ando, H. (2015). An fMRI Study of the Ventriloquism Effect. Cerebral Cortex, 25(11), 4248–4258. https://doi.org/10.1093/cercor/bhu306

Carr, E. W., Bird, G., Catmur, C., & Winkielman, P. (2021). Dissociable effects of averted “gaze” on the priming of bodily representations and motor actions. Acta Psychologica, 212, Article 103225. https://psycnet.apa.org/doi/10.1016/j.actpsy.2020.103225

Carretié, L. (2014). Exogenous (automatic) attention to emotional stimuli: a review. Cognitive, Affective, & Behavioral Neuroscience, 14(4), 1228–1258. https://doi.org/10.3758%2Fs13415-014-0270-2

Chen, Y. C., & Yeh, S. L. (2012). Look into my eyes and I will see you: Unconscious processing of human gaze. Consciousness and cognition, 21(4), 1703–1710. https://doi.org/10.1016/j.concog.2012.10.001

Colombatto, C., Buren, B. V., & Scholl, B. J. (2019). Intentionally distracting: Working memory is disrupted by the perception of other agents attending to you—even without eye-gaze cues. Psychonomic bulletin & review, 26(3), 951–957. https://doi.org/10.3758/s13423-018-1530-x

Colombatto, C., Buren, B. V., & Scholl, B. J. (2020). Gazing Without Eyes: A “Stare-in-the-Crowd” Effect Induced by Simple Geometric Shapes. Perception, 49(7), 782–792. https://doi.org/10.1177/0301006620934320

Conty, L., George, N., & Hietanen, J. K. (2016). Watching Eyes effects: When others meet the self. Consciousness and cognition, 45, 184–197. https://doi.org/10.1016/j.concog.2016.08.016

Conty, L., Gimmig, D., Belletier, C., George, N., & Huguet, P. (2010). The cost of being watched: Stroop interference increases under concomitant eye contact. Cognition, 115(1), 133–139. https://doi.org/10.1016/j.cognition.2009.12.005

Conty, L., & Grèzes, J. (2012). Look at me, I’ll remember you: the perception of self‐relevant social cues enhances memory and right hippocampal activity. Human Brain Mapping, 33(10), 2428–2440. https://doi.org/10.1002/hbm.21366

Conty, L., N’Diaye, K., Tijus, C., & George, N. (2007). When eye creates the contact! ERP evidence for early dissociation between direct and averted gaze motion processing. Neuropsychologia, 45(13), 3024–3037. https://doi.org/10.1016/j.neuropsychologia.2007.05.017

De Lillo, M., Foley, R., Fysh, M. C., Stimson, A., Bradford, E. E., Woodrow-Hill, C., & Ferguson, H. J. (2021). Tracking developmental differences in real-world social attention across adolescence, young adulthood and older adulthood. Nature human behaviour, 5(10), 1381–1390. https://doi.org/10.1038/s41562-021-01113-9

Desideri, L., Bonifacci, P., Croati, G., Dalena, A., Gesualdo, M., Molinario, G., ... Ottaviani, C. (2021). The Mind in the Machine: Mind Perception Modulates Gaze Aversion During Child–Robot Interaction. International Journal of Social Robotics, 13(4), 599–614. https://psycnet.apa.org/doi/10.1007/s12369-020-00656-7

Doi, H., & Shinohara, K. (2013). Task-irrelevant direct gaze facilitates visual search for deviant facial expression. Visual Cognition, 21(1), 72–98. https://psycnet.apa.org/doi/10.1080/13506285.2013.779350

Eastwood, J. D., Smilek, D., & Merikle, P. M. (2001). Differential attentional guidance by unattended faces expressing positive and negative emotion. Perception & psychophysics, 63(6), 1004–1013. https://doi.org/10.3758/bf03194519

Ekman, P., Friesen, W. V., & O’sullivan, M. (1988). Smiles when lying. Journal of personality and social psychology, 54(3), 414–420. https://psycnet.apa.org/doi/10.1037/0022-3514.54.3.414

Engelmann, J. M., Herrmann, E., & Tomasello, M. (2012). Five-year olds, but not chimpanzees, attempt to manage their reputations. PLoS One, 7(10), Article e48433. https://doi.org/10.1371/journal.pone.0048433

Farroni, T., Csibra, G., Simion, F., & Johnson, M. H. (2002). Eye contact detection in humans from birth. Proceedings of the National Academy of Sciences of the United States of America, 99(14), 9602–9605. https://doi.org/10.1073/pnas.152159999

Fichtenholtz, H. M., Hopfinger, J. B., Graham, R., Detwiler, J. M., & LaBar, K. S. (2009). Event-related potentials reveal temporal staging of dynamic facial expression and gaze shift effects on attentional orienting. Social Neuroscience, 4(4), 317–331. https://doi.org/10.1080%2F17470910902809487

Flechsenhar, A., Rösler, L., & Gamer, M. (2018). Attentional Selection of Social Features Persists Despite Restricted Bottom-Up Information and Affects Temporal Viewing Dynamics. Scientific reports, 8(1), Article 12555. https://doi.org/10.1038/s41598-018-30736-8

Foulsham, T., Cheng, J. T., Tracy, J. L., Henrich, J., & Kingstone, A. (2010). Gaze allocation in a dynamic situation: Effects of social status and speaking. Cognition, 117(3), 319–331. https://doi.org/10.1016/j.cognition.2010.09.003

George, N., & Conty, L. (2008). Facing the gaze of others. Neurophysiologie Clinique/Clinical Neurophysiology, 38(3), 197–207. https://doi.org/10.1016/j.neucli.2008.03.001

Grossmann, T. (2017). The eyes as windows into other minds: An integrative perspective. Perspectives on Psychological Science, 12(1), 107–121. https://doi.org/10.1177/1745691616654457

Hasegawa, K., & Takahashi, S. Y. (2014). The role of visual awareness for conflict adaptation in the masked priming task: comparing block-wise adaptation with trial-by-trial adaptation. Frontiers in psychology, 5, Article 1347. http://dx.doi.org/10.3389/fpsyg.2014.01347

Hazem, N., Beaurenaut, M., George, N., & Conty, L. (2018). Social contact enhances bodily self-awareness. Scientific reports, 8(1), 1–10. https://doi.org/10.1038/s41598-018-22497-1

Hazem, N., George, N., Baltazar, M., & Conty, L. (2017). I know you can see me: Social attention influences bodily self-awareness. Biological Psychology, 124, 21–29. https://psycnet.apa.org/doi/10.1016/j.biopsycho.2017.01.007

Helminen, T. M., Kaasinen, S. M., & Hietanen, J. K. (2011). Eye contact and arousal: the effects of stimulus duration. Biological Psychology, 88(1), 124–130. https://psycnet.apa.org/doi/10.1016/j.biopsycho.2011.07.002

Henderson, J. M., Williams, C. C., & Falk, R. J. (2005). Eye movements are functional during face learning. Memory & Cognition, 33(1), 98–106. https://doi.org/10.3758/BF03195300

Herman, C. P., Roth, D. A., & Polivy, J. (2003). Effects of the presence of others on food intake: a normative interpretation. Psychological bulletin, 129(6), 873–886. https://doi.org/10.1037/0033-2909.129.6.873

Hu, F., Qian, W., Lian, X., & Ge, L. (2012). Multiple conflict-driven cognitive control mechanisms of the flanker, stroop and simon conflict. Journal of Psychological Science, 35(2), 276–281.

Huang, J. F., Zhao, H. P., Yang, Y. F., Huang, H. M., Yao, Y., & Wang, Z. J. (2015). Protective effect of high concentration of BN52021 on retinal contusion in cat eyes. BMC ophthalmology, 15, 1–8. https://doi.org/10.1186/s12886-015-0030-2

Huguet, P., Barbet, I., Belletier, C., Monteil, J. M., & Fagot, J. (2014). Cognitive control under social influence in baboons. Journal of Experimental Psychology: General, 143(6), 2067–2073. https://psycnet.apa.org/doi/10.1037/xge0000026

Kampe, K. K., et al. (2003). Hey John: signals conveying communicative intention toward the self activate brain regions associated with “mentalizing” regardless of modality. Journal of Neuroscience, 23, 5258–5263.

Kano, F., Kawaguchi, Y., & Hanling, Y. (2022). Experimental evidence that uniformly white sclera enhances the visibility of eye-gaze direction in humans and chimpanzees. Elife, 11, Article e74086. https://doi.org/10.7554/eLife.74086

Kesner, L., Grygarová, D., Fajnerová, I., Lukavský, J., Nekovářová, T., Tintěra, J., Zaytseva, Y., & Horáček, J. (2018). Perception of direct vs. averted gaze in portrait paintings: An fMRI and eye-tracking study. Brain and cognition, 125, 88–99. https://doi.org/10.1016/j.bandc.2018.06.004

Kheirkhah, M., Brodoehl, S., Leistritz, L., Götz, T., Baumbach, P., Huonker, R., Witte, O. W., Volk, G. F., Guntinas-Lichius, O., & Klingner, C. M. (2020). Abnormal Emotional Processing and Emotional Experience in Patients with Peripheral Facial Nerve Paralysis: An MEG Study. Brain sciences, 10(3), 1–14. https://doi.org/10.3390%2Fbrainsci10030147

Kingstone, A. (2009). Taking a real look at social attention. Current opinion in neurobiology, 19(1), 52–56. https://psycnet.apa.org/doi/10.1016/j.conb.2009.05.004

Kobayashi, H., & Hashiya, K. (2011). The gaze that grooms: contribution of social factors to the evolution of primate eye morphology. Evolution and Human Behavior, 32(3), 157–165. https://psycnet.apa.org/doi/10.1016/j.evolhumbehav.2010.08.003

Langton, S. R., Watt, R. J., & Bruce, V. (2000). Do the eyes have it? Cues to the direction of social attention. Trends in cognitive sciences, 4(2), 50–59. https://doi.org/10.1016/s1364-6613(99)01436-9

Laurenson, C., Gorwood, P., Orsat, M., Lhuillier, J. P., Le Gall, D., & Richard-Devantoy, S. (2015). Cognitive control and schizophrenia: The greatest reliability of the Stroop task. Psychiatry Research, 227(1), 10–16. https://doi.org/10.1016/j.psychres.2015.03.004

Liang, X., Chen, H., Wang, Y., & Song, S. (2016). Design and application of a CA-BDI model to determine farmers’ land-use behavior. SpringerPlus, 5(1), 1–18. https://doi.org/10.1186/s40064-016-3245-7

LoBue, V., Rakison, D. H., & DeLoache, J. S. (2010). Threat perception across the life span: Evidence for multiple converging pathways. Current directions in psychological science, 19(6), 375–379. https://psycnet.apa.org/doi/10.1177/0963721410388801

Markson, L., & Paterson, K. B. (2009). Effects of gaze‐aversion on visual‐spatial imagination. British Journal of Psychology, 100(3), 553–563. https://psycnet.apa.org/doi/10.1348/000712608X371762

Martens, P., Su, B., & Deblomme, S. (2019). The Ecological Paw Print of Companion Dogs and Cats. Bioscience, 69(6), 467–474. https://doi.org/10.1093/biosci/biz044

Mason, M. F., Hood, B. M., & Macrae, C. N. (2004). Look into my eyes: gaze direction and person memory. Memory, 12(5), 637–643. https://psycnet.apa.org/doi/10.1080/09658210344000152

Maurer, D., Le Grand, R., & Mondloch, C. J. (2002). The many faces of configural processing. Trends in Cognitive Sciences, 6(6), 255–260. https://psycnet.apa.org/doi/10.1016/S1364-6613(02)01903-4

Mendez-Bertolo, C., Moratti, S., Toledano, R., Lopez-Sosa, F., Martinez-Alvarez, R., Mah, Y. H., ... Strange, B. A. (2016). A fast pathway for fear in human amygdala. Nature neuroscience, 19(8), 1041–1049. https://doi.org/10.1038/nn.4324

Morris, J. S., Öhman, A., & Dolan, R. J. (1999). A subcortical pathway to the right amygdala mediating “unseen” fear. Proceedings of the National Academy of Sciences, 96(4), 1680–1685. https://doi.org/10.1073%2Fpnas.96.4.1680

Murphy, W. J., Sun, S., Chen, Z. Q., Pecon-Slattery, J., & O’Brien, S. J. (1999). Extensive conservation of sex chromosome organization between cat and human revealed by parallel radiation hybrid mapping. Genome research, 9(12), 1223–1230. https://doi.org/10.1101/gr.9.12.1223

Myllyneva, A., & Hietanen, J. K. (2015). The dual nature of eye contact: to see and to be seen. Social Cognitive & Affective Neuroscience, 11(7), 1089–1095. https://doi.org/10.1093%2Fscan%2Fnsv075

Niedźwiecka, A. (2020). Look Me in the Eyes: Mechanisms Underlying the Eye Contact Effect. Child Development Perspectives, 14(2), 78–82. https://doi.org/10.1111/cdep.12361

Niedźwiecka, A. (2021). Eye contact effect: The role of vagal regulation and reactivity, and self-regulation of attention. Current Psychology, 04, 1–7. https://doi.org/10.1007/s12144-021-01682-y

Nummenmaa, L., & Calder, A. J. (2009). Neural mechanisms of social attention. Trends in cognitive sciences, 13(3), 135–143. https://doi.org/10.1016/j.tics.2008.12.006

Öhman, A., & Mineka, S. (2001). Fears, phobias, and preparedness: toward an evolved module of fear and fear learning. Psychological review, 108(3), 483–522. https://psycnet.apa.org/doi/10.1037/0033-295X.108.3.483

Okin, G. S. (2017). Environmental impacts of food consumption by dogs and cats. PLoS One, 12(8), Article e0181301. https://doi.org/10.1371/journal.pone.0181301

Paul, M., King, L., & Carlin, E. P. (2010). Zoonoses of people and their pets: a US perspective on significant pet-associated parasitic diseases. Trends in Parasitology, 26(4), 153–154. https://doi.org/10.1016/j.pt.2010.01.008

Paulus, M. P., Feinstein, J. S., Castillo, G., Simmons, A. N., & Stein, M. B. (2005). Dose-dependent decrease of activation in bilateral amygdala and insula by lorazepam during emotion processing. Archives of General Psychiatry, 62(3), 282–288. https://doi.org/10.1001/archpsyc.62.3.282

Pelphrey, K. A., & Morris, J. P. (2006). Brain mechanisms for interpreting the actions of others from biological-motion cues. Current Directions in Psychological Science, 15(3), 136–140. https://doi.org/10.1111%2Fj.0963-7214.2006.00423.x

Perrett, D. I., & Emery, N. J. (1994). Understanding the intentions of others from visual signals: neurophysiological evidence. Current Psychology of Cognition, 13, 683–694.

Perrett, D. I., Smith, P. A. J., Potter, D. D., Mistlin, A. J., Head, A. S., Milner, A. D., & Jeeves, M. A. (1985). Visual cells in the temporal cortex sensitive to face view and gaze direction. Proceedings of the Royal society of London. Series B. Biological sciences, 223(1232), 293–317. https://doi.org/10.1098/rspb.1985.0003

Pesciarelli, F., Leo, I., & Sarlo, M. (2016). Implicit processing of the eyes and mouth: Evidence from human electrophysiology. PloS one, 11(1), Article e0147415. https://doi.org/10.1371/journal.pone.0147415

Phelps, F. G., Doherty-Sneddon, G., & Warnock, H. (2006). Helping children think: Gaze aversion and teaching. British journal of developmental psychology, 24(3), 577–588. https://psycnet.apa.org/doi/10.1348/026151005X49872

Riby, D. M., Doherty-Sneddon, G., & Whittle, L. (2012). Face-to-face interference in typical and atypical development. Developmental science, 15(2), 281–291. https://doi.org/10.1111%2Fj.1467-7687.2011.01125.x

Roberson, D., Kikutani, M., Döge, P., Whitaker, L., & Majid, A. (2012). Shades of emotion: what the addition of sunglasses or masks to faces reveals about the development of facial expression processing. Cognition, 125(2), 195–206. https://doi.org/10.1016/j.cognition.2012.06.018

Rothkirch, M., Madipakkam, A. R., Rehn, E., & Sterzer, P. (2015). Making eye contact without awareness. Cognition, 143(10), 108–114. https://doi.org/10.1016/j.cognition.2015.06.012

Senju, A., & Hasegawa, T. (2005). Direct gaze captures visuospatial attention. Visual cognition, 12(1), 127–144. https://psycnet.apa.org/doi/10.1080/13506280444000157

Senju, A., & Johnson, M. H. (2009). Atypical eye contact in autism: models, mechanisms and development. Neuroscience & Biobehavioral Reviews, 33(8), 1204–1214. https://doi.org/10.1016/j.neubiorev.2009.06.001

Vernetti, A., Senju, A., Charman, T., Johnson, M. H., Gliga, T., & BASIS team. (2018). Simulating interaction: Using gaze-contingent eye-tracking to measure the reward value of social signals in toddlers with and without autism. Developmental cognitive neuroscience, 29, 21–29. https://doi.org/10.1016/j.dcn.2017.08.004

Vuilleumier, P., George, N., Lister, V., Armony, J., & Driver, J. (2005). Effects of perceived mutual gaze and gender on face processing and recognition memory. Visual Cognition, 12(1), 85–101. https://psycnet.apa.org/doi/10.1080/13506280444000120

Wang, Y., Peng, S., Shao, Z., & Feng, T. (2022). Active Viewing Facilitates Gaze to the Eye Region in Young Children with Autism Spectrum Disorder. Journal of Autism and Developmental Disorders, 1–9. https://doi.org/10.1007/s10803-022-05462-w

Wegrzyn, M., Vogt, M., Kireclioglu, B., Schneider, J., & Kissler, J. (2017). Mapping the emotional face. How individual face parts contribute to successful emotion recognition. PloS one, 12(5), Article e0177239. https://doi.org/10.1371/journal.pone.0177239

Whalen, P. J., Kagan, J., Cook, R. G., Davis, F. C., Kim, H., Polis, S., ... Johnstone, T. (2004). Human amygdala responsivity to masked fearful eye whites. Science, 306(5704), 2061–2065. https://doi.org/10.1126/science.1103617

Wong, H. K., & Stephen, I. D. (2019). Eye tracker as an implied social presence: awareness of being eye-tracked induces social-norm-based looking behaviour. Journal of eye movement research, 12(2), 1–17. https://doi.org/10.16910/jemr.12.2.5

Yamaguchi, S., & Onoda, K. (2012). Interaction between Emotion and Attention Systems. Frontiers in Neuroscience, 6, Article 139. https://doi.org/10.3389/fnins.2012.00139