English colour terms carry gender and valence biases: A corpus study using word embeddings
Details
Serval ID
serval:BIB_5BF9A24CC7B9
Type
Article: article from journal or magazine.
Collection
Publications
Title
English colour terms carry gender and valence biases: A corpus study using word embeddings
Journal
PLOS ONE
ISSN
1932-6203
Publication state
Published
Issued date
01/06/2021
Peer-reviewed
Yes
Editor
Wichmann Søren
Volume
16
Number
6
Pages
e0251559
Language
English
Abstract
In Western societies, the stereotype prevails that pink is for girls and blue is for boys. A third possible gendered colour is red. While liked by women, it represents power, stereotypically a masculine characteristic. Empirical studies confirmed such gendered connotations when testing colour-emotion associations or colour preferences in males and females. Furthermore, empirical studies demonstrated that pink is a positive colour, blue is a mainly positive colour, and red is both a positive and a negative colour. Here, we assessed whether the same valence and gender connotations appear in widely available written texts (Wikipedia and newswire articles). Using a word embedding method (GloVe), we extracted gender and valence biases for blue, pink, and red, as well as for the remaining basic colour terms, from a large English-language corpus containing six billion words. We found and confirmed that pink was biased towards femininity and positivity, and that blue was biased towards positivity. We found no strong gender bias for blue, and no strong gender or valence biases for red. For the remaining colour terms, we only found that green, white, and brown were positively biased. Our finding on pink shows that writers of widely available English texts use this colour term to convey femininity. This gendered communication reinforces the notion that results from research studies find their analogue in real-world phenomena. Other findings were either consistent or inconsistent with results from research studies. We argue that widely available written texts carry biases of their own, because their content has been filtered according to context, time, and what is appropriate to be reported.
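The bias-extraction idea described in the abstract can be sketched as follows: a colour term's gender (or valence) bias is the difference between its cosine similarity to one pole of an attribute (e.g. feminine words) and its similarity to the opposite pole (e.g. masculine words). The minimal sketch below uses made-up toy vectors in place of real GloVe embeddings, and the particular pole words (`she`/`he`, `good`/`bad`) are illustrative assumptions, not the exact word lists used in the study.

```python
import numpy as np

# Toy 4-dimensional "embeddings" standing in for pretrained GloVe vectors.
# All numbers are invented for illustration only.
vectors = {
    "pink": np.array([0.9, 0.1, 0.8, 0.0]),
    "blue": np.array([0.1, 0.9, 0.7, 0.1]),
    "she":  np.array([1.0, 0.0, 0.2, 0.1]),
    "he":   np.array([0.0, 1.0, 0.2, 0.1]),
    "good": np.array([0.2, 0.2, 1.0, 0.0]),
    "bad":  np.array([0.1, 0.1, 0.0, 1.0]),
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def bias(word, pole_a, pole_b):
    """Positive score: `word` is embedded closer to `pole_a` than to `pole_b`."""
    v = vectors[word]
    return cosine(v, vectors[pole_a]) - cosine(v, vectors[pole_b])

gender_bias_pink = bias("pink", "she", "he")    # > 0 => feminine-leaning
valence_bias_blue = bias("blue", "good", "bad")  # > 0 => positive-leaning
```

With real GloVe vectors, each pole would typically be the average over a list of attribute words rather than a single word, but the similarity-difference logic stays the same.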
Keywords
GloVe word embeddings, natural language processing (NLP), artificial intelligence (AI), LIWC, colour psychology, emotion
Open Access
Yes
APC
1695 USD
Funding(s)
Swiss National Science Foundation / Projects / 100014_182138
Swiss National Science Foundation / Careers / P0LAP1_175055
European Research Council (ERC) / ThinkBIG
Create date
02/06/2021 9:08
Last modification date
03/06/2021 6:09