Inducing Novel Sound–Taste Correspondences via an Associative Learning Task

Cognitive Science 48 (3):e13421 (2024)

Abstract

Interest in crossmodal correspondences, including those between sounds and tastes, has grown rapidly in recent years. However, the mechanisms underlying these correspondences are not well understood. In the present study (N = 302), we used an associative learning paradigm, building on previous literature, that paired simple sounds with no consensual taste associations (i.e., square and triangle wave tones at 200 Hz) with taste words (i.e., sweet and bitter) to test the influence of two potential mechanisms in establishing sound–taste correspondences and to investigate whether either learning mechanism could give rise to new and long‐lasting associations. Specifically, we examined an emotional mediation account (i.e., using sad and happy emoji facial expressions) and a transitive path (i.e., sound–taste correspondence mediated by color, using red and black colored squares). The results revealed that the associative learning paradigm mapping the triangle wave tone to a happy emoji facial expression induced a novel crossmodal correspondence between this sound and the word sweet. Importantly, this novel association was still present two months after the experimental learning paradigm. None of the other mappings, emotional or transitive, gave rise to any significant associations between sound and taste. These findings provide evidence that new crossmodal correspondences between sounds and tastes can be created by leveraging the affective connection between the two dimensions, helping to elucidate the mechanisms underlying these associations. Moreover, they show that such associations can persist for several weeks after the experimental session through which they were induced.


Similar books and articles

Perceptual Similarity: Insights From Crossmodal Correspondences. Nicola Di Stefano & Charles Spence - 2024 - Review of Philosophy and Psychology 15 (3):997-1026.
Audiovisual Cross-Modal Correspondences in the General Population. Cesare Parise & Charles Spence - 2013 - In Julia Simner & Edward M. Hubbard (eds.), Oxford Handbook of Synesthesia. Oxford University Press.
How Automatic Are Crossmodal Correspondences? Charles Spence & Ophelia Deroy - 2013 - Consciousness and Cognition 22 (1):245-260.

Analytics

Added to PP
2024-03-19
