Abstract
The experience of embodiment may be studied using the rubber hand illusion. Little is
known about the cognitive mechanism that elicits the feeling of embodiment. In
previous models of the rubber hand illusion, bodily signals are processed sequentially.
Such models cannot explain several more recent findings. Carruthers (2013) proposed a
multidimensional model of embodiment, in which the processing of embodiment is
understood in terms of a conceptual hand space. Visual features of hands are represented
along several dimensions. The rubber hand illusion is then explained as the erroneous
matching of the online representation of the artificial hand to the stored prototype in a
space defined by those dimensions. We conducted the first experimental tests to
investigate the multidimensional conceptual space account. First, participants performed
a series of odd-one-out judgments on triads of hand images (including an image of their
own hand), and we then applied multidimensional scaling analyses to these judgments. We found that a
multidimensional model of perceived hand similarity could be fitted to our data.
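As a concrete illustration of this kind of analysis pipeline (a minimal sketch, not the authors' analysis code), the Python example below shows one way odd-one-out triad judgments could be converted into pairwise dissimilarities and submitted to multidimensional scaling; the stimulus labels, trial data, and the counting rule for deriving dissimilarities are assumptions made for illustration.

```python
# Sketch: odd-one-out triad judgments -> pairwise dissimilarities -> MDS.
# Stimuli, trials, and the dissimilarity rule are hypothetical.
import itertools
import numpy as np
from sklearn.manifold import MDS

hand_images = ["own", "fake_A", "fake_B", "fake_C"]     # hypothetical stimuli
index = {name: i for i, name in enumerate(hand_images)}
n = len(hand_images)

# Each trial: the three images shown and the one chosen as the odd one out.
trials = [
    (("own", "fake_A", "fake_B"), "fake_B"),
    (("own", "fake_A", "fake_B"), "fake_A"),
    (("own", "fake_A", "fake_C"), "fake_C"),
    (("own", "fake_B", "fake_C"), "fake_C"),
    (("fake_A", "fake_B", "fake_C"), "fake_A"),
    (("fake_A", "fake_B", "fake_C"), "fake_C"),
]

votes = np.zeros((n, n))    # how often a pair was judged dissimilar
counts = np.zeros((n, n))   # how often a pair appeared together

for triad, odd in trials:
    for a, b in itertools.combinations(triad, 2):
        i, j = index[a], index[b]
        counts[i, j] += 1
        counts[j, i] += 1
        if odd in (a, b):   # pairs containing the odd item count as dissimilar
            votes[i, j] += 1
            votes[j, i] += 1

# Proportion of "dissimilar" outcomes per pair (0 where a pair never co-occurred).
dissimilarity = np.divide(votes, counts, out=np.zeros_like(votes), where=counts > 0)

# Non-metric MDS on the precomputed dissimilarities; in practice the number of
# dimensions would be chosen by comparing stress across candidate solutions.
mds = MDS(n_components=2, metric=False, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissimilarity)
print("stress:", round(mds.stress_, 3))
print(dict(zip(hand_images, coords.round(2))))
```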
Second, we tested whether a multisensory bodily signal manipulation (the rubber hand
illusion) influences the position of the viewed artificial hand in hand space. We
employed synchronous and asynchronous stroking and found that the artificial hand was
closer to the center of hand space, that is, to a prototype hand, in the synchronous
condition, which elicits the rubber hand illusion, than in the asynchronous condition,
which does not. We discuss these findings in the context of the Carruthers
(2013) conceptual space model as well as other rubber hand illusion models.
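The sketch below illustrates how the second comparison could be quantified, under the assumption that each stroking condition yields its own set of hand-space coordinates and that "center of hand space" is operationalized as the centroid of all stimuli; the coordinates shown are hypothetical, and this is not the authors' analysis code.

```python
# Sketch: distance of the artificial hand from the center of hand space
# (the centroid of all stimuli), computed separately per stroking condition.
import numpy as np

def distance_to_center(coords: np.ndarray, artificial_idx: int) -> float:
    """Euclidean distance between one stimulus and the centroid of all stimuli."""
    center = coords.mean(axis=0)
    return float(np.linalg.norm(coords[artificial_idx] - center))

# Hypothetical 2-D hand-space coordinates (rows: stimuli; last row: artificial hand).
sync_coords = np.array([[0.9, -0.2], [-0.4, 0.7], [-0.6, -0.5], [0.1, 0.1]])
async_coords = np.array([[0.9, -0.2], [-0.4, 0.7], [-0.6, -0.5], [0.8, 0.9]])

print("synchronous:", round(distance_to_center(sync_coords, artificial_idx=-1), 2))
print("asynchronous:", round(distance_to_center(async_coords, artificial_idx=-1), 2))
```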