In this study we address the question of the extent to which the visual and haptic modalities contribute to the formation of a complex multisensory perceptual space. By varying three shape parameters, a physical shape space of shell-like objects was generated. Participants explored the objects either by vision alone, by touch alone, or using both senses. Similarity ratings were collected and analyzed using multidimensional scaling (MDS) techniques. By comparing the unimodal perceptual spaces to the multimodal perceptual space, we assessed the impact of the visual and haptic modalities on the combined percept. We found that neither the visual nor the haptic modality dominated the final percept; rather, the two modalities contributed to the combined percept almost equally. To investigate to what degree these results are transferable to natural objects, we performed the same visual, haptic, and visuo-haptic similarity ratings and multidimensional scaling analyses using a set of natural sea shells. Again, we found almost equal contributions of the visual and the haptic modalities to the combined percept. Our results suggest that multisensory perceptual spaces are based on a complex combination of object information gathered by different senses.
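The MDS analysis described above can be illustrated with a minimal sketch. Note that this is an assumed workflow, not the authors' actual analysis pipeline: the dissimilarity matrix here is randomly generated stand-in data, whereas in the study it would be derived from averaged pairwise similarity ratings, and the embedding dimensionality would be chosen based on stress values.

```python
import numpy as np
from sklearn.manifold import MDS

# Hypothetical dissimilarity matrix for 5 objects (symmetric, zero
# diagonal). In the study, entries would come from participants'
# pairwise similarity ratings, converted to dissimilarities.
rng = np.random.default_rng(0)
a = rng.random((5, 5))
dissim = (a + a.T) / 2.0
np.fill_diagonal(dissim, 0.0)

# MDS embeds the objects into a low-dimensional "perceptual space"
# whose inter-point distances approximate the dissimilarities.
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissim)

print(coords.shape)  # one 2-D coordinate per object: (5, 2)
```

Separate embeddings would be computed for the visual, haptic, and visuo-haptic rating conditions, and the resulting configurations compared (e.g. after Procrustes alignment) to gauge each modality's contribution to the combined percept.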