Speaker: Nicholas Colonnese - Meta Reality Labs Research
Description
Recent announcements across technology sectors highlight continued interest in telepresence, immersive environments, and the embodiment of artificial devices. Advances in visual and auditory sensory feedback, especially through virtual reality systems, have enabled more realistic virtual environments. Furthermore, visions of a metaverse, in which all aspects of our physical reality can be experienced and/or augmented, have become a mainstream topic of discussion. These global trends do not yet acknowledge the importance of touch in achieving goals such as telepresence, immersive environments, and the embodiment of artificial devices. This Cross-Cutting Challenges session brings together experts from a range of scientific disciplines to discuss the ways in which tactile feedback influences our sense of self and our natural boundaries.
The divide between ourselves and our devices is defined by the sensory information received by our brains. At present, prosthetic and robotic devices are not embodied because they lack the integrated multi-sensory information that “closes the loop” and incorporates the device into our body schema. Understanding the interplay between the brain and the machine, the conscious and the subconscious, and the various sensory modalities required to elicit presence or embodiment demands an interdisciplinary focus. Here we bring together experts from the fields of neuroscience, phenomenology, biomechanics, neural engineering, and computer science to discuss the importance of touch for distributed embodiment, best practices for eliciting presence or embodiment, and new ideas on how to measure these elusive concepts. Each field brings its own skills and best practices to this effort:
Neuroscientists study neurological pathologies that cause changes in the embodiment of the body and its limbs.
Phenomenologists tease out how mood and the preconscious shape our experiences and influence how embodiment can be created.
Biomechanists who study the neural control of movement can model and explicate how physiological feedback loops create the sense of ownership and agency within our intact physiological systems.
Neural Engineers use sensory restoration techniques to create artificial tactile sensations through central and peripheral nerve interfaces.
Human-Computer Interface Engineers can replicate biological control systems through distributed digital systems that can be physically dislocated and exist in virtual reality.
End Users and Stakeholders interface directly with the digital systems and experience the multi-sensory integration provided by the VR or prosthetic device.
We will bring together experts across these fields to discuss the importance of touch in creating a sense of presence or embodiment, and the extent to which this can occur across distributed systems. We will discuss whether tactile perception is an experience of a stimulus outside the brain, or whether the brain shapes the perception based on prior experiences and future predictions. Furthermore, we will debate how those perceptions combine to engender a feeling of embodiment, and how we might quantify the embodiment of artificial systems such as avatars or prosthetic limbs. Interactive sessions will include presentations and demonstrations from researchers working on examples of distributed embodiment, such as teleoperated robotic avatars and neuroprosthetic systems. The goal of this effort is to highlight best practices across our fields and apply them to challenges such as creating telepresence in virtual reality systems and incorporating artificial limbs into the body schemas of individuals with limb loss.
Soft embodiment for engineering artificial limbs
Tamar Makin - University College London, United Kingdom
Synopsis
The nervous system, and the sensorimotor loop in particular, provides an existing model of the external world and predictions of how the external world can be manipulated by our actions. I will argue that these existing neural loops are not doomed to function only with devices engineered to perform the same functions as the body, and in the same way. This neural toolkit constitutes a powerful potential means for information compression, error-based learning, and reducing the cognitive load of artificial limb use, and it could be exploited for a new range of purposes. I propose that (neuro)engineers should aim to create artificial limbs that are not enslaved to the body template but instead exploit body processing for their own purposes. Soft embodiment invites engineers to redefine the function of a given neural process or computation. By shifting the function of existing neural pathways, we will have the opportunity to adapt more flexibly to new opportunities for improved motor control.
Expectancies and phenomenological control in the rubber hand illusion
Peter Lush - University of Sussex, United Kingdom
Synopsis
When expectations for experience (expectancies) are not controlled in psychological experiments, measures of changes in experience may reflect the top-down generation of experience to meet expectancies (phenomenological control) rather than, or in addition to, other posited mechanisms. Consistent with this argument, there are substantial relationships between trait phenomenological control and measures of a range of effects studied by scientists (e.g., ASMR, mirror synaesthesia, and the visually evoked auditory response). This talk will focus on the rubber hand illusion (RHI), in which participants report an experience of ownership of a fake hand. I will discuss problems with RHI control methods, consider the degree to which relationships between RHI measures and trait phenomenological control may confound the interpretation of previous studies, and propose methods for developing controls for expectancy effects in body illusions.
Haptics in the Metaverse: Opportunities and Challenges
Nicholas Colonnese - Meta Reality Labs Research, USA
Synopsis
Augmented and virtual reality interfaces promise to revolutionize human-computer interaction, but current technologies allow for the reproduction of only two of the five human senses: sight and hearing. I believe that the next major disruptive technological step will be novel haptic hardware and software technologies that unlock the sense of touch in virtual worlds. In this talk I will first present an abstraction of the “haptic stack,” which aims to convey the tremendous breadth required for conducting haptic research. Then, I will present what I believe are some of the biggest opportunities and challenges for enabling and enjoying touch in the virtual worlds of the future.