Interaction between Real and Virtual Worlds

Virtual reality (VR) technology, long associated solely with video games, has since experienced a major boom, particularly in the field of training. The "democratisation" of VR headsets has played a large part in this: the number of headsets sold exploded from 5 million units in 2014 to 68 million in 2020, their cost has dropped and the technology itself has evolved. We now speak of immersive technologies, encompassing virtual, augmented and mixed realities (VR/AR/MR). UTC has been a pioneer in the field: it introduced teaching in virtual reality as early as 2001 and launched, within its Heudiasyc laboratory, research at both the fundamental and applied levels. The interaction between the real and virtual worlds opens up immense fields of application, particularly in relation to robotics. For example, we can interact with a drone that maps the damage caused by a natural disaster in places that have become inaccessible. Obviously, these new possibilities can also be used for malicious purposes, which raises several ethical issues. UTC's academics are well aware of this.

The rapid growth of VR (virtual reality) and AR (augmented reality) research

A lecturer and research scientist, Indira Thouvenin set up a virtual reality (VR) activity, and more recently an augmented reality (AR) one, when she arrived at UTC in 2001. An activity that spans teaching as much as research.

Initially attached to the UTC-Roberval laboratory, Indira Thouvenin joined the UMR CNRS UTC Heudiasyc Lab in 2005. Among her fields of research? "I am working on informed interaction in a virtual environment, in other words 'intelligent' interaction where VR adapts to the user. This involves analysing the user's gestures, behaviour and gaze. We then define descriptors for his or her level of attention, intention, concentration or understanding, for example. I also work on interaction in an augmented environment. In this case, the aim is to design explicit and implicit assistance, the latter being invisible to the user," she explains.

This research leads to applications in the fields of training, health, education and industry, among others. "The descriptors allow us to characterise the user's attitude, in particular his or her level of concentration or distraction, as well as his or her skills: does the user already have experience, or is he or she a beginner? From all this data, we select adaptive sensory feedback, since VR uses visual, sound, tactile or haptic feedback. All this sensory feedback helps the user to understand the environment without imposing every possible cue, only those most appropriate to him or her. In short, it is a highly personalised modelling of interaction in a virtual environment," explains Indira Thouvenin.
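As a purely illustrative sketch of this idea (the descriptor names, thresholds and modalities below are hypothetical, not Heudiasyc's actual model), selecting adaptive feedback from user descriptors might look like this:

```python
# Hypothetical sketch of adaptive sensory-feedback selection.
# Descriptors, thresholds and rules are illustrative assumptions only.

def select_feedback(attention: float, experience: str) -> list[str]:
    """Pick the feedback modalities most appropriate to the user.

    attention  -- estimated attention level in [0, 1]
    experience -- "beginner" or "expert"
    """
    feedback = []
    if attention < 0.5:
        # A distracted user gets a salient visual cue.
        feedback.append("visual")
    if experience == "beginner":
        # Beginners receive additional audio guidance.
        feedback.append("audio")
    if attention < 0.3:
        # Strong inattention also triggers haptic feedback.
        feedback.append("haptic")
    return feedback
```

For example, a distracted beginner would receive both visual and audio cues, while an attentive expert would receive none, matching the principle of imposing only the cues appropriate to the user.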

A task that calls for a great deal of simulation and modelling? "For my part, I work on three simulation platforms dedicated respectively to railways, the autonomous vehicle and VR. The latter consists of two main pieces of equipment: VR headsets, intended in particular for teaching, and the Cave system (cave automatic virtual environment, i.e. an immersive room) in which you can visualise at a 1:1 scale. We have a Cave with four sides, each supporting a screen with a rear-projection system. The screens are displayed in stereoscopy, i.e. 3D. The Cave allows you to see your own body, whereas in a headset you have to reconstruct a virtual body," explains Yohan Bouvet, head of Heudiasyc's simulation & modelling department.

Today, virtual reality and augmented reality are booming in various fields. Hence the multiplication of projects both at the academic and application levels.

A flagship augmented reality project? "We are working with Voxar, a Brazilian laboratory specialising in augmented reality, to develop an augmented windscreen for semi-autonomous cars. The project is the subject of Baptiste Wojtkowski's doctorate, as part of the chair of excellence on intelligent surfaces for the vehicle of the future, financed by Saint-Gobain, the UTC foundation, the FEDER (the European Regional Development Fund) and the Hauts-de-France region," she points out.

What does this mean in concrete terms? "It will be a question of defining what we will display on this 'future window'. Should there be visual feedback from AR? How and when? When does the user need to understand the state of the robot (the vehicle), and what does the vehicle understand about the user's actions? In real time, we will not display the same feedback all the time but take into account the user's fatigue, concentration or lack of concentration," explains Indira Thouvenin.

Other projects in progress? "In particular, there is a project on 'social touch' between a virtual agent embodied in the Cave and a human. This project, funded by the ANR and led by Télécom Paris, brings together UTC-Heudiasyc and ISIR (the UPMC robotics laboratory). Within this project, called 'SocialTouch', Fabien Boucaud is doing his PhD, developing the agent's decision model. Finally, the last project underway is Continuum. The idea? To bring together the majority of France's VR platforms, thirty in all, in order to advance interdisciplinary research between human-computer interaction, virtual reality and the human and social sciences," she concludes.