Virtual reality (VR) technology, long associated solely with video games, has since experienced a major boom, particularly in the field of training. The “democratisation” of VR headsets has played a large part in this: the number of headsets sold exploded from 5 million units in 2014 to 68 million in 2020, their cost has dropped and the technology itself has evolved. We now talk about immersive technologies encompassing virtual, augmented and mixed realities (VR/AR/MR). UTC has been a pioneer since it introduced teaching in virtual reality as early as 2001 and launched, within its Heudiasyc laboratory, research at both the fundamental and application levels. The interaction between the real and virtual worlds opens up immense fields of application, particularly in relation to robotics. For example, we can interact with a drone that maps the damage caused by a natural disaster in places that have become inaccessible. Obviously, these new possibilities can also be put to malicious use, and this raises several ethical issues. UTC’s academics are aware of this.
Professor Philippe Bonnifait has been director of the UTC-Heudiasyc Laboratory, created in 1981, since January 2018. It is a cutting-edge laboratory dedicated to digital sciences, specialising in scientific methods related to artificial intelligence, robotics, data analysis, automation and virtual reality.
Created in 1981 and associated with the CNRS since its foundation, the UTC-Heudiasyc Laboratory (Heuristics and Diagnosis of Complex Systems) is attached to the INS2I (Institute of Information Sciences and their Interactions), one of the ten CNRS national institutes. It was previously headed by Prof. Ali Charara.
Can we single out a particularity of UTC-Heudiasyc? “It is a laboratory that brings together research scientists in computer science and computer engineering, two specialities that are often addressed separately; we are one of the first laboratories in France to have adopted this vision,” explains Philippe Bonnifait.
As the world of computer science is, by nature, constantly evolving, the themes addressed by the laboratory’s lecturer-cum-research scientists have themselves naturally evolved, and quite considerably so. The proof? The development of research in the field of virtual reality. “A development that coincided with the arrival at UTC of Indira Thouvenin in 1995, a personality who is very well known in her field. The projects led by Indira were in strong interaction with cognition and the human and social sciences, and involved numerous collaborations with other laboratories, notably UTC-Costech,” emphasises Prof. Bonnifait.
Since the restructuring of the laboratory in January 2018 and the down-sizing from four to three teams — CID (Knowledge, Uncertainty, Data), SCOP (Safety, Communication, Optimisation) and SyRI (Interacting Robotic Systems) — various interdisciplinary themes have been identified. This is the case of VR (virtual reality), an eminently cross-disciplinary subject, which thus ‘straddles’ the SyRI and CID teams. “Within CID, we are interested in adaptive systems and the personalisation of systems, themes where we find a number of elements linked to immersive environments, one of Domitile Lourdeaux’s research fields. At SyRI, where Indira Thouvenin is based, we are particularly interested in robotics, with subjects such as robot autonomy (smart autonomous vehicles and drones, in our case), control, perception and data fusion, and interacting multi-robot systems,” explains Philippe Bonnifait. In short? “In the case of multi-robot systems, there are three types of interaction: with the environment, with other robots and, finally, with humans. In my opinion, it is very important to put the human being back at the centre of all the projects we develop. If we take the concrete case of the windscreen of the future, for example, a project by Indira, it is a question of creating mixed reality on the windscreen with head-up displays, an application intended in the long term for the autonomous vehicle,” he explains.
These interactions are evolving as we talk more and more about symbiotic autonomous systems. “We are moving towards the symbiosis of intelligent machines with humans in a large number of areas. One example? The Xenex robot, an American invention, which has been deployed in a number of hospitals to disinfect rooms using UV light. This is very useful, especially in these times of Covid,” he says. While in this case we are dealing with a truly autonomous machine, this is not the case in all sectors. “Let’s take drones. They always need a remote pilot, and what we are trying to do at Heudiasyc is to have one remote pilot for several drones; a remote pilot who can take control in the event of a problem, because a drone in a crowd, for example, can cause a lot of damage. This is also the case for the autonomous vehicle, which, even in the long term, will need the driver’s vigilance. However, there is a middle way open to us: moving towards this human-machine symbiosis so that we can imagine new ways to drive,” adds Philippe Bonnifait.
The interaction between the real and virtual worlds opens up huge fields of application. For the better (the drone that monitors the state of drought in certain areas) and for the worse (certain military applications). This poses eminently ethical problems. “For a long time, ethics remained rather remote from our concerns, but the technological developments of the last ten years, with dangerous or even lethal uses, have put ethics back at the centre of reflections on AI (artificial intelligence).”
When she arrived at the UTC in 2001, the lecturer-cum-research scientist Indira Thouvenin set up a virtual reality (VR) activity, followed more recently by an augmented reality (AR) one. An activity to be understood as taking place as much at the level of course work as at the level of research.
Initially attached to the UTC-Roberval laboratory, Indira Thouvenin joined the UMR CNRS UTC-Heudiasyc Lab in 2005. Among her fields of research? “I am working on informed interaction in a virtual environment. In other words, ‘intelligent’ interaction where VR adapts to the user. This involves analysing the user’s gestures, behaviour and gaze. We then define descriptors for his or her level of attention, intention, concentration or understanding, for example. I also work on interaction in an augmented environment. In this case, the aim is to design explicit and implicit assistance, the latter being invisible to the user,” she explains.
This research is leading to applications in the fields of training, health, education and industry, among others. “The descriptors allow us to define the user’s attitude, in particular his or her level of concentration or distraction, and skills: does he or she already have experience, or is he or she a beginner? From all this data, we make a selection of adaptive sensory feedback, since VR uses visual, sound, tactile or haptic feedback. All this sensory feedback allows us to help the user understand the environment without imposing every channel, only those most appropriate to him or her. In short, it is a highly personalised modelling of interaction in a virtual environment,” explains Indira Thouvenin.
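The kind of adaptive selection described above can be sketched in a few lines. Everything below is illustrative: the descriptors, thresholds and channel names are invented for the example, not Heudiasyc’s actual model.

```python
from dataclasses import dataclass

@dataclass
class UserDescriptors:
    """Hypothetical descriptors derived from the user's gestures, behaviour and gaze."""
    concentration: float  # 0.0 (distracted) .. 1.0 (focused)
    expertise: float      # 0.0 (beginner)   .. 1.0 (expert)

def select_feedback(user: UserDescriptors) -> list[str]:
    """Return only the sensory channels most appropriate to this user,
    rather than imposing every available feedback at once."""
    channels = []
    if user.expertise < 0.5:
        channels.append("visual")   # beginners get explicit visual guides
    if user.concentration < 0.5:
        channels.append("haptic")   # a tactile nudge to regain attention
    if not channels:
        channels.append("sound")    # focused experts: a minimal, unobtrusive cue
    return channels
```

For a distracted beginner, `select_feedback` picks both visual and haptic channels; for a focused expert, only a discreet sound cue.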
A task that calls for lots of simulation and modelling? “For my part, I work on three simulation platforms dedicated respectively to the railway, the autonomous vehicle and VR. The latter consists of two main pieces of equipment: VR headsets intended, in particular, for teaching, and the Cave system (cave automatic virtual environment, i.e. an immersive room) in which you can visualise at 1:1 scale. We have a Cave with four sides, each supporting a screen with a rear-projection system. The screens are displayed in stereoscopy, i.e. 3D. The Cave allows you to see your own body, whereas with a headset you have to reconstruct a virtual body,” explains Yohan Bouvet, head of Heudiasyc’s simulation & modelling department.
Today, virtual reality and augmented reality are booming in various fields. Hence the multiplication of projects at both the academic and application levels. A flagship augmented reality project? “We are working with Voxar, a Brazilian laboratory specialising in augmented reality, to develop an augmented windscreen for semi-autonomous cars. A project that is the subject of Baptiste Wojtkowski’s doctorate, as part of the chair of excellence on intelligent surfaces for the vehicle of the future, financed by Saint-Gobain, the UTC foundation, the FEDER (European Regional Development Fund) and the Hauts-de-France region,” she points out.
What does this mean in concrete terms? “It will be a question of defining what we will visualise on this ‘future window’. Should there be visual feedback from AR? How and when? When does the user need to understand the state of the robot (vehicle), and what does the vehicle understand about the user’s actions? In real time, we will not display the same feedback all the time but will take into account the user’s fatigue, concentration or lack of concentration,” explains Indira Thouvenin.
Other projects in progress? “In particular, there is a project on ‘social touch’ between a virtual agent embodied in the Cave and a human. This is a project funded by the ANR, led by Télécom Paris and integrating UTC-Heudiasyc and ISIR (the UPMC robotics laboratory). In this project, called ‘Socialtouch’, Fabien Boucaud is doing his PhD and developing the agent’s decision model. Finally, the last project underway is Continuum. The idea? To bring together the majority of VR platforms in France, thirty in all, in order to advance interdisciplinary research between human-computer interaction, virtual reality and the human and social sciences,” she concludes.
Pedro Castillo is a CNRS research scientist at the UTC-Heudiasyc Lab and a member of the Robotic Systems in Interaction (SyRI) team. He specialises in automatic control applied to robotics. His research focuses, in particular, on the automatic control of UAVs: autonomous UAVs but also, more recently, virtual UAVs.
Arriving from Mexico with a scholarship, Pedro Castillo started a PhD thesis at UTC in 2000 in automatic control, more particularly on the automatic control of drones.
His thesis won him the national prize for the best thesis in automatic control in early 2004. He then did a series of post-docs in the United States at the Massachusetts Institute of Technology (MIT), in Australia and in Spain, before applying for a position with the CNRS, which he joined in 2005, appointed to the Heudiasyc mixed research unit, where he continued his research on the control of miniature drones. “As early as 2002, during my thesis, we conducted the initial tests. We were the first team in France to work on this subject at that time and one of the first to develop an autonomous four-rotor drone,” he explains.
This explains the recognition that the laboratory enjoys in this field, in terms of both theoretical and experimental research. “UTC-Heudiasyc is known for having developed fundamental platforms. It was during my thesis that we started to develop experimental platforms dedicated to flight modes in order to validate the theoretical research we were carrying out. As early as 2005, we worked on setting up a common platform for the validation of aerial drone control systems, which was completed in 2009,” he explains.
While many researchers are placing their bets on fully autonomous drones, Heudiasyc has made a different one. “We realised that we cannot leave all the work to the drone. There may be situations that it will not be able to handle. In this case, the human must be able to take over. In robotics, we talk about a control loop in which the human can interact with the robot at any time,” says Pedro Castillo.
Until recently, rather classical approaches have dominated the field of human-machine interaction: those using joysticks or remote operation, which, thanks to feedback from the system to the operator, allow the control of a remote robot. The idea of introducing virtual reality? “It’s to introduce visual and audio feedback. In a word: to see what the robot sees by equipping it with cameras, and to be able to hear, for example, in the event of a motor problem. This makes it easier to control, and therefore navigate, the drone,” he explains.
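The control loop evoked here, in which the remote pilot can take over from the autonomous controller at any moment, can be sketched as follows. The interface and the toy autopilot are invented for illustration, not Heudiasyc’s actual system.

```python
class SharedControlLoop:
    """Minimal sketch of a human-in-the-loop controller: the robot follows
    its autonomous controller unless the operator sends a command."""

    def __init__(self, autopilot):
        self.autopilot = autopilot  # function: state -> command

    def step(self, state, human_cmd=None):
        # A human command always pre-empts the autonomous controller.
        if human_cmd is not None:
            return human_cmd
        return self.autopilot(state)

# Usage with a toy altitude-hold autopilot for a drone.
hold_altitude = lambda state: {"thrust": 1.0 if state["altitude"] < 10 else 0.5}
loop = SharedControlLoop(hold_altitude)
loop.step({"altitude": 5})                             # autonomous: climb
loop.step({"altitude": 5}, human_cmd={"thrust": 0.0})  # pilot takes over
```

In a real system, the `state` would carry the visual and audio feedback mentioned above, so that the pilot decides when to intervene.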
But Pedro Castillo and Indira Thouvenin decided to go further and explore a new theme: virtual robotics. “We decided to represent our aerial drone test room in the Cave, i.e. with highly immersive technology. We also created a small virtual drone that can be manipulated, given different trajectories and assigned different missions. It is a sort of assistant to the real drone, since the latter will then carry out all the tasks that the operator indicates to the virtual drone,” explains Pedro Castillo.
Concrete applications? “We are interested in civil applications, and there are many. We can mention the inspection of buildings, for example, or the aftermath of a natural disaster. There may be inaccessible places, and drones allow us to take stock of the material or human damage and act quickly and in the right place,” he concludes.
Sébastien Destercke is a CNRS research scientist and head of the Knowledge, Uncertainty and Data (CID) team at UTC-Heudiasyc. His field of research concerns modelling and reasoning under uncertainty, in particular in the presence of high or severe levels of uncertainty.
Concretely? “Strong uncertainties are defined as missing or imprecise data, or poor or qualitative information.”
What are the underlying ideas? “The main idea is to model this type of information in a mathematical language in order to carry out reasoning tasks. This can be automatic learning, i.e. learning from examples, or making decisions in the face of uncertainty,” explains Sébastien Destercke.
And the link with virtual reality? “At Heudiasyc, we have strong expertise in theories that generalise probabilities, such as theories of evidence or imprecise probabilities. These are rich mathematical languages that allow uncertainty and incompleteness of information to be modelled in a very accurate manner. Such expressiveness is particularly useful in certain applications of virtual reality, especially in the design of training aids. Among the uncertainties requiring fine modelling, we can cite those concerning the learner’s competence profile or even his or her emotional states. Taking uncertainty into account in the reasoning makes it possible to better adapt training scenarios, which can be better personalised for each profile,” he adds.
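To make the “theories of evidence” mentioned here concrete, the sketch below implements Dempster’s rule of combination over a two-element frame describing a learner’s profile. The frame, the two sources and the mass values are invented for the example; only the combination rule itself is standard.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose focal
    sets are frozensets and whose masses each sum to 1."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass falling on contradictory focal sets
    # Renormalise by the non-conflicting mass.
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

def belief(m, hypothesis):
    """Bel(A): total mass committed to subsets of the hypothesis."""
    return sum(w for s, w in m.items() if s <= hypothesis)

# Two sources assess a learner over the frame {novice, expert}; mass on the
# whole frame encodes ignorance rather than a forced 50/50 probability.
N = frozenset({"novice"})
E = frozenset({"expert"})
FRAME = N | E
m1 = {N: 0.6, FRAME: 0.4}          # gesture analysis: probably a novice
m2 = {N: 0.5, E: 0.2, FRAME: 0.3}  # questionnaire: less committed
combined = dempster_combine(m1, m2)
```

The point of the formalism, as opposed to a single probability, is precisely the residual mass on `FRAME`: the system can represent what it does not know about the learner.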
This work on uncertainty has, among other things, led to the Kiva project, which is built around an “informed virtual environment for training in technical gestures in the field of aluminium cylinder head manufacturing”; a project that won an award in the “Training and Education” category at the Laval Virtual trade fair. What is Kiva’s objective? “We focused on a specific training gesture: how to blow impurities off the surface of aluminium-alloy cylinder heads that have just been cast. The aim is to get the trainee to reproduce the ‘expert’ gesture. However, in this situation, we have at least two sources of uncertainty: on the one hand, the ‘expert’ gesture can change from one expert to another; on the other, the recognition of the gesture cannot be done perfectly. The trainee is equipped with sensors; he or she will make movements that are not necessarily regular or collected in a continuous manner. This gives us partial information on the basis of which we must recognise the trainee’s gesture and try to measure how well it matches the ‘expert’ gesture. The system that guides the learner must therefore include these sources of uncertainty,” adds Sébastien Destercke.
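In outline, the matching step described here, scoring an imperfectly captured trainee gesture against several expert references, might look like the following. The trajectory format and the tolerance are illustrative assumptions, not the Kiva system’s actual model.

```python
import math

def gesture_distance(traj_a, traj_b):
    """Mean point-to-point distance between two gestures sampled as (x, y, z)
    positions. Assumes equal length; real captures would need resampling or
    dynamic time warping first."""
    return sum(math.dist(p, q) for p, q in zip(traj_a, traj_b)) / len(traj_a)

def match_to_experts(trainee, expert_gestures, tolerance):
    """The 'expert' gesture varies from one expert to another, so score the
    trainee against every reference and keep the closest. The tolerance
    absorbs sensor noise and irregular capture."""
    best = min(gesture_distance(trainee, expert) for expert in expert_gestures)
    return best <= tolerance, best
```

A fuller treatment would replace the single tolerance with the evidential machinery discussed above, so that “gesture recognised” itself carries a degree of belief.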
“An ‘expert’ gesture that the learner is asked to reproduce in a Cave. The use of virtual reality in companies for training purposes is recent. Until now, it has often been limited to improving the ergonomics of workstations, where issues related to man-machine interaction are less critical. With training objectives such as these, those issues have become fundamental,” he concludes.
Domitile Lourdeaux, a university lecturer at UTC, is a member of the Knowledge, Uncertainty, Data (CID) research team at the UTC-Heudiasyc Lab. Her investigations focus on personalised adaptive systems in VR (Virtual Reality).
VR is a field of research that she has been exploring since gaining her PhD. “I am interested in the adaptation of scripted content according to a dynamic user profile and, more particularly, in virtual environments dedicated to training. Inasmuch as we are dealing with learners, I am working in particular on the adaptation of educational content and narration. In a word: how to stage learning situations in virtual reality (VR),” explains Domitile Lourdeaux.
The fields of application are varied, and past or current projects attest to this. There was Victeams, on the training of ‘medical leaders’, a project funded by the ANR and the French Directorate General of Armaments (DGA), and there are the ongoing programmes Orchestraa and Infinity, the latter a European project involving eleven countries and twenty partners, including Manzalab, coordinated by Airbus Defense & Space. The former was launched in November 2019, thanks to funding from the DGA, and has among its partners Réviatech, Thalès and CASPOA, a NATO centre of excellence; the latter was launched in June 2020. Both run for three years.
“Orchestraa aims to train air command leaders in military air operations centres. There are about forty people in the room coordinating military operations. The learner is given a VR headset and interacts with his team of autonomous virtual characters; these are linked to fighter pilots, helicopters, drones, etc. They will therefore play out a scenario planned several days in advance, but on the day there may be unforeseen circumstances that disrupt the original scenario and that they must be able to manage. These may be unforeseen requests for air support or a plane crashing, in which case the pilots must be rescued in the field. These hazards will require a reallocation of resources, ultimately creating a chain reaction impacting the entire operation. In this case, the adaptation concerns the difficulty of the scenario, which will increase according to what the learner is capable of managing. This will require them to demonstrate progressive skills,” she explains.
Infinity concerns a completely different field. This large-scale European project aims to provide tools, namely AI (artificial intelligence) and VR for data visualisation and analysis, to improve the collaboration of European police forces in investigations into cybercrime and terrorism. “We have three use cases: analysis of the behaviour of cyberattacks during an ongoing event, rapid analysis in the aftermath of a terrorist attack and, finally, hybrid threats resulting from the convergence of cyberterrorism and terrorism,” explains Domitile Lourdeaux.
Infinity is a project in which UTC-Heudiasyc leads a monitoring tool producing recommendations to guarantee the well-being of police officers using VR, a technology that does cause side effects due to the headsets (cybersickness, visual fatigue) and can also generate cognitive overload and stress related to the tasks performed in VR. “In this project, we are focusing on the side effects. We try to measure them in real time while users have the headset on, in order to diagnose their condition. We are interested in three states in particular: cybersickness, common to all VR headset users; mental load, related to the complexity of the task; and stress, with its multiple causes. To detect these side effects, we use physiological sensors (electrocardiograms (ECGs), electrodermal activity, oculometry and pupillometry), behavioural data (task efficiency) and questionnaires,” stresses Alexis Souchet, a post-doctoral researcher at the UTC-Heudiasyc Lab.
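A threshold-based diagnosis of this general kind could be sketched as below. The signals, thresholds and decision rules are entirely hypothetical placeholders; the project combines physiological, behavioural and questionnaire data in ways this toy example does not attempt to capture.

```python
def diagnose_state(heart_rate, electrodermal, pupil_diameter, task_errors):
    """Toy real-time diagnosis of the three monitored states from a few
    sensor readings (hypothetical thresholds, for illustration only)."""
    return {
        # Stress: elevated heart rate or strong electrodermal activity.
        "stress": heart_rate > 100 or electrodermal > 0.8,
        # Mental load: dilated pupils combined with degraded task efficiency.
        "mental_load": pupil_diameter > 6.0 and task_errors > 3,
        # Cybersickness: physiological arousal plus markedly degraded performance.
        "cybersickness": heart_rate > 100 and task_errors > 5,
    }
```

Each flag would then feed the recommendation tool, e.g. suggesting a break when cybersickness is detected.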
Virtual reality does, however, raise various legal and ethical problems. “Manufacturers are going to make VR tools available to people without taking into account the harmful impacts on their health. This is a real problem when you consider the explosion in sales of headsets, which have risen from 5 million units in 2014 to 68 million in 2020,” concludes Domitile Lourdeaux.