Hi everyone, welcome to episode 5 of Lost in Immersion, your weekly 45-minute stream about innovation. As VR and AR veterans, we will discuss the latest news from the immersive industry. So let's go. Fabien, if you want to start, please. Thanks. Today I want to talk about the use of VR in medicine. Maybe what comes to mind first when we hear about medical VR is assistance for surgery, that kind of usage. But there is actually a much more niche field where VR has a very positive impact, and it has been used there for many, many years, starting more than 20 years ago. This is mental health and managing fears: for example, the fear of heights, the fear of spiders, or the fear of flying. What's really good about VR is the level of immersion it can bring. Let's take the flying example: the user can be transported into a plane, but at the same time they know they are safe, because they are not actually flying. Still, the immersion a VR headset provides is enough to trigger the fear response. It then works like exposure therapy: gradually, with repeated exposure to the same stimuli, the fear slowly gets better and better. There are other applications more on the mindfulness or meditation side, where by cutting users off from their surroundings and bringing them into another environment, maybe a quieter space with special visuals and sound, the mindfulness practice can be much more effective. I remember trying a VR experience years and years ago where we had to walk on a very thin plank between two buildings. I don't have a fear of heights, but that first step onto the plank was really something. I think you guys might have tested that as well.
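The graded-exposure loop described here, repeating the same stimulus and only raising its intensity once the fear response settles, is simple enough to sketch as a tiny session controller. Everything below (function name, rating scale, thresholds) is hypothetical and for illustration only, not taken from any real therapy software:

```python
# Hypothetical sketch of graded VR exposure: advance the scenario
# (e.g. plank height) only once the self-reported fear rating drops
# below a comfort threshold. All names and thresholds are illustrative.

def next_level(level: int, fear_rating: int, max_level: int = 10,
               comfort_threshold: int = 3) -> int:
    """Return the exposure level for the next session.

    fear_rating: 0 (calm) .. 10 (panic), reported after the session.
    """
    if fear_rating <= comfort_threshold:
        return min(level + 1, max_level)   # habituated: raise intensity
    if fear_rating >= 8:
        return max(level - 1, 1)           # overwhelmed: step back
    return level                           # repeat the same exposure

# Example course of sessions: ratings drop as the user habituates.
level = 1
for rating in [7, 5, 3, 6, 2, 2]:
    level = next_level(level, rating)
print(level)  # 4
```

A real protocol would of course be driven by a clinician, but the progression logic is essentially this.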
I'm not sure, but I'm curious to know, first, whether you have experienced such applications, and how you think we can use them more in the future. Yes, I agree, there is a lot to be done when you see the impact on someone who has a fear of heights, like my girlfriend, for example. When she tried that exact experience you described earlier, she was so scared she could barely take a single step onto the plank. So I understand that it can really help, slowly, like you said: doing the experience several times in small steps, maybe increasing the height after a while. That can help people get used to difficult situations and feel less fear when they actually encounter that kind of environment. And in medicine there are certainly many more use cases for VR, like training doctors, even with force feedback on virtual patients, so the real movements can be practiced with physical material. Also in vision therapy, exercising the eye muscles: you do the exercise while looking at something, but the headset can also track the eyes and check whether they are moving correctly, gather data on that, build a better understanding of the patient's condition, and really target that specific issue. I know a lot of work is being done in that area, and it seems to help a lot. So yes, there are definitely many medical use cases. What are your thoughts on that? First of all, to respond to Fabien: I have tried all of these, and indeed they are very powerful and very efficient at simulating fear of heights and other phobias. I would like to add two other main fields of application. The first one is about trying to rehabilitate people who are sociopaths.
I work with researchers at a jail in Toronto who are trying to bring back empathy in sociopaths by showing them the kind of atrocities they inflicted on their victims, but inflicted on an avatar that looks exactly like them. They see the pain they caused as if it were done to themselves, and in this way the researchers are trying to reactivate empathy in their brains. There is similar research for sexual offenders as well. I don't know at what point they are, but you should know this kind of VR research exists too. The second field of application I would like to bring up is pain management. The idea is to immerse yourself instead of using chemical products to reduce the pain. I know there is the company Bliss in France, and there are similar initiatives here in Canada and in the US. People should be able to test it, especially for acts like lumbar punctures, because those are painful and doctors don't want to use too many drugs. And just so you know, this kind of application is practically effortless: you just put the headset on and they carry out the procedure; there is nothing complicated to explain or to set up. Yet this kind of initiative took nearly ten years to be validated and used in the medical procedure itself. So we can see that the path to getting something done and accessible to the public is very, very long. I don't know if you want to add something else. Yeah, actually I am curious to know what you guys think: most of the studies show that it's efficient, so I wonder why we are not seeing applications of this more widely, at the national level, in every country, because the devices are there and not that expensive. Is it more about regulations, or is there something else preventing this type of therapy from being more widely used? Well, I will jump right in, because this is a subject I wanted to bring up.
So I have some answers for you, because I read an article this week. It's not a great one, but it asks a good question, so I won't show it on screen. The main idea of the article is that, as you said in your introduction, when you talk about VR or the metaverse, people think of healthcare only in second position: the first field that comes to mind is industry, and then the medical field. And unfortunately, this field is the least advanced in mainstream VR use. The article gives three reasons why. The first is that most health professionals have no idea of the reality of immersive technology, of what it can do and what can be done with it; it's a lack of knowledge. The second is that after nearly 20 years of VR and lots of proofs of concept in healthcare, most of them were just not that good in quality or in substance. VR has advanced a lot over the past years, but practitioners are still thinking of what they saw maybe ten years ago: they think it's ugly and that you get sick when using it. So they are skeptical about the use of VR. And the third reason we don't see much VR in healthcare is money: it's really hard to get a research budget for this kind of experiment. There is a lot of pressure and a lot of results demanded, so many companies are not willing to go into this field, because it's really hard and you need a very solid financial base to get there. And as I said earlier, it takes six to ten years to get your product launched, because of all the research and all the approvals required just to bring a headset to someone who is getting an injection, which is a very simple act.
So I can't imagine what kind of process you have to endure to bring an AR or VR headset inside a surgery room, for example. I know that Magic Leap succeeded, but it took them six years just to get the approval, and they don't have any application yet: just to put the Magic Leap inside a surgery room took six years. So Seb, what's your point of view on this? I was going to say that part of my answer is in your explanation. For me, the validation time is so long because you have to prove that it really works: you have to measure everything, and design the study around how you are going to measure everything. So yes, it takes a lot of time. And after five years of back and forth between tests, implementation, and user feedback, the technology has meanwhile improved a lot, so you end up working with a five-year-old, very outdated VR device. I think that's the main reason things take five years in that domain. But it doesn't mean they are not investing in that area. Fabien? Yeah, I totally agree. I think this issue is valid for a lot of other technologies as well. Technologies are evolving very quickly, but regulatory and government institutions have a very slow approval process, so there is a disconnect between what can be done and what is actually used. We can see this with AI, which is growing exponentially while the regulatory side is only just starting, very slowly. This disconnect between speeds of execution is a problem I don't think we really have a good solution for, except advocating for it and hopefully making the regulatory people more aware of it. Yeah, maybe.
It's maybe a bit controversial, but maybe we should put some younger folks on the regulation side, people who understand that things are moving very fast and that some management practices are not what they used to be. On the development side we now have agile, very short development cycles, and that kind of management should be applied to the regulatory and government side as well. I'm not sure such short cycles are in use right now. I think in France they are. I'm sharing my screen right now; I don't know if you can show it. Sure. I saw that with the Magic Leap 2, which is not even widely available yet, they provided the device to labs and researchers in France, who already did training, started developing for it, and tested whether it could work inside an operating room. They ran a test really quickly this year and shared the video, and I think that's quite impressive: they did it really, really fast. And it's as simple as it looks: they just put the headset on and it's ready to go. Well, they had some media coverage and such, and I'm sure it's a prepared scenario and a training exercise, so it makes sense that it went smoothly, but they were very quick to develop that and make it work as soon as the first batch of devices was available. So the takeaway is that things seem to be moving very quickly in France. Well, to add to what I said earlier about quality: we know that rendering something realistic in healthcare, like in the video you just saw, is really hard, because it's soft tissue, and that kind of thing is very hard to render. I guess this reinforces the way health professionals look at VR: they keep thinking it's just a gimmick because it's not as realistic as they would like. And unfortunately, we are not there yet on the technical side of rendering.
We are still not able to render a physically realistic heart or any other organ. But if they can already work with something like what we just saw, I guess they should do it; we shouldn't wait for technology capable of making digital twins of our organs before they start working with it. Yeah, and I think the training is also largely about procedure: there are a lot of protocols to follow to make sure you are not contaminating the patient. So being able to really simulate everything, with the gloves, with the real tools you would be using, together with the AR headset, really makes sense. You can focus the simulation on exactly what you would have to do in the operating room to carry out the surgery. Okay, great. Do you have anything to add, Fabien, or do you want to continue with your topic, Seb? Yeah, I'm good; Seb, you can go on. Okay. So on my side, I have two subjects. The first is a really cool use case of putting a NeRF inside another NeRF: basically using one NeRF to get the camera position, adding a second NeRF inside it, and controlling the camera position inside that other NeRF to drive the rendering. Someone did exactly this, and I think the result is really awesome. They shot a video of their space, captured a couple of different environments the same way, and were able to make a really cool portal on their door. They pushed the concept a bit further by capturing a person with the same camera in one environment, then capturing the environment itself, and making the person appear huge in it; and also the reverse case, where you make yourself huge in the city, with consistent lighting, because everything was shot with the same camera, actually with a drone. So I don't know what you think about that. Your thoughts, Guillaume? Well, I think this is very cool, but I'm still questioning the use cases. The technical demonstration is awesome, and I'm very fond of NeRF technology.
But yeah, what's the point of this? Maybe, I don't know, if you want to invent some new way of filming or making television? What are your thoughts? I'll just answer your question with another question: do you have any idea what could be done with it? Well, as a cool effect, I would love to see something done in augmented reality with this kind of technique: feed the camera position to the NeRF in augmented reality, and do it live. It could be interesting for a museum, for example, when you want to step into another time by going through a door: on your phone you go back to another era and see, say, Versailles as it was, with actors who come in and perform a dance. And for this, do you see the user using an AR headset, a VR headset, or something else? A phone, at the beginning. So more on the entertainment side, with the infamous wow effect to get people to experience something new. I was thinking about the professional side, and I can't see any application of this right now, but for entertainment, why not? Yeah, I agree about the entertainment part. I think I mentioned a couple of weeks back that I'm really a movie fan, so if there were a VR experience that let me watch a movie where I can choose my point of view, that would be awesome. And if the technology gets cheap enough to use, maybe I could capture something and send it to my family, so they could really be immersed in that memory. So yeah, I'm really excited about this. I don't know if this technology is already capable of real time, but if it is, that's really cool. I'm not sure we are there yet, but that's the direction to look in. Well, when you are editing your NeRF, you can navigate around your point cloud in real time, but most of the time they record a camera path and render it offline to make it look better. So it's not real-time usage, to my knowledge, right now.
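The portal trick discussed above can be pictured as one tracked camera pose driving two radiance fields, with a transform mapping the pose into the second scene's frame (scaling that transform is what makes a person look huge or tiny). This is a minimal sketch under stated assumptions: `render_nerf` is a stand-in for whatever NeRF renderer is used, and the portal is just a 2D mask; none of these names are a real API.

```python
import numpy as np

# Sketch of the "NeRF inside a NeRF" portal effect: one camera pose
# renders two captured scenes, and a rigid/scale transform maps the
# pose into the second scene's coordinate frame.

def scale_transform(s: float) -> np.ndarray:
    """4x4 transform that scales the world by s (s < 1 makes the viewer 'huge')."""
    T = np.eye(4)
    T[:3, :3] *= s
    return T

def composite_portal(pose_a, portal_mask, render_nerf, nerf_a, nerf_b, to_b):
    """Render scene A, and scene B through the portal, from one camera pose."""
    img_a = render_nerf(nerf_a, pose_a)          # main environment
    img_b = render_nerf(nerf_b, to_b @ pose_a)   # same camera, mapped into B
    return np.where(portal_mask[..., None], img_b, img_a)

# Demo with stub renderers: scene A renders grey (0.5), scene B white (1.0).
render_stub = lambda nerf, pose: np.full((4, 4, 3), nerf)
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True                            # the "door" region
img = composite_portal(np.eye(4), mask, render_stub, 0.5, 1.0, scale_transform(0.1))
print(img[2, 2, 0], img[0, 0, 0])  # 1.0 0.5
```

The design point is simply that both renders share one pose, which is why the lighting and perspective stay consistent across the portal.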
So what you are saying, Fabien, is that this would be the next step beyond 360 video, a better version where you can interact and navigate as you like. Am I right? With 3D you can move around, yes. That would be exciting, with quite realistic rendering that you usually don't get unless you spend a lot of money working with a 3D artist. So can you confirm that right now NeRF technology is only about point clouds? From the first tests I saw, the meshing part of NeRF is not as advanced as they would like; they are working on it. Can you confirm that, or are we already at a meshing stage with something usable in VR? There is a meshing part, but right now you mostly use the direct rendering, with the lighting, that you get from the NeRF, so there is still information in that form. I've been doing some 3D scans lately, especially with an iPhone, and I know there is what they call the Tide model, where they adjust the photographs in a way that you can see. So right here they are adding cloth to the NeRF, and it interacts with the NeRF in real time. Even though the mesh you get is not high quality yet, you can still use it to do something nice inside the NeRF. The quality is good enough to run some physics on it. Exactly, so that's still interesting. I agree with you that it still needs to improve a lot to get a better result in terms of the mesh you get out of the NeRF. It's funny that we are running into the same issues we thought were solved a few years back, meshing for example. We know there are a lot of very efficient algorithms, but we can see that with this new application to point clouds, those algorithms are still not efficient enough. It's funny to see that we are going back to where we were at some point in time; there is still work to do on the meshing part that we thought was behind us.
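The meshing gap discussed here comes from what a NeRF actually stores: a density (and colour) field rather than surfaces. The cheapest export is therefore a point cloud, sampling the field on a grid and keeping the dense samples; turning those points into a clean mesh (marching cubes, Poisson reconstruction) is the step that still loses quality and lighting. A numpy-only sketch, where an analytic sphere stands in for a trained model's density query:

```python
import numpy as np

# Sketch: export a point cloud from a volumetric density field, the
# kind of field a NeRF represents. The analytic sphere below is a
# stand-in for querying a trained model; view-dependent lighting is
# lost at this stage, which is why extracted geometry looks worse
# than the NeRF's own renders.

def density(p: np.ndarray) -> np.ndarray:
    """Stand-in density query: solid unit sphere."""
    return np.clip(1.0 - np.linalg.norm(p, axis=-1), 0.0, None)

n = 32
axis = np.linspace(-1.0, 1.0, n)
grid = np.stack(np.meshgrid(axis, axis, axis, indexing="ij"), axis=-1)
pts = grid.reshape(-1, 3)
cloud = pts[density(pts) > 0.05]   # keep samples inside the surface

print(cloud.shape)  # (N, 3) array of kept sample positions
```

A mesh would then be reconstructed from this cloud (or directly from the density grid via marching cubes), which is exactly the step the discussion says is still lagging.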
So that's a good thing for people working on this subject. Yeah, and for the rendering, you basically lose all the lighting too, so you just get a mesh with a texture; that's definitely not perfect. My other subject is HTC's press release about their new inside-out tracker, which they should release in the third quarter of 2023, this year. I'm impressed by the number of accessories they have released: eye and face tracking for the Focus 3, a wrist tracker, and now this device. I also saw a question about this device, which works with a USB dongle, and you can wear five of them at once. I wonder how easy it will be to calibrate, because it's decorrelated from the headset's own inside-out tracking. Or will it only be compatible with the Focus 3, or will you have to play with the Lighthouse system? How do you manage the merge between the space you get from Lighthouse and this inside-out tracker? I don't know if you have thoughts about that. Well, it's a nice tracker. We talked about the mocopi as well; I know Fabien was willing to get one in Japan. For sure, if it's inside-out tracking, it should be better than purely accelerometer-and-gyroscope trackers. But I wonder about occlusion when you put it on your body. On a headset, the cameras' view is always clear and tracking works very efficiently; I don't know about wearing this on your arms or legs, because when you move around there will be some occlusion. I guess they have worked on those cases, and if it works, it could be great. But the main question here is the price. Do you have a price tag for this? Not yet. We know the first Vive trackers are around $200, and I guess these will be more expensive. I'm a bit afraid they will hit the $400 or $500 mark, which could be very counterproductive for them.
Because if you want full-body tracking with at least four of them, it gets very expensive. Fabien, what do you think? Yeah, I totally agree. I think the mocopi is something around $400, if I'm correct; it's in that price range. I wonder as well who their target customer is. The mocopi is expensive, but it's cheap enough for a VTuber to use, and VTubing is a very big market here in Japan; there is a lot of money in it. So I wonder, are they targeting more industrial clients, B2B, or B2C? I think we will have an answer when the price is revealed. You know the VRChat community is very eager to get this kind of tracker; right now I think they are the main buyers of Vive trackers, because they want their hands and feet tracked inside the VR world. So the VRChat community could be a very efficient market for them if the price tag is right. I was thinking more about the gaming industry, where they need to motion-capture more and more people and are still using reflective markers that you put all over the body, which makes it slow to set someone up. Something else interesting is that they kept the pin system from the Vive tracker, so you can have force feedback on them, for example, or attach them to any device; for industrial use that could be awesome. However, with the Quest Pro, inside the controllers you have the same kind of cameras filming the environment, and it works great; but when you first start an application in the headset, you have one or two minutes where you have to move the controllers around so they learn the environment and get correctly calibrated in space. Otherwise, they fly far away from where you actually are. And here, with five different trackers and no inside-out link to the headset, for example if you use them with the Quest Pro,
I really wonder how easy the calibration and setup will be for the user, or for the VTuber, to get going. So I wonder whether it will be a good product for that. We'll see. I guess you'll buy one of them if it's not too expensive, just to try. For sure. So I guess this is it for today. Do you have any final words or thoughts you would like to share? One take? Two takes? Three takes? Okay, so we are good for today. Thank you both for your topics, and we'll see you next week for the next episode of Lost in Immersion. Thank you, and have a good day.
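One footnote on the calibration question raised about the trackers: merging a self-tracked device's coordinate frame with the headset's frame is, at its core, estimating a rigid transform from corresponding position samples, which the classic Kabsch/SVD alignment solves. A numpy sketch of that math, not any vendor's actual calibration routine:

```python
import numpy as np

# Kabsch alignment: best-fit rotation R and translation t such that
# dst ≈ src @ R.T + t, from matched points observed in both tracking
# spaces (e.g. a tracker held at known spots relative to the headset).

def rigid_align(src: np.ndarray, dst: np.ndarray):
    """Return (R, t) minimizing ||(R @ src.T).T + t - dst|| over rigid motions."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, dst_c - R @ src_c
```

Once `R` and `t` are known, every pose reported in the tracker's space can be mapped into the headset's space, which is the "merge between the two spaces" the discussion asks about.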

Lost In Immersion

Episode #{{podcast.number}} — {{podcast.title}}



Credits

Podcast hosted by Guillaume Brincin, Fabien Le Guillarm, and Sébastien Spas.
Lost In Immersion © {{ year }}