Welcome to episode 35 of Lost in Immersion, your weekly 45-minute stream about innovation. As VR and AR veterans, we discuss the latest news from the immersive industry. Hello, guys. Hello. Hello. So, Fabien, do you want to start as usual? Yeah, please do.

So, my topic for today is the use of virtual reality at work, for remote work or at the workplace, especially for meetings. The article I'm sharing right now explores a survey that compared traditional video conferencing, Skype in this case, with virtual reality for meetings. They ran the survey last year and the results came out last week, I think. First, you see these numbers, which are pretty positive: an improvement on all the factors they measured, and especially a greater sense of what they call closeness to colleagues. I think what we can understand by that is that virtual reality puts everyone in the same space. It's a virtual 3D space, but it improves the sense of the others' presence during a meeting. One thing to note: the 15% that says "more comfortable" is not about physical comfort, it's about psychological comfort. Users found it easier to be in a VR meeting than in a standard Skype conference. That being said, you can see here a screenshot of a meeting. And if we look further down in the article, there is a very interesting point that I'm highlighting here, one they didn't put in the big numbers: they found an increase in meeting exhaustion for the participants who were using VR. As I said, VR is often more immersive than a traditional video conference, so maybe it's more engaging. In VR we need to be more focused, more attentive, and that may lead to more exhaustion after the meetings. So that's the output of this survey. I'm curious to know, first, whether you have experienced the same kind of positive and negative outcomes, and what you think about these results. Maybe we start with you, Seb.

Yeah, the exhaustion is interesting. I wonder if it's because it's a new way to communicate, so you don't have the same habits as you do with your computer when you share a screen or something like that. You have more complex operations to perform at first, when you don't know the technology. That might improve with the quality of the headsets too, being able to do hand interactions more fluidly. But yeah, that's interesting, and mostly that number, because either it's because you are more focused, or because you have more manipulations to do that are not as fluid as with a keyboard and sharing your screen on a projector, for example, which is something everyone is used to now and knows how to do. So I would be keen to have more information on what the reason for the exhaustion was.

Yeah, so maybe there is a learning curve for people in VR, and tools are needed for more fluid interaction. Maybe AI can help with that too. One thing I forgot to mention: the article says the benefits were higher in collective meetings, like workshops or stand-up meetings, but in one-to-one meetings people didn't find VR that much better than a standard video call. Guillaume, over to you.

Yeah, maybe I have some clue about the 20% increase.
If the picture you're showing represents the kind of meeting they were testing in their study, we can see a whiteboard and different boxes and so on. So we can assume the interaction was some kind of Minority Report style, with laser pointers and so on. And we know from experience that those kinds of interactions are quite exhausting, because you are not used to keeping your arm in the air for such a long time. We know that for any kind of use, whether VR or AR, working with your hands in the air is a bit of a fantasy. There is a study about this; I think the limit is around five minutes. Beyond five minutes your arms are just too heavy and it becomes very hard to work. So we can imagine that if they did this for a long period of time, of course they would experience more fatigue than at a desk with a mouse and keyboard. For this kind of work, typing for example, air typing is very demanding as well, and there is not much we can do about it. I think it's the same when you do collaborative work with other devices like touchscreens and all these types of interaction we are not used to: they require more movement, basically more physical activity than we do on a daily basis. So it's not shocking to see this 20% increase in exhaustion, people simply being more tired using VR and AR for collaborative work.

But I clearly understand why this is an improvement over what we use today with Zoom or Teams. Once you have a large audience, and I guess there are maybe ten people in this meeting, we all know it can be a mess, or you are not as integrated in the meeting as you would like to be, because some people talk louder, you don't really know what the others are doing, and usually only one person has control of the screen. So of course this kind of setup would be better than what we do now. The last point is about the avatars: these look like very simple avatars. By not having realistic avatars, or at least avatars that are easy to recognize, you have to concentrate more to project, to visualize the person you are speaking to, and that may be another reason people get tired. But overall, I think we will see more and more studies like this, because of the spatial computing approach and the idea that at some point we won't need screens anymore and will use mixed reality, AR or VR, as a daily tool for work.
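As a rough illustration of the arm-fatigue point in Guillaume's remarks above (the "arms too heavy after about five minutes" limit), here is a minimal back-of-the-envelope sketch in Python of the static torque the shoulder has to hold during mid-air interaction. The segment masses, lengths, and controller weight are assumed averages for illustration, not figures from the survey discussed in this episode.

```python
# Rough back-of-the-envelope estimate of the static shoulder torque when
# holding an arm straight out for mid-air ("Minority Report") interaction.
# The segment masses and lengths below are assumed averages, not measured data.

G = 9.81  # gravitational acceleration, m/s^2

# Approximate adult arm segments: (mass in kg, distance of the segment's
# center of mass from the shoulder in m) -- illustrative assumptions.
segments = {
    "upper_arm": (2.1, 0.15),
    "forearm":   (1.2, 0.45),
    "hand":      (0.5, 0.65),
}

def shoulder_torque(segments, controller_mass=0.15, reach=0.7):
    """Torque (N*m) the shoulder must hold with the arm extended horizontally."""
    torque = sum(m * G * d for m, d in segments.values())
    torque += controller_mass * G * reach  # lightweight controller held at the hand
    return torque

if __name__ == "__main__":
    t = shoulder_torque(segments)
    print(f"Static shoulder torque: {t:.1f} N*m held continuously")
    # On the order of 10 N*m sustained with no rest, which is consistent with
    # mid-air interaction becoming uncomfortable after only a few minutes.
```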
One thing I wanted to mention, and you touched on it: not seeing the facial expressions of the person you're talking to means you miss a lot of the intent behind what they are saying. Even if you have the gestures, losing that part of the interaction is a lot. Until avatars mirror the user's real expressions, it's also, like you said, more work on the user's side to interpret and to focus to really understand what is being said. I also tried the desktop interaction, where you can see your screen and use your mouse and keyboard. It's starting to look okay, but with the distortion it's not usable for a long time. At least you get a nice interaction, the one you are used to, and sharing your screen and doing things is easier. So keeping the keyboard and mouse interaction people are used to is an improvement, and it works great for this kind of use, where, like you said, you don't want to take a post-it, type it on a virtual keyboard, and place it in space. That's exhausting. That's, I guess, what the Vision Pro is trying to bring.

Yeah, I think that's very interesting. A few things I forgot to mention: the meetings lasted 45 minutes, which is quite long for this kind of meeting in VR. And as you mentioned, Guillaume, if the hands always need to be in front of you to drag something or to type, 45 minutes is a very long time. Okay, perfect. So if you want to move on to your next topic, Seb.

Sure. Today I wanted to talk about the announcement and release of the new Xreal Air 2 and Air 2 Pro glasses. These are glasses that are not 6DoF, only 3DoF: there are no cameras, no computer vision on board. They only track your head orientation, and they let you plug in your computer, your phone, or a gaming console to play games and view content in space, like on a huge screen in front of you. What's interesting is that I tried the first one, and it was not that comfortable, quite heavy; they have made huge progress on the size and form factor to make it work better, and they adjusted the screen quality. So it seems to be getting more comfortable to use, and you can customize it. Something I liked, and I will try to find it in the video, is that you can dim the lenses to put you more in a VR-like space, isolate you from your surroundings, and even hide what you are looking at so nobody can see your screen. So it's not really an AR device, even though there is the Nebula app on your phone that you can launch to try some 3DoF augmented reality experiences, a bit like the Oculus Go back in the day: things displayed at a certain distance from you, and you can point with your phone and use it to interact with them. I haven't seen much done with that yet. I know I tried the development kit and SDK to build something on it, and it was not working that well, and it seems to still be the case; there seems to be a lot of improvement to do on the software side. But what is interesting is that the form factor is starting to look like real glasses, where you don't even see that there is a screen inside, and the sound comes from the temples. I think what's missing now is the AI piece, like what Meta did with the Ray-Ban glasses. It could start to be interesting to see interaction with AI: have an AI tool adjust what you see on the screen, get information about what's around you, and display it directly on the screen instead of just playing audio in your ear. So, yeah, what are your thoughts on that, guys?

Yeah, I think it's very interesting to see the evolution of the product, as you mentioned, the form factor, and I guess the resolution might have been updated as well. Actually, apart from the Quest, I haven't tested this kind of glasses yet, so I'm really curious to know how it feels to watch a movie for one or two hours, for example, whether it's really comfortable, and whether the screen size these glasses allow really improves the experience.
But, yeah, except for the audio, which I'm not sure how it would behave on a plane, I guess watching a nice movie on a plane could be a very good experience. Maybe they could partner with airlines to offer glasses on board. And one question I have: they also have this small device that can be connected. Do you know what this device is for?

Yeah, it's what enables the spatial display. When you plug your phone into it, it acts like an external processor that does the spatial positioning and lets you anchor the screen in space, so you can look around without the screen following your head direction; it's really anchored in place, with better quality than what you get if you plug the glasses directly into your phone. The same goes for games and so on. The screen can be anchored in your space, or it can follow you with a smooth follow, or sit in a side view, so if you want a tutorial on the side of your screen, or if you want to walk and still watch a video, you can keep it anchored, small, in the corner of your vision. That's what it enables.

Okay, interesting. Thank you.
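To make the screen-placement modes Seb describes more concrete, here is a minimal conceptual sketch in Python of head-locked, body-anchored, and smooth-follow placement computed from a yaw-only (3DoF) head pose. This is not the Xreal SDK or the Beam's actual implementation; the screen distance and smoothing factor are assumptions, and the model is deliberately reduced to yaw.

```python
import numpy as np

# Conceptual sketch of three screen-placement modes for a 3DoF device:
# head-locked, world/"body" anchor, and smooth follow. Yaw-only model:
# the screen is treated as a point at a fixed distance in front of the user.

SCREEN_DISTANCE = 3.0  # meters in front of the user (assumed)

def head_locked(head_yaw):
    """Screen always straight ahead: it turns with every head movement."""
    return np.array([np.sin(head_yaw), np.cos(head_yaw)]) * SCREEN_DISTANCE

def body_anchor(anchor_yaw):
    """Screen fixed at the yaw captured when the user anchored it; turning the
    head only changes which part of it falls inside the field of view."""
    return np.array([np.sin(anchor_yaw), np.cos(anchor_yaw)]) * SCREEN_DISTANCE

def smooth_follow(current_screen_yaw, head_yaw, smoothing=0.05):
    """Screen lazily drifts toward the head direction instead of snapping."""
    # shortest angular difference, wrapped to [-pi, pi]
    delta = (head_yaw - current_screen_yaw + np.pi) % (2 * np.pi) - np.pi
    return current_screen_yaw + smoothing * delta

if __name__ == "__main__":
    anchor = 0.0       # screen anchored straight ahead at the start
    screen_yaw = 0.0
    for head_yaw in np.linspace(0.0, np.pi / 3, 5):  # user slowly turns 60 degrees
        screen_yaw = smooth_follow(screen_yaw, head_yaw)
        print(f"head {np.degrees(head_yaw):5.1f} deg | "
              f"locked {head_locked(head_yaw).round(2)} | "
              f"anchored {body_anchor(anchor).round(2)} | "
              f"smooth-follow yaw {np.degrees(screen_yaw):5.1f} deg")
```

The practical point is the one made in the discussion that follows: a body-anchored screen stays world-fixed so head motion feels natural, while a head-locked screen drags the content with every movement, which many people find uncomfortable.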
Yeah, a few things about this. First of all, I don't find their communication very honest, to be completely clear with you guys. It's really hard to find out the resolution of the glasses. If you look at their website and their overall communication, they always say it's the best resolution available on the market, thanks to the latest Sony screen they are using. When you do a little research, you find out it's only full HD. So not a very high resolution for our time; most products are now close to 4K. That's the first thing. The field of view is still 46 degrees, so basically the same as the HoloLens 1; once again, not a very big field of view. And then, of course, they improved the form factor, but you can see here what they call the Beam bundle: basically they took all the computing power and put it in this little box so they could have this lighter glasses form factor. Without it, though, you can't do much, because for me the best use case for these glasses is what they call the body anchor, where the screen is fixed in front of you. Having tested this kind of glasses, a screen that follows you around when you move your head causes a lot of discomfort; I'm not very fond of it. So for me, the best way to use this kind of smart glasses is the body anchor, and if you want that feature, you have to use the Beam bundle; it's not supported without it. The last thing is the refresh rate. They present it as 120 Hz, but that only holds when you are not using any kind of body anchor. Once you activate that feature, you drop to 75 Hz, which is not ideal for most games, or for some movies when there is a lot of motion. So basically, for me, the best feature here is the electrochromic dimming that lets you make the lenses darker, with three different levels. I found that very cool. But overall, these glasses are still, for me, something of a gadget at this point. Because without the body anchor, and with that refresh rate, that resolution, and that field of view, we are very far from what we see now on the Quest 3 or other VR and mixed reality headsets. Giving up all that latest innovation just for the form factor, yeah, the step is too big for me. Okay, it's cool, it's light, it looks like glasses, but the overall experience is not very good. And given the price, if you want the glasses plus the Beam bundle, it's more expensive than a Quest 3, so you really have to make your choice. However, I don't really know if it's only a marketing or communication strategy, because they are claiming they are more or less sold out, or at least that there are a lot of orders for the glasses, and because of that they are extending the pre-order window. You can read that two ways: either they really have a lot of orders and want to take full advantage of it, or the orders are not that high and they have to extend the window to reach their goals. I don't really know which one it is. What we can see, though, is that these smart glasses are becoming something, because a lot of manufacturers are making them. The Xiaomi glasses that were announced last year are still not available at this point, so I don't know whether they simply cancelled the project or are having issues with some hardware components.

Yeah, I'm really excited about the Xiaomi one as well. It seems like they have tracking, so it would be a real 6DoF device. About being sold out, or whether it's a marketing strategy, we don't know indeed. One thing I witnessed: I went to an event here, an XR event with a lot of companies. There were Meta, Lenovo, Lynx, this kind of hardware company, and the booth with the most people was Xreal's. So it seems that either they have really good marketing and people are attracted by it, or it's a device that looks less intimidating than a full headset. Maybe people think it's easier to use because it's lighter, and just that perception of being easier to use makes it more of a first purchase a user might make more readily than buying a Quest. I don't know, that's just my guess, but I'm curious to hear what you think.

Yeah, first, I agree with everything you said, Guillaume. I tried the glasses several times and I don't feel like it's a nice experience. Even the body anchor, the way it's done right now: it's only anchored in space in a certain direction around you, but when you move, it moves with you; it doesn't stay in place. After a while it feels very weird, because that's not something you are used to in your real environment. The screen is very small, and if the glasses are not perfectly adjusted to your eye position, you almost don't see anything. So, like you said, if the field of view is the same as the HoloLens 1, I don't see how usable that is, particularly with this HD-only quality. Like you said, the Beam is what is cool on this one, the way they integrated it; that's really cool. And having seen their booth at CES, what is really nice in the way they demo it is that they completely customize the rooms with pictures on the walls, so the content makes more sense with what they display. But that fakes a bit what the usage would be in a normal environment. So, yeah, that might be why.
And like you said, they are really focusing on making something cool, a nice booth, and glasses that look good on you because they are close to Ray-Ban glasses. That's their way of communicating around it and getting a lot of people's attention. I think that's it for that subject.

Okay. So, last topic. Just before we start with it, a quick piece of news about Meta: they are still losing a lot of money because of Reality Labs, but apparently Mark Zuckerberg is not shocked by this. Just to mention it: overall they have lost around 21 billion since last year, so it's starting to stack up, and I didn't see much of a market reaction to it. As we said last week, Mark Zuckerberg is still maybe the most convinced person that VR will become something in the, hopefully, coming months or years. So this is about passion and conviction at this point, maybe not so much about economic strategy.

And so, my last topic, something I found very interesting: a new study just came out showing that you can generate sensations on the skin without heavy hardware, using only visual effects and cool air blowing on the skin; your brain does the rest. Why am I talking about this? The research and the paper are great on their own, but it's mostly a reminder that our best ally for generating sensations, or haptic effects, is the brain itself. We know that, for locomotion for example, you don't need a treadmill or very complex hardware to simulate long runs or long walks. There are studies showing you can steer users into walking around in a circle while they have the sensation of walking in a straight line. I don't think we're working enough on these psychological, brain-tricking effects; we could do more. You can simulate haptics the same way: if you provide the visuals plus some mild external stimulus, nothing heavy, you can generate very strong effects. So this is more of a reminder for us to work in that direction. I don't know if you guys have already tried these. I personally tried the walking-in-a-circle-as-a-straight-line one, or just having the sensation that you are walking a very long distance, or that the environment you're immersed in is way bigger than the room you're in. Once you take the headset off, you're shocked to see that you were in a 10-by-10 space the whole time.

Yeah, I did an escape game in virtual reality. We put the headsets on before entering the room, so we didn't know the size of the room. The scenario was set in a space station, so I was walking around and taking an elevator, and as you said, I really had the feeling I was walking through a full-sized space station. Actually, in the end, it was a 4-by-4 room; they just designed the movement in such a clever way. I think the elevator was done by walking into a specific area of the room, with a subwoofer under the floor: when the elevator was activated, the subwoofer was activated too, giving the feeling of movement. So, yeah, I totally agree, this is not used enough, and the capabilities are actually amazing.

Is it Eclipse you're talking about?

Maybe, I forgot the name.

Okay, because I think I did the same experience, so I know what you're talking about. The elevator is the part that tricks the brain the most, because you really feel you are going up, and it's just a small vibration of the floor.
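The walk-in-a-circle-while-perceiving-a-straight-line trick mentioned above is usually called redirected walking. Below is a minimal conceptual sketch in Python of the basic curvature-gain idea: the virtual scene is rotated slightly around the user as they walk, so they unconsciously compensate and bend their physical path into a circle while the virtual path stays straight. The 22 m radius is the detection threshold commonly cited in the redirected-walking literature; small spaces like the 4-by-4 escape room rely on additional tricks (resets, impossible spaces, clever level design) rather than curvature gain alone, and everything else here is an illustrative assumption.

```python
import math

# Conceptual sketch of redirected walking with a curvature gain: while the user
# walks, the virtual world is rotated around them by a small amount each step.
# They unconsciously compensate by turning their body, so the physical path
# bends into a circle while the virtual path stays perfectly straight.

CURVATURE_RADIUS = 22.0  # meters: commonly cited detection-threshold radius

def physical_path(virtual_distance, step=0.5):
    """Physical positions of a user who walks `virtual_distance` m straight in VR."""
    x, y, heading = 0.0, 0.0, 0.0   # physical position (m) and heading (rad)
    path = [(x, y)]
    for _ in range(int(virtual_distance / step)):
        heading += step / CURVATURE_RADIUS  # injected rotation, compensated by the user
        x += step * math.sin(heading)
        y += step * math.cos(heading)
        path.append((x, y))
    return path

if __name__ == "__main__":
    virtual_distance = 2 * math.pi * CURVATURE_RADIUS  # roughly one full physical circle
    end_x, end_y = physical_path(virtual_distance)[-1]
    print(f"Virtual walk: {virtual_distance:.0f} m in a straight line")
    print(f"Physical end point: {math.hypot(end_x, end_y):.1f} m from the start")
    print(f"Physical footprint: a circle about {2 * CURVATURE_RADIUS:.0f} m across")
```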
I had the same experience at The Void, with Ghostbusters in New York. There was a part where you went outside, onto those famous stairs on the outside of a New York building, and at one point the stairs gave way, and they also faked that with the floor. All of that works really well. Talking about that, my brain once played a trick on me with odor, with a scent being released, at Atoropa Park when we worked on that experience. I was doing a lot of tests and running through the experience many times, and at one point I felt like the scent was being blown into the space, because we had particles displayed at the same time. It was actually not working at that point, but my brain was convinced I smelled vanilla and so on, because I had done the experience so many times; I think my brain was just filling the smell back in.

So, yeah, interesting. Yeah, the scent was associated with the visuals and with your previous runs of the experience.

Yeah, because the scenario was exactly the same at exactly the same moment. Very interesting. I only realized at the end, when I took off the glasses and put my hand where the air was supposed to blow, that nothing was being blown into the room. So, yeah, that was an interesting feeling. But about the experience you were talking about, Guillaume, with the room: I never did that part of Eclipse, I only did a small demo, so I was not able to walk around the spaceship like you did, Fabien, but I know the space was, like you said, 4 by 4, and there was the vibration in the floor, so I think it's really the same experience. I would really like to see how it feels covering more distance and then realizing I was just going around in circles in my own space; I would really like to experience that. So, if you have a reference for games that do that.

Yeah, I did that experience at SIGGRAPH two years ago. But you're right about The Void: we see those venues popping up, and I'm always very impressed by their size, because now they are in big malls in the US, and they are very small shops compared to the others, yet they can simulate a very large experience. They have mastered this effect of putting you in a kind of little maze and simulating a very long walk. Very impressive. I guess the first Void was more like a warehouse, and now they are very compact, small venues, which I guess is better for their return on investment as well. Very good technology. I don't know what they are doing now; I stopped at the Star Wars and Ghostbusters experiences, and they are not communicating much, maybe because it just works as expected. I know we have two or three of what they call VR arcades here in Quebec City, where you can experience VR collaboratively. In the first years they had those little boxes where you could experience VR on a single-user basis, but now they are switching completely to collaborative apps and games: the space is bigger, they put four to five people in at the same time, and people collaborate in VR. That's their most-used offering at the moment. So it's very interesting to see this switch to collaborative VR, and once again, it's a small space simulating a bigger one.

Okay, guys, do you have anything more to add today? No, I'm good. Okay, so that's a wrap. Next week we'll probably have some more Gaussian splatting, so keep following our Lost in Immersion podcast, and see you next week. See you, bye. See you.

Credits

Podcast hosted by Guillaume Brincin, Fabien Le Guillarm, and Sébastien Spas.