Welcome to episode 32 of Lost in Immersion, your weekly 45-minute stream about innovation. As VR and AR veterans, we will discuss the latest news of the immersive industry. So let's go, Fabien, I guess you have some fresh news from today.

Yes, hello. Today is the release of the new Meta Quest 3. I received mine this morning and was able to do a few tests, so I took some notes on my first impressions, which come from maybe one hour of total usage. Over the next weeks we will test it more and see its full capabilities.

First, the size. It's a smaller form factor compared to the Quest 2, and the weight is about the same, maybe a bit heavier, so the comfort is similar to the Quest 2. The straps are a bit different, so the support at the back of the head is better. I have fairly small glasses and I'm able to use the headset with them, which is pretty good for me.

Then the onboarding itself. When you power up the device, there is a process to connect it to the Wi-Fi that is actually quite easy. Through the mobile app you can sync the device pretty easily; it was actually pretty fast. Then there were a few updates, but they were easy to install.

Something interesting, and I think similar to what the Apple Vision Pro will do: when you put it on, you are in mixed reality. It's not like the Quest 2, where the default view was VR. Here, everything already happens in pass-through. Speaking of which, the pass-through is not as good as I expected. There are distortions, especially with objects that are close to the device. If I look closely at my screen, for example, I clearly see distortions on its edges, and in some specific situations, like looking at the corner of the screen with a wall behind it, I could see distortions as well. But after that first disappointment, it wasn't so bad, actually. It was quite easy to use.
I could stand up and go anywhere; the scale is good. I tried putting my hand in front of me and removing it, and the alignment is almost perfect. The size is right. I can grab my phone and look at it; even if the color is not perfect, I can read it. Not super easily, but I can read on the phone. So a bit disappointed, I would say, especially by the color and the distortions.

Then, once the onboarding is done, there is a mixed reality game. You are asked to stand up and scan your room. This is done automatically: you just look around and the boundaries of the room are detected. It didn't automatically detect tables, but there is a very easy process to put bounding boxes around tables and furniture. It's a very nice game, basically a tower defense. Furry balls come in and you have to capture them. You can actually destroy your walls and see the environment behind them. That was really fun. Maybe I'm biased because I really like mixed reality, but it was a really fun game, and it's very clever of Meta to put that game right after the onboarding, so users can immediately try a mixed reality experience.

The tracking itself. There is a depth camera, and the tracking is really good. I tried moving very fast and the 3D objects were almost not moving. I then tried in a sunny room and could see a bit of jitter when I moved very fast, but when a 3D object is placed somewhere, it stays there. I'm pretty happy about this. The quality of the 3D rendering in VR is also very good. The lenses are very big. I forget the exact numbers for resolution and field of view, but there's no comparison with the Quest 2; it's a really great upgrade in terms of quality.

The controllers. They don't have the tracking ring around them, and we were worried about the quality of the controller tracking if you move fast. I tried Beat Saber, which is a very fast-paced game.
I didn't have any issues. I think we need more extensive testing to be really sure, but overall it's quite good. It took me a few seconds to realize that in mixed reality they overlay a 3D model of the controller on top of the real controller, so I was not looking at the actual controllers but at 3D models. Maybe it's an option I can deactivate, I'm not sure, but it was a bit disturbing. I think that's basically it, a very short review. What do you think? Let me know if you have any questions or things I may have forgotten to talk about.

On my side, I think I will buy one this afternoon. I was waiting for it to be released to go to the shop and buy it, and having your feedback makes me want one. I'm just wondering about the weight of the controllers compared to the Quest 2 version. Also, you said you did not realize that a 3D controller was displayed on top of the real one in the mixed reality environment. Was your hand displayed on top, or were there 3D gloves over your hand?

No, it was behind. Guillaume, you mentioned last week that in the mixed reality view the occlusions are not there. I can confirm that. But in the game, the occlusions were there. Maybe there is a process of scanning the room after which the occlusions work; I didn't get that far yet.

Was the process of scanning the room embedded in the game or in the headset onboarding?

It was in the game, but maybe it's a feature of the Meta SDK, I guess. About the weight, the new controllers are lighter; the Quest 2 ones are just a bit heavier.

For my part, did you feel any kind of sickness or discomfort from the distortion? When we saw some footage of it, we could see the whole scene looking wiggly and wavy. I immediately wondered how you would feel about this, or once you are in the game, does it not matter?

I think if I had to wear it like here at my desk, seeing the distortions around the screen and around the desk, yes.
I would get discomfort pretty fast. But you're correct: once I was in the game, I almost forgot about the distortions because the game was really engaging and really well done. And especially, I didn't have anything very close to me; I was in the middle of the room. So I guess the distortions are smaller for objects that are further away.

I guess you tried it as a standalone headset. First question: what about the battery life, how did you find it? And then, is there a cable like there was with the Meta Link, or whatever it's called? Can you plug it into your laptop or PC for more powerful features?

About those two questions, I don't know. I don't know if there is a Link cable; that's something we can research. And about the battery, because I was quite busy with work this afternoon, I just used it for one hour and I didn't have any issues.

Just to know: during your one hour, did the battery go from 100 to 50? Did it decrease very fast, or did nothing shock you?

No. I can check the battery level real quick.

While you are checking, I guess the articles we saw about the distortion, and about the cameras not being aligned with your eyes, were quite right. So once again, will Apple be able to correct that with their Apple Vision Pro?

It's a great question. A huge question mark on this one. I'm at 75% now, so 25% through one hour of onboarding. I was not playing all the time, so we'll have to check that.

Okay. That's pretty much all the questions I had. So overall, a great experience, despite the fact that the mixed reality is not as good as it was advertised. The big question now is: can we work with that? Or is it just an improvement of the Guardian and maybe some fun games, but not something for professional use? I guess you'll have to test it further to give a better review on this. Yeah.
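The battery figure mentioned above (25% used in about one hour) is just a linear extrapolation. A minimal sketch of that back-of-the-envelope calculation, assuming a constant drain rate, which real usage (gaming vs. onboarding, brightness, pass-through) will not follow:

```python
# Linear extrapolation of battery life from an observed drain.
# Assumes a constant drain rate, so treat the result as a rough
# ballpark, not a measured battery specification.

def estimated_runtime_hours(percent_used: float, hours_elapsed: float) -> float:
    """Extrapolate total runtime on a full charge from one observation."""
    drain_per_hour = percent_used / hours_elapsed  # % of battery per hour
    return 100.0 / drain_per_hour

# The observation from the episode: 25% used in roughly one hour.
print(estimated_runtime_hours(25.0, 1.0))  # → 4.0 hours on a full charge
```

Four hours would be an optimistic ceiling here, since the observed hour was mostly onboarding rather than sustained gameplay.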
Something that I want to test more is the persistence and quality of the tracking, especially, as you said, for professional applications. If I put an object at an exact position, does it stay there? And do the distortions really impact the experience itself?

Yeah, that's the first test I want to do as well. I also want to try the distance and the size of the space you can map, to see how big an environment we can work with on that kind of headset. Do you know if there is some kind of persistence for, as you said, the bounding boxes you created to map your environment? I guess you didn't have the time to check, but do those bounding boxes stay there, or do we have to place them every time we use the headset?

I'm not completely sure, but when I did the calibration in the other room and then came back here and put the headset on, it said: you are not in the room that you scanned. I went back, and I didn't have to scan again.

Yeah, normally they offer anchoring; that should really be embedded in the SDK. You should also be able to scan multiple rooms and position anchors in all of them, and when the headset recognizes a room, it can reload it and know which room it is in. That's also something I want to try soon. The HoloLens 1 and 2 were really great at that without a depth sensor, so now that they've added that sensor to the Quest 3, I really hope this is going to work well. We'll see.

Okay, great. Maybe we'll have some other questions during the stream; I guess we'll keep reflecting on this. But Seb, can you present your topics as well?

Sure. This week I wanted to talk again about 3D Gaussian splatting, which keeps improving, and we see more and more being done in Unity now, with rendering in real time.
Here the guy is comparing the NeRF rendering on top with the Unity 3D Gaussian splatting rendering, and the input for these assets is only pictures, which is quite amazing. This is the first one I wanted to show: look at the reflections on the glasses and the imperfect materials you can see in it, things that would normally be done with normal maps, reworked again and again to make sure they correspond to reality. Here, everything comes from multiple pictures that are processed by software that generates a point cloud, and on top of that point cloud there is now a render system that lets the 3D Gaussian splatting algorithm render the scene amazingly well.

There is another one I found interesting too, where they use 3D Gaussian splatting but also modify the rendering of each point to change its physics and its color, to fake a burning effect that looks pretty cool to me. And again, in real time in Unity. I think the scene is quite simple, and when they move you can see that the point cloud is not updating right away; it takes a bit of time to re-render everything correctly. So that was my subject. I don't know if you want to react to it.

Yeah, I'm glad to see that Gaussian splatting is still going strong. We did an episode when you were on holiday, or working I guess, where I gave a review of this. I'm still working on it, trying to find the solution that gives the best result with the smallest amount of capture time. As I mentioned, I'm always questioning how they can get such beautiful results, because they must have a very heavy and powerful setup: every time I try to process a huge dataset for Gaussian splatting, I'm quickly limited, and I don't have a lot of memory on my graphics cards.
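The memory pressure described here is easy to quantify. A rough sketch, assuming the parameterization used by the original 3D Gaussian splatting implementation (3 position + 4 rotation quaternion + 3 scale + 1 opacity + 48 degree-3 spherical-harmonics color coefficients = 59 float32 values per Gaussian); training needs several times more than this raw storage for gradients, optimizer state, and rasterization buffers:

```python
# Back-of-the-envelope VRAM estimate for a 3D Gaussian splatting scene.
# Per-Gaussian layout assumed from the reference 3DGS implementation:
#   3 (position) + 4 (rotation) + 3 (scale) + 1 (opacity)
#   + 48 (SH color: 16 coefficients x 3 channels) = 59 float32 values.

FLOATS_PER_GAUSSIAN = 3 + 4 + 3 + 1 + 48  # = 59
BYTES_PER_FLOAT32 = 4

def scene_size_gb(num_gaussians: int) -> float:
    """Raw parameter storage for a splat scene, in gigabytes."""
    return num_gaussians * FLOATS_PER_GAUSSIAN * BYTES_PER_FLOAT32 / 1e9

# Large captures can easily reach millions of Gaussians:
for n in (1_000_000, 5_000_000, 20_000_000):
    print(f"{n:>10,} gaussians: {scene_size_gb(n):5.2f} GB (parameters only)")
```

Even before training overhead, a twenty-million-splat scene is several gigabytes of parameters alone, which is why consumer cards with modest VRAM hit a wall on big datasets.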
So now I'm upgrading my setup to better RTX cards, and I hope I'll get better results as well. One thing that is very interesting in the community of people using 3D Gaussian splatting is that a lot of newcomers are working with old graphics cards like the P100. Those are Tesla cards dedicated to compute, and they are very cheap; you can get one for $500 or less. The main issue is that they don't have any fans, so the community is trying to find ways to keep them cool over time. It's very fun to see. Why are they using those cards? Because they have a lot of memory, around 16 GB, and that is very interesting for this kind of AI computation, and especially for 3D Gaussian splatting. So I'll see what I do with this, but you can see that people are looking for the cheapest way to get a lot of memory to work with artificial intelligence. It's very interesting. So yeah, I'm very glad to see that everything is improving at a very fast pace, and hopefully we'll get much better results and more integrated applications as well.

Yeah, I don't have a lot to say on this, but it's interesting to see a technology adopted this fast by the community. It's been, I don't know, maybe a month, and there are already plugins, applications, and optimizations everywhere. That's very interesting. And I'm curious to see how long it will take for a viewer to be available on mobile or on the web; that could change a lot of things.

Right, I think they already did some tests on VR devices, so I guess on mobile that should come to augmented reality too. I think I saw a demo with NeRF, not with 3D Gaussian splatting yet, but I guess it's definitely coming for the web. I think Luma AI has now switched from NeRF to 3D Gaussian splatting as an option too.
So it's already available with an online viewer where you can check your 3D model, then download it and do whatever you want with it. But I'm not sure there is any other functionality on top of that, like VR support on a web page or XR availability. We'll have to check Luma AI to see where that stands.

Right. The other project I wanted to talk about is this announcement from CREAL. They revealed their new system a couple of months ago. Those guys are trying to make glasses that display holographic 3D models in front of you and let the user's eyes focus at different distances, to really mimic the way your own eyes work. Here they are showing that the lenses are ready: they are starting to sell them in early 2024 for integration into other devices, so it will be up to manufacturers to embed the technology into their own headsets.

One thing I'm wondering about: they show really nice use cases where the focus adapts in real time with their glasses, but they are not yet showing how they will track the user's eyes, and how the adaptation will really happen in real time, with a perfect match between where the user is looking and what needs to be in focus. I'm really wondering how that will be embedded. Are they planning to provide that too, or only the capability of the optics, leaving it to the manufacturer to work out the eye position, where you are looking, and where you are focusing? But yeah, it's a nice improvement, and I would like to see them at an event to check how it really looks inside the glasses and the rendering quality they obtain. I don't know if you saw that news, or if you have any feedback or thoughts about it. Fabien?

Yeah, as you say, I'm curious as well to know how they track the eyes.
Because in their diagrams they don't even show the different modules on the glasses; they only talk about the display. So I don't know if we are missing something here. But yeah, it could be very nice, and I'm curious to see how VR headset companies will adopt this.

Yeah, maybe they'll face a challenge there, getting collaborations with Meta and the others who would want this in their devices. Very cool to see progress there.

Two things. First, we can see that optical see-through is not dead, which is a great thing, because we know most manufacturers are now working with video see-through, like Meta and the Apple Vision Pro. At some point we could have thought this kind of technology is just too hard to do, because since the HoloLens 1 we haven't had much improvement, apart from slowly gaining field of view. There was no huge step from HoloLens 1 to 2, or from Magic Leap 1 to 2; it's better, the form factor is better, but those are small improvements, not groundbreaking ones. This one could be groundbreaking.

However, as you mentioned, from the first video where they showed this technology to this new one, they are never moving their heads around; there is no tracking of the environment in their device presentation. So there are maybe two ways of thinking about their strategy. First, they can provide the technology to other manufacturers. Or maybe they are targeting the assisted-reality market with smart glasses, because there is no tracking here. If you want the small form factor they are presenting now, tracking would have to be added externally, especially LiDARs or depth sensors and so on, because with those your design becomes much bulkier than what they are showing.
So I don't know what their goal really is. But the technology is great, and once again they prove they are making it better month after month; we can see the evolution here.

Yep, that's it for me. Same reflection as you.

Okay, so I will conclude with my topics as well. A few small pieces of news, really. The first one is about the Meta Quest 3: apparently the sales numbers are not as good as Meta was expecting. What seems to be going on is that the buyers are basically existing Quest users; there aren't newcomers to the VR community, it's mainly the same numbers as before. So in one way, the strategy of putting mixed reality front and center is not working as well as they wanted. We could argue that their marketing strategy for announcing the headset was not on point either. We discussed it: there was no big announcement apart from Meta Connect, and for mainstream users this information is absolutely not top of mind, because there was no mainstream communication. So I guess it's not that surprising that they don't have better numbers: they were targeting the community itself with the announcement, so only specialists and people already using VR were looking for more info and getting the headsets. Maybe in the coming months, when developers bring out new games and applications, those numbers will go up. We'll see.

However, Meta is already announcing that they are again laying off employees, this time from their silicon unit, the unit that was supposed to create chips for the headsets to make the metaverse something real.
So we can see that this dedicated metaverse team is slowly shrinking. We'll see if the initiative disappears completely in favor of VR, mixed reality, or spatial computing, as they are calling it now. But the communication here is, once again, a bit weird: you release a new headset, and a few days later you announce that you are laying off employees dedicated to the metaverse. A very strange strategy; they could have waited a month or two to announce that.

And the other small piece of news I have is about Immersed. We talked about them before; they make glasses dedicated to working in mixed reality. There were two different models, one cheaper than the other, and looking at the orders, 96% were for the most expensive one, so they simply cancelled the cheaper version. Why am I talking about this? Because it shows that if your product is on top, if it answers the demand, the price doesn't really matter. It could be an argument for the Apple Vision Pro, once again: if your technology is mature enough, if you have something new to propose and people are excited about it, then the price is not really the problem. People are willing to get the device despite the price. So, what do you have to say about this?

It's very interesting news, I think. We've said it many times already, but this is a very, very interesting year for virtual reality devices and all the related technologies. A lot of things are happening, as we can see with Meta and Immersed. The funny thing about Immersed is that we don't know how many orders they have for the expensive one, but it's interesting to see that quality and market fit pay off for them. One related thing I saw: news about the Apple Vision Pro saying users find it very heavy, especially at the front. That seems to be quite a big issue for the Vision Pro.
One thing I forgot is that Meta announced that next year they will release the real competitor to the Apple Vision Pro. However, they mentioned that there won't be many units available, and that the price will be much higher than what we are used to with Meta. It's very interesting that they are now positioning another headset dedicated to competing with the Apple Vision Pro. Sorry, Seb.

Will it be the Pro version?

No, apparently not. The Pro will be dedicated to VR, and this new headset will be dedicated to mixed reality. And it's not the Ray-Ban one we talked about last week or the week before; it's a completely new one.

Okay, so they are already planning four or five headsets. One very cheap at $200, something like that?

I heard $300, but that may be in Canadian dollars, not US dollars.

And the Quest Pro 2, something like that?

Yeah, the Pro 2, the Quest 3, and those still-unknown glasses that would be the mixed reality ones.

Okay. Not sure if this is a great strategy, but yes, they are still investing a lot of money. It's amazing, your news. I really wonder if the strategy of making a $200 version of something similar to the Quest 2 makes sense. I guess the Immersed news can be related to this will of Meta to make cheaper headsets, because they want more users and to bring their community to another level. But I don't think that's the issue here. We've already discussed this: I don't think a $200 or $300 headset will make any difference, because at that price tag you won't have the mixed reality, you won't have the comfort that we get with the Quest 3. If I remember correctly, the cheaper version was the Oculus Go, with three degrees of freedom, made for trade shows and so on. It had some success on the professional side, but it was not a success with the general public.
I'm not sure that bringing out a cheaper version, and especially announcing it now as you're releasing the Quest 3, is wise: people will wait for it, and you're crushing your curve of new users at the launch of your headset. I'm also not sure newcomers want those cheap versions. I don't know; it really depends on the specifications. If the specs are good, maybe they will take users away from the Quest 3, and if they are not as good, there won't be any users for them at all. Once again, we don't really understand their strategy here: multiplying headsets, working on the low-cost side while going high-end on the other. It's very, very weird. We'll see.

Exactly. Do you have anything more to add? A new question for Fabien about the Quest 3, maybe?

I have one quick thing. We spoke about the Unity pricing catastrophe that happened over the past weeks, and today we learned that the CEO of Unity has stepped down. It's very unfortunate for him, but I think it was the only solution for Unity to try to win back the trust of the developers and the community. It will be very interesting to see how that unfolds in the coming weeks, and who will be named as a replacement.

I think it was expected that this would happen. Unfortunately for him, but I think it's a good thing. They were at some kind of dead end with the situation: they tried to correct their announcement, but people were really pissed off and wanted the head of the CEO, so they got it. Now we'll see, as you mentioned, what happens and whether there will be drastic changes of direction. I guess some people are quite happy with the announcement, but we will see what it means in the future. That's great news. Great news, anyway. Great news!
Credits

Podcast hosted by Guillaume Brincin, Fabien Le Guillarm, and Sébastien Spas.
Lost In Immersion © {{ year }}