Welcome to episode 19 of Lost in Immersion, your weekly 45-minute stream about innovation. As VR and AR veterans, we will discuss the latest news of the immersive industry. So let's go. Fabien, if you want to start, please.

Yeah, thanks. So today, the topic I want to discuss is a research paper that was published last week by Meta and Seoul University. A bit of background: to have good presence in virtual reality, if you are interacting with someone, it's of course better if the movements of that person are correctly translated onto the avatar that you are seeing — the hands, every movement. Currently it's not very convenient. It works well if you have a motion capture suit, but that's expensive and not everybody can afford it. So what Meta is working on is using only the headset position and the two controllers, plus AI of course, to reconstruct the movement of the full body from just these three sources. Well, three sources — three devices; it's more than three pieces of information. And they show a video of the result, so you can see the movements the real person is doing and the three tracked positions. What they also have is a reconstruction of the environment: the chair, the box, the sofa and so on. As you can see, it's pretty impressive what they managed to do. You can go into a sitting position, a standing position and such, and you can see how it handles stepping over boxes. Of course, it's not perfect. I will try to find it — I think it's at the end of the video — you can see cases where the system fails to manage the position, but it's still impressive. I think it's just a start, because, as you can see, what is needed is to have the environment of the user in VR as well. So if there is a chair that you are sitting on — you saw the avatar; I will play that again, that's funny — you need to have that chair in VR too. And you can see here another failure of the system to get back to a standing position. But anyway, I think it's very encouraging. Usually, for the problems that we have in VR, there are two kinds of solutions: a hardware solution and a software, AI solution. This seems to be a mix of the two. And yeah, that's it. I'm curious to know what you think and if you have comments or things that I forgot. Seb, I'll start with you.

No, it seems to be a clever way to use the headset, the controllers, and the information that we have right now to reconstruct a character's pose. With AI or deep learning, you get movement of the character in 3D that makes sense and doesn't jitter in every direction; it seems to be smooth. I think it would be nice to have that in games right now — avatars are mostly floating in the metaverse games that exist out there. So it's nice to see progress on that.
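To make that kind of sparse-input body reconstruction concrete, here is a minimal sketch in PyTorch of the general idea: feed a short window of headset and controller poses into a small recurrent network that predicts a rotation for every joint of the avatar. The joint count, the input encoding, and the network itself are illustrative assumptions, not the architecture from the paper (which also conditions on the reconstructed environment).

```python
# Minimal sketch (not the paper's model): map a short window of headset +
# two-controller 6DoF poses to full-body joint rotations with a small GRU.
# Shapes and joint count are illustrative assumptions.
import torch
import torch.nn as nn

NUM_JOINTS = 22          # assumed SMPL-like skeleton
IN_PER_DEVICE = 9        # 3D position + 6D rotation representation per device
NUM_DEVICES = 3          # headset + left controller + right controller


class SparseToFullBody(nn.Module):
    """Predict per-joint 6D rotations from sparse tracker signals."""

    def __init__(self, hidden: int = 256):
        super().__init__()
        self.encoder = nn.Linear(NUM_DEVICES * IN_PER_DEVICE, hidden)
        self.temporal = nn.GRU(hidden, hidden, num_layers=2, batch_first=True)
        self.decoder = nn.Linear(hidden, NUM_JOINTS * 6)

    def forward(self, trackers: torch.Tensor) -> torch.Tensor:
        # trackers: (batch, time, NUM_DEVICES * IN_PER_DEVICE)
        x = torch.relu(self.encoder(trackers))
        x, _ = self.temporal(x)
        return self.decoder(x).view(*trackers.shape[:2], NUM_JOINTS, 6)


if __name__ == "__main__":
    model = SparseToFullBody()
    window = torch.randn(1, 30, NUM_DEVICES * IN_PER_DEVICE)  # ~0.5 s at 60 Hz
    rotations = model(window)
    print(rotations.shape)  # torch.Size([1, 30, 22, 6])
```

In practice a system like the one discussed would be trained on motion capture data, with the predicted rotations driving the avatar skeleton each frame.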
Yeah. What about you, Guillaume?

My reflection on that is: we all agree that the idea is to recreate the body of the user so that you can see yourself in VR with your headset, without using any other expensive trackers. The thing is, okay, they are making some progress, but before they can make it a reality for users, I guess there's a huge gap. I don't know if you remember the time when we were doing inverse kinematics everywhere with VR headsets, especially at the beginning. Newcomers to VR found that not seeing their legs or their arms was very troubling. Well, now people are used to that, so it's not that much of a problem. But we know that using this kind of prediction as a replacement for real tracking is not that easy, and it may cause more discomfort to the user than not seeing anything at all. I guess this is what is going on here. At some point, maybe we shouldn't be so eager to recreate exactly the same movement, because if you are trying to be realistic and it's not quite right, a little delay or mismatch between what you are doing and what you are seeing is very, very troubling. This is what Meta is doing, by the way, with Horizon, for example: they chose to cut the avatars in half and not show the legs, because they thought it was more comfortable for people. However, when you are using VRChat, even in normal mode without any additional trackers, you can see your legs, and the legs are not trying to be realistic at all, they are just moving around. Even if it's not that advanced, I guess it works, and it's more comfortable to see your legs and to see the whole avatars of others. I don't really understand why Meta is so obsessed with not showing the legs, and I guess this gives us some clues about what they are after: the absolutely exact movement of the user. Well, these are my thoughts on this, and I'm still thinking that they could get there much faster just by adding some small trackers on the feet. It shouldn't be that expensive, I guess, to make these additional trackers. You can see that Vive made these inside-out trackers that can be used on their own. I don't know why Meta is making it so difficult for themselves by using AI and so on, when they could just add two more trackers for the feet and make it much easier, especially with this kind of technology.

I think they are still trying to address the market. They are asking themselves a lot of questions and trying to understand why people are not coming back to their metaverse experience. They are trying to find a way to do it with what they've got, without users having to buy anything else, while making the metaverse environment more realistic and nicer to look at.

Yeah, but is this really what people want? That is always the question. Meta always answers for people with its own view of what the metaverse should be. I guess they should listen more to the community and to what people are asking them to integrate before they are willing to go into the metaverse. Because in VRChat, the community has been using those Vive trackers for years now, and they are very, very happy with them, and they have full-body tracking as well. Apparently the answer is out there, and Meta is not listening. They are focusing on newcomers, on what newcomers might expect, and on what Meta wants them to expect. So there is a different approach here.

Yeah, I totally agree with that. One place where I can see a good application is location-based virtual reality or mixed reality experiences, because I tried one a few weeks back here, and the avatars were behaving really weirdly. It was a five-person experience in virtual reality, and you could see really strange positions of the hands, arms, and so on. It was really disturbing to see, and it kind of took me out of enjoying the experience. So I think having smoother movements of the others in such experiences, even if they are not accurate, would be beneficial.

Yeah, I agree. We did Eternal Notre-Dame in Paris.
We experienced one part together as a group: you move as a group, you see the other people, and you can talk with your group during the tour. And yeah, same thing, the avatars were behaving strangely. At least they gave you the position of your colleagues in the room, but it was weird to see the movement of the avatars themselves. So I totally agree with you on that. But is that market big enough for Meta to invest so much, instead of, as you were saying, Guillaume, focusing on the users of Horizon?

On that point, I guess the number of users is still decreasing. I know it was free-falling from the beginning of 2022, and with all the metaverse controversy and so on, I guess they don't have a very strong community. I don't know what the numbers are right now for Horizon, but I'm guessing they are not that good. I do wonder about Rec Room and the other metaverse platforms that had a community at one point. Its user numbers are increasing, as far as we know; it's one of the platforms that is booming, for one reason or another — I don't really get it right now. It's funny, because it takes the same approach Meta had, with those cartoonish, legless avatars. But since they are focusing on games, and the games seem to be enjoyable, apparently there is a community growing there. And Apple's choice to be compatible with it seems a reasonable one. At least for now — maybe when the Apple Vision Pro is released, the community won't be as big as it is right now. But it is taking the Roblox road, so I can expect a huge community in the upcoming months.

Okay, cool. Well, I think it's a nice transition to the next topic. My topic was the information released last week about the Vision Pro not being able to do room tracking: not being able to recognize where it is in a specific room and reposition itself, or to let you move through a big space and place an element more than 1.5 meters away from the initial point. So it seems to be a device that you use from one standing position and can't really move around with. That's weird to me, because being able to move around is what really makes sense to me, and here they are limiting that directly in the code. They are also talking about a limitation on speed: if you move too fast, they will cut the mixed reality experience, fall back to a VR view, and tell you that you have to move more slowly to get back to the mixed reality experience. I guess that's to keep the experience consistently good and avoid anything shaky or weird for the user. But still, that's a huge limitation on what can be done with the device. So what do you think of that, guys?

I think it's a surprise as well. When I saw the news, I was wondering. You mentioned that one reason might be that they want to preserve quality, so things are not shaky or twisting — sometimes that happens on a mixed reality device; well, oftentimes. But I wonder if there is another reason, like safety, because they don't want people to bump into a wall. I don't know if you have more ideas about exactly why they did that, or whether it's just a limitation for the beginning, something that will be lifted afterwards. I'm really curious to know the deeper reasons for it.

Just my take on this: there was another, additional piece of news about it. They are completely confirming that they won't do any additional trackers, especially controllers,
and that they won't support third-party manufacturers either. So there is this news plus the other one, and I guess they are completely shutting the door on the VR support that we discussed a few weeks back. It's not really understandable, especially when they say they are supporting the Rec Room application. It's very, very strange, because this was one of our fears: that this headset is only for some kind of AR usage, for people sitting or standing within a very, very small area, not moving around much. It feels like a step back at some point, because we are now very used to 6DoF movement at large scale, with Meta and all those headsets, ever since the Mixed Reality initiative from Microsoft and their headsets, when we got inside-out tracking without trackers all around your room. So why are they doing that? I really don't understand. They are shutting the door on the VR community, the VR world and so on, and for me it's not a very good sign of what the headset will be able to do. I was really counting on developers getting their hands on the code and finding out that they could build more visionary applications, especially in VR. Apparently they can't do anything about it, because everything is locked down, so they will lose a lot of VR developers who were willing to create very interesting things on this headset. I don't know if it's about guaranteeing high-quality content, but to that I would answer that inside-out tracking today already gives a great experience in VR; we don't have the shaky tracking issues we had in the past. Or maybe the sensors in the Apple Vision Pro are not that good, or maybe they don't master inside-out tracking yet. It's very weird. So I don't really understand their marketing strategy here. The more time passes, the more anxious I am about what this headset will be.

I will try to play devil's advocate. Seb, you said it's 1.5 meters from the initial position, right? So that's a circle of maybe 3 meters across. I don't know about my room here — I don't have 3 meters of width to move in. So this kind of usage, where you can move but not that much, is maybe one of the most common use cases for VR: people at home, in a small area. That's me trying to defend Apple, but I'm not really convinced myself.

Yeah, I agree with you for VR. But for augmented reality and mixed reality, usually you want more space.

Yeah, we saw with HoloLens that they offered room scale or even bigger spaces. I remember that, even with the first version, you could move around a whole building, come back to one room, and see your augmentations back where they were. One other point that I would like to discuss with you: we know that Apple is targeting coaching and well-being with their headset at some point. So beyond doing yoga, maybe at some point they were planning more intensive sports with the headset. With the weight of the headset, this lack of tracking, and, as you were saying, the fact that it kind of shuts down if you move too fast — only the mixed reality part, but right now sport in mixed reality is a reality as well — you could say the VR part is off the table too. So it's very weird; it's not consistent with their vision, I guess. Maybe they found out that it's not as powerful as they hoped, or it's just a "prototype plus plus" for them to show, and there will be something better in the years to come. I don't know.

So we all agree: it's weird. We don't like that news.
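To make the reported restriction concrete, here is a hypothetical sketch of what such a fallback rule amounts to: leave the mixed reality view when the wearer strays beyond a radius from the starting point or moves too fast. Only the roughly 1.5-meter figure comes from the report discussed above; the speed threshold, the data types, and the function are assumptions for illustration, not Apple's API or implementation.

```python
# Hypothetical illustration of the reported limits (not Apple's API): drop out
# of the mixed reality view when the wearer strays too far from the starting
# point or moves too fast.
from dataclasses import dataclass
import math

MAX_RADIUS_M = 1.5        # reported limit from the initial position
MAX_SPEED_M_S = 1.0       # assumed threshold; the real one is not public


@dataclass
class HeadSample:
    x: float
    y: float
    z: float
    t: float  # seconds


def should_fall_back(origin: HeadSample, prev: HeadSample, cur: HeadSample) -> bool:
    """Return True if the session should leave mixed reality."""
    dist = math.dist((cur.x, cur.y, cur.z), (origin.x, origin.y, origin.z))
    dt = max(cur.t - prev.t, 1e-6)
    speed = math.dist((cur.x, cur.y, cur.z), (prev.x, prev.y, prev.z)) / dt
    return dist > MAX_RADIUS_M or speed > MAX_SPEED_M_S


if __name__ == "__main__":
    origin = HeadSample(0.0, 1.6, 0.0, 0.0)
    prev = HeadSample(1.2, 1.6, 0.0, 9.9)
    cur = HeadSample(1.4, 1.6, 0.8, 10.0)   # ~1.6 m from the origin
    print(should_fall_back(origin, prev, cur))  # True: outside the 1.5 m radius
```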
Yeah, the second piece of news was about this AR monocle, which seems to me like an upgrade of the Google Glass: a screen that you have in front of one eye. And I saw that the Oculus founder backed this company and invested in it. So it looks like this. It's weird to me; it needs, I guess, to be integrated more. But they say it's only 15 grams, so not heavy at all. You wear it on one eye only, and as you can see there's no camera and no tracking, so it can only display information. It gets information from your phone; it streams information from your phone to the device, so you can get maybe your GPS location, weather information, that kind of thing. The Google Glass didn't do much more than that either. So yeah, I wonder what you think about that kind of device.

I think what they are trying to do is to build a personal assistant, because I think they hook it up to ChatGPT. So yeah, I think the usage is very, very limited, as you were saying. It's a Google Home or an Alexa that you wear on your eye to get the results, so it's a display. But I don't know; I'm really wondering if people will wear that. I mean, we all remember how people reacted to the Google Glass. Then again, now we all have this kind of thing in our ears. So I'm really wondering about the adoption, actually, and the usefulness. Do you have any information about the price, or is it still at the prototype stage?

It's $349.

Okay. Well, you know I'm a huge fan of Google Glass, so I don't really see the point here; maybe it's just a phase one, hooked up to a different thing. What is interesting is the small size of it; the technology behind it is not that innovative. Well, maybe we'll see what people can do with it, but I don't really see what we can do with that. Something that is interesting in the article that you are showing — I'm talking about the article itself — is that they compare it to the Magic Leap 2 and the Apple Vision Pro, but it's not even a real comparison. Interesting. Yes, they don't want to compare it to the Google Glass, because it could have the same future.

There's the idea of integrating that into a more full-fledged setup, with two of these, one on each eye, maybe. I don't know. Maybe as a first step, why not, if they have a more advanced roadmap for the months or years to come — then it can make some sense. But if they just want to sell this as it is and see what comes of it... Once again, it's one of these innovation companies, like Ultraleap for example: they created a very nice piece of hardware without knowing what people would do with it. On the Ultraleap side, they were lucky with VR and so on; people picked up their product and found a way of using it. But I don't know if you remember the first years of Leap Motion, under their old name. They were just saying: it's a very good tracker, please use it. And people were asking: what do you want us to do with this? I guess a lot of companies are doing this out of passion, creating new devices, especially in haptics, without doing a market study or knowing exactly what people will do with them. "We are just launching a great new product, please tell us what you will do with it." It's a very risky choice. I guess it's another reflection of the economic situation.
We can see that maybe some of these Silicon Valley companies will shut their doors in the upcoming months. As we know, a lot of Silicon Valley companies are not making any money, and investors are not investing as much as they did in the past, so many companies will simply fall, because they never did these market studies and never found their customers. Maybe we will see a new way of doing innovation, where you have to find somebody to buy your product at some point — which is not the case for everyone right now, because some just have money for building and creating, without any pressure about what they will be selling and how much. So maybe a new wave of innovative companies in the next few years.

Interesting. Do you have anything more to add to this? No.

On my subject: there was an interview with our friend Palmer Luckey. It's a very long interview, more than one hour, and there were some interesting things about the Apple Vision Pro and about Meta, of course, as he is the original creator of the Oculus headset. On the Apple Vision Pro, he said that they did the right thing: the positioning of the headset as a high-end VR headset is the right one, and their approach to the market is the right one. For him, the price is not an issue; at that price there will be a community of developers getting their hands dirty, finding new ways of using it and, hopefully, innovative ways of building applications for this headset. Apple's strategy holds up, in his view. He didn't speak about this room-scale limitation — maybe he didn't have the information at that point — so we should look out for a possible shift on this from Palmer Luckey, which would be very interesting.

Then there is a whole conversation about Meta and their approach to the development of their VR headsets. He talks about the new iterations, especially the Quest 3, and the fact that at some point they were making their headsets evolve, especially with eye-tracking: there is the Quest 2, then the Quest Pro with eye-tracking integrated, and when you get to the Quest 3, they are not doing eye-tracking anymore. For him, that is a failure, because they are not improving their headsets as they should, by investing in innovation and bringing it down to the lower price point. Eye-tracking is a failure for Meta because it is not well integrated and they are not putting it in the lower-priced headset for everyone to use. That is one thing. The other is their way of trying to build a community. For him, Meta is just playing the price-tag strategy, and he mentions that even free is not cheap enough: what he wants to tell us is that even if your headset is 50 bucks, if the technology or the community is not there, nothing will happen at all. What he would have done with Meta is maybe not race to the lowest price tag and make it as affordable as possible, like Meta is doing right now, but rather invest in well-integrated innovation, like eye-tracking, for a specific community that would build up the whole Meta Horizon — creating a strong community and real engagement from the users, so that at some point it would be strong enough, with enough users, to lift the whole project and make the metaverse a reality. So for him, there is a lot in there about what works and about how to build a whole initiative that would last for years and years.
We talked about Second Life last week; this is exactly that case. They have a strong community of one million users that is still there after 20 years. So we can see that you can build this kind of project and have a lot of users, and price is not the answer to that, basically. Of course, I invite you to watch the whole interview. It's very interesting to see what he's doing right now, especially with AI and the military. But yeah, this is what I wanted to share.

Cool. Yeah, I don't want to criticize, but I guess we already answered that. Yeah, please go on. Yeah, I totally agree. Something that I'm wondering is — if I'm correct — they are both producing a headset and trying to push Horizon, so they have both the software and the hardware. To my knowledge, that's a very specific situation, and a very difficult one. So I don't know if there is another company that can challenge them and bring this to the table, or if what we will see is the rise of VRChat, Roblox and the others, which are not tied to a specific headset and not tied to a specific company. But yeah, I will watch the interview; it's very interesting to hear what he has to say.

Yeah, I wonder why they do not shut down Horizon. I don't see that much integration; it's still limited. But on the price tag of the device itself, I think I can disagree: to have people use it, you had to lower it to get to a good community. So it's more about the investment they make internally than about the device itself, and then trying to make the metaverse stronger right now, maybe by getting the whole company behind it.

Yeah, I guess it's too soon for them to do that, because of the numbers they announced, the investment numbers — okay, the 13 billion or so. We know it's not the real figure for Horizon specifically, since it's shared between their hardware and software divisions, but I guess for investors, announcing that they are shutting it down — yeah, that would be a very strong blow. So they'll do it slowly or silently at some point, like Google did with the Google Glass: it just disappeared at some point; it was still there, but nobody was hearing about it anymore. They won't do it the way Microsoft did, which was more something to reassure investors. So I guess it will die slowly, if not surely. I can't see anything that would bring Horizon back from where it is right now; it has a very bad image. So I don't know what they'll do with it, because they should rename it at some point. Maybe we will see a new iteration.

Yeah. So, anything more to add? No. Okay, so that's a wrap. Thank you both for your time.

Credits

Podcast hosted by Guillaume Brincin, Fabien Le Guillarm, and Sébastien Spas.