A few weeks back we talked about prototypes from Meta that would be showcased at SIGGRAPH. SIGGRAPH was last week, and a few users were able to test one of these prototypes. What I'm sharing now is a report from a user who tested the prototype called Butterscotch, which features three genuinely new innovations that, at least according to the article, greatly improve headset quality. The first one is what you're looking at right now. In a headset you have a lens, and the lens distorts the image on the screen, so the software pre-distorts the image so that you see a flat picture. But that correction is computed from the center of the lens, and your eyes are usually not exactly at the center of the lens, so distortions appear because of that. So the first feature is a variable undistortion that adapts depending on where your eyes are relative to the lens center; you can see how it works in the video. The second one, which we talked about a few weeks back, is varifocal. All headsets currently on the market have a fixed focal distance, which is why, when an object is close to you and you look at it, your eyes can't focus on it. Here, the display actually moves depending on which object your eyes are looking at. The tester mentioned a little delay: the time the headset takes to react and move the lens is apparently a bit noticeable, but he said it's a really impressive feature. The last one, shown here as a comparison, is what they call near-retina resolution: they achieve 56 pixels per degree of vision, which is very close to the roughly 60 pixels per degree the human eye can resolve. On this one, I'm not sure how they achieve it.
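To make the first feature concrete, here is a minimal sketch of eye-relative undistortion, not Meta's actual pipeline: it assumes a simple two-coefficient radial model (`k1` and `k2` are hypothetical calibration constants) and an eye position supplied by eye tracking, and it simply re-centers the radial correction on the pupil instead of the lens center.

```python
def undistort_point(p, eye_center, k1, k2):
    """Radial correction computed about the tracked pupil position
    instead of the fixed lens center (normalized screen coordinates).
    A real renderer would apply this per-pixel in a shader, refreshed
    every frame as the eye moves."""
    dx, dy = p[0] - eye_center[0], p[1] - eye_center[1]
    r2 = dx * dx + dy * dy
    # Brown-Conrady-style radial term with made-up coefficients
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return (eye_center[0] + dx * scale, eye_center[1] + dy * scale)
```

With `k1 = k2 = 0` the point passes through unchanged, and a point exactly at the pupil is never displaced, which is what "the distortion is computed from where your eyes are" amounts to.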
Maybe it's a different, much bigger lens; the article doesn't really detail how they achieve it. With these three features, the tester was really, really impressed. He says the objects feel almost real when looking through that headset. Let me find the quote... okay, here it is: "Some of the demo objects were so detailed, I'd go as far as to say they felt real." So that's great, I think, for what's to come. It's not for tomorrow, though: the article mentions the end of the decade, so 2026 to 2029, for something that can be produced at large scale. So yeah, some users seem to be pretty impressed by that demo. I'm really curious to know what you think. We'll start with you, Seb, and if you have more info on how they achieved the near-retina resolution, I'm really curious about that as well. Thanks.

I did not check how it was done; actually, I'm discovering the news as you talk about it now. It's great that they achieved a better resolution like that in VR and are making everything more real. That can definitely make things more realistic, but we're still missing the equivalent for mixed reality. On top of that, for a good, more immersive mixed reality experience, I guess we need the same kind of technology developed for the cameras too, so they can also focus where you are looking, which seems like a next step that's even more complex to develop. It's nice to see progress, though it's tough to see that it takes this long to implement, and it will take even longer to get a headset with that kind of technology at a price we can afford, one that can also be produced at scale, as you say. But I'll be very keen to look through it and check how it feels. What about you, Guillaume?
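The 56-versus-60 pixels-per-degree comparison can be sanity-checked with back-of-the-envelope arithmetic. A rough sketch, using the common average approximation ppd ≈ horizontal pixels / horizontal FOV (the panel width and field of view below are illustrative, not Butterscotch's actual specs):

```python
import math

def avg_pixels_per_degree(h_pixels: int, h_fov_deg: float) -> float:
    """Average angular resolution across the horizontal field of view."""
    return h_pixels / h_fov_deg

def pixels_needed(target_ppd: float, h_fov_deg: float) -> int:
    """Panel width required to hit a target angular resolution."""
    return math.ceil(target_ppd * h_fov_deg)

# A Quest 2-class panel (~1832 px over ~90 degrees) gives roughly 20 ppd;
# holding 56 ppd over the same FOV needs a panel almost 3x wider.
print(avg_pixels_per_degree(1832, 90))  # ~20.4
print(pixels_needed(56, 90))            # 5040
```

That gap is why "maybe a different lens" is plausible: shrinking the FOV concentrates the same pixels into fewer degrees, raising ppd without a denser panel.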
I was doing some quick research to answer Fabien's question, but I couldn't find anything about this specifically, even though it's close to what the Varjo headsets achieve as well. So the question here is about price, because if Meta can do nearly as well as Varjo is doing right now, the only way to make it more accessible is price; as a reminder, the Varjo is about $6,000 right now. They would need something cheaper to be competitive. One thing that's quite interesting about the picture we're seeing right now is that they're comparing their new headset to much older generations. Why didn't they showcase what the Quest 3 is doing, for example? The Quest 3 is supposed to be released by the end of the year, so it would be interesting to finally know what they're doing with it. And I'm quite surprised that they're showcasing early-stage prototypes like the varifocal one, or the other one we saw a few episodes back, maybe last week, and not saying much about the Quest 3. It's very surprising. I don't know if you have more intel on why we don't have more information about the Quest 3. Maybe they're ashamed of it, I don't know, but it's very, very surprising.

I didn't look into it, but I saw an article in the news today about the Quest 3, with an expected release date and things like that, so maybe it's a topic for next week. I'm sure Meta is really betting on the Quest 3. Oh yeah, you think it's a live-or-die kind of bet? I don't know. And concerning competition with Varjo with this kind of headset, I think they could sell it at a price, but they're not yet anywhere close to Varjo's $6,000. Also, they had a lot of failures with the Quest Pro, trying to sell it to professionals and businesses, so I think they're taking their time before going back to that area, I guess. Yeah.
My underlying question is: what's the point of making a $10k headset with varifocal technology if the Varjo is already near that at that price? If they want it to be recognized on the market for its technology, it shouldn't be roughly the same resolution as the Varjo; it should be exactly the high resolution. That would be impressive. And as they're showcasing right now, they're not there yet. Maybe the point is to show their progress and how close they are, but once again, I'd say it's something of a failed presentation, because they're showcasing something that should be released in six to eight years, and it's basically the same technology we have today with the Varjo, only less complicated. It's great technology, but once again, how can it be used at large scale? I don't know.

So the Varjo has varifocal? No, they're using another type of approach. They have eye tracking, and then a dynamic adaptation of the resolution. I don't really know how it works, so I don't want to say anything stupid, but they can concentrate the resolution where you're looking and get near eye resolution. It's not mechanical; it's software plus optics. Okay. Yeah, and also, of course it's a prototype, but when you see the size of the headset, it shows it's not ready for mass production. Maybe, out of the three innovations they're showcasing in that headset, they'll pick one that can be ready in a few years, and another one after that, depending on what their priorities are. But to my knowledge, and maybe you have other info, when we look at the possibilities for innovation in VR headsets, these are the most up-to-date, most innovative features I've seen.
Maybe there's something else, I don't know, about resolution, size, or form factor that could lead to innovation in that field. Yeah, we can't take that away from them: they're doing real innovation, which is pretty rare in our field, in the VR world. Meta, Oculus, and their teams are still trying to improve the concept, and that part is very interesting. Yeah, it's the execution that is a bit... Communication, execution, it's questionable. It seems like the lab works on its own and has its own marketing team, independent from whatever Meta does on the business side.

I would really like to see how far they are on force feedback, like gloves and that kind of thing. That's still what's missing for me in the experience; that's what needs to move forward, and I think it could move forward more quickly, and it could really make the experience more immersive. Because right now, I feel like you can adapt to the resolution, even if it's not high-res. I was okay with the Quest 1; the Quest 2 added a bit more quality; the newer headsets are even better. But it's really the experience that makes me want to go back and play or do it again. So it's more about content, and what's missing right now is force feedback on some of these experiences, to make them really impressive, to make me want to come back, or to really use them professionally.

Well, I guess they're not even thinking about that right now, because they're focused on the idea that if they don't have that many users, it's because of the headset. Until they change that way of seeing the market, or seeing their technology, I guess they won't do much beyond the optics, because usually the other steps come afterwards. They really need to understand why they're not growing as big a community as they expected in the first place.
And we all know it's about content. Yeah, it will be interesting to see what Roblox brings to the table, and this new game studio they have in-house. We're looking forward to seeing some metrics, and whether the game gets traction. Yeah. Well, I don't know if we mentioned this, but Roblox had something like 5 million users in the first week. It's very impressive, but I don't know about the long term. It's often the same: it's the novelty effect, people download the app out of curiosity, and then they just remove it or stop using it. Like Threads: it had a lot of traction the first days, and then the users just deleted the app. So we'll have to see what Roblox does over the long term. Okay. So Seb, do you want to talk about your subject?

Sure. Mine continues the discussion from what I showed last week about the shadow projection experience, where a kid went inside a room with projections and could interact with shadows that were displayed dynamically. Here it's Moment Factory, which did a couple of experiences with body tracking, projection, and interaction. The first one is quite simple: they detect the user in front of a screen and let him move some particles. It was shown at SIGGRAPH. But they did a much more impressive experience with an audience, where a whole audience can step under a projector and dance or interact together, guided by what is projected on the floor and how they move. So it's really body tracking of many users in the same space, with projection on the walls and the floor; here I think it's even an LED screen on the floor. Like I said last week, I find this kind of experience really immersive, and an audience can look at it, see what the participants are doing, and understand it, unlike a VR experience where you are alone and you can only show what the user sees inside the headset.
And here they even added a ping-pong game: the same concept, where a user can move left and right to control their paddle and play. Another user can join and use their mobile to vote on what the next power-up will be, or change how the ball reacts in the game. I find this way of running an event, as an experience, really immersive; it's quite nice. Like I said, an audience can look at it, understand what is going on, vote, and participate, even with just a mobile phone. There's a lot that can be done. I've wanted to do that for a long time, so it's nice to see it deployed in such a nice way. I don't know what your thoughts are.

There are two things I find interesting there. The first is that on the first example you showed, which they presented last week at SIGGRAPH, the motion capture they're using is move.ai, and they don't have any depth sensor. I think they're using one phone, or maybe two or three, I forget how many, but they're just using standard phones pointed at the user, and in real time, using AI, they transpose that into a skeleton animation. That's the move.ai technology, and it's really impressive. They've been around for a while, but it was post-processing only until recently; about a month ago, I think, they released a real-time solution. That is really impressive to me, and hopefully it will reduce the friction of adding body tracking to interactive experiences, because hopefully we'll be able to add this feature with just a phone. The second thing is what you showed with the other user joining from their smartphone. I think we can expand that to many experiences, even in VR. Say you're with friends and you have only one headset: it's a very personal experience, only one user. But if your friends can join the game or the experience from their smartphones, then you have a really collective experience.
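The second-screen voting mechanic described here, where spectators' phones pick the next power-up, boils down to a tiny tally. A minimal sketch (the class, option names, and one-vote-per-device policy are all made up for illustration; a real deployment would sit behind a WebSocket or HTTP endpoint):

```python
from collections import Counter

class PowerUpVote:
    """One vote per phone (a device's last choice wins);
    the most popular option is applied to the game."""

    def __init__(self, options):
        self.options = set(options)
        self.choice_by_device = {}

    def cast(self, device_id: str, choice: str) -> None:
        if choice in self.options:  # ignore malformed submissions
            self.choice_by_device[device_id] = choice

    def winner(self):
        tally = Counter(self.choice_by_device.values())
        return tally.most_common(1)[0][0] if tally else None

vote = PowerUpVote({"speed_up", "multi_ball", "shrink_paddle"})
vote.cast("phone-1", "multi_ball")
vote.cast("phone-2", "multi_ball")
vote.cast("phone-3", "speed_up")
print(vote.winner())  # multi_ball
```

The point is how little machinery the "audience participates by phone" idea needs: the hard part of such installations is the body tracking, not the second screen.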
I think Meta showed this feature in their latest mixed reality demo; I saw that a couple of days ago. So yeah, being able to have a collective experience even when there's only one headset is, I think, a key feature for interactive experiences. Those were my thoughts. What about you, Guillaume?

The thing is, the execution is great on the technology side. Yes, they're using move.ai, but at some point it's the same use case that has been done for years with Kinect and projectors or LED panels. So despite the fact that you can use AR to be part of the experience, pretty much all the use cases shown are not that innovative, to my knowledge. It's always very nice and very well integrated, but regarding what Seb said about people around being able to understand the scene, I'm not so sure, because we appreciate the experience here because we have a high viewpoint on what is going on. When you're on site, if you're not that tall, you can't really see what's happening on the floor. Maybe they have a global view on some screens for people to follow. But I'm not really that impressed by the setup itself. It's well executed, once again, but it has been done before. Of course it's better now, because we have better technology, but projected interaction and body-tracking features are very common now, especially in some playgrounds, where you can see this kind of thing all around. I also saw a go-kart company, I guess in Europe as well, where you can have a kind of Mario Kart experience on a kart track, with inputs on your steering wheel, and you can throw bonuses and so on. And that's a really interesting approach.
They just have an empty space, and depending on which racetrack you want to project, you can have different patterns and concepts. I found it very, very interesting. They're using electric go-karts, so it's completely noise-free and everyone can enjoy the experience. And this is quite interesting because we were approached back in the day, around 2015, to do this kind of concept, and it's funny to see that concepts imagined that far back are now fully available to the public. I'll try to find some pictures of it at some point, if you want.

I'd like to mention, I'm from Belgium, and yes, it's in Belgium; I saw it four years ago. It's quite close to my place, and it was nice to play Mario Kart like this: being able to grab power-ups on the floor, being hit and having your electric kart stop. Linking that to the projection on the floor, and even having the racetrack change between races, was very nice. So I guess it's not that new, but I still think the way they integrated it is nice. And it's like a first step: I think a lot of ideas could come from mixing augmented reality with this kind of projection and interaction at an event. And like you said, if you want people to have really good visibility, you may have to raise them above the area you're projecting on, or display things behind the user so everyone can see what is going on, like in Street Fighter, where you see the background and a bit of the floor, rather than something only projected on the floor, like the ping-pong game. Some nice content can come out of that, I'm pretty sure; that was more my point.

Yeah, I totally agree. It's very nice to see, similar to what we discussed last week about the water slide, that you can enjoy the activity by itself, and you can also add a digital layer on top of it.
And it's nice to see a digital layer on top of karting, a digital layer on top of ping-pong. It's really interesting to see these developments. Yeah, this is one part of the metaverse approach, to bring back that word. And of course, if you had AR glasses on you every day, you could imagine this kind of interaction with a whole bunch of everyday activities: pool, ping-pong, tennis, whatever.

Okay, great. So I'll move on to my topic, and it's Meta again. There are two different things I'd like to discuss about Meta. The first is that they published best practices for mixed reality. Basically it's a preview of what the Quest 3 should be capable of: they're projecting what kind of interactions you should build with the headset, especially mixing hand tracking with the mixed reality / AR features, so catch-and-grab and basic interactions, and how to merge the real and the virtual. They already have some very interesting ways of doing this. I don't know how it works technically, but they're showcasing some interesting visuals of what they'd like us to do with the Quest 3. Given that, you could imagine that the Quest 3 will be their main device for the AR / mixed reality side, and if they're that confident in their headset, it should be their best device for it, the competitive counterpart to the Apple Vision Pro. However, they just announced that they're about to release about 1,000 AR devices next year, apparently military-grade, and they'd like to show those 1,000 AR glasses to the world, to developers, to showcase what they're doing in AR. So what are they really doing? Because with the Quest 3, they're announcing that they have in hand the competitor to the Apple Vision Pro, doing video see-through AR, and on the other side, they're working on AR glasses, maybe true see-through ones.
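The catch-and-grab interactions mentioned in that best-practices document usually reduce to thresholding a noisy hand-tracking signal. As a hedged illustration (this is not Meta's SDK API, and the threshold values are made up; most hand-tracking runtimes expose some 0-to-1 pinch strength per hand):

```python
def update_grab(pinch_strength: float, grabbing: bool,
                grab_on: float = 0.85, grab_off: float = 0.60) -> bool:
    """Hysteresis keeps a jittery pinch signal from flickering between
    grabbed and released: engage at a high threshold, release at a low one."""
    if grabbing:
        return pinch_strength > grab_off
    return pinch_strength > grab_on
```

Called once per tracking frame, the two thresholds mean a pinch reading of 0.7 keeps holding an object that was already grabbed but won't pick up a new one, which is the kind of forgiving behavior those guidelines recommend.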
And if they reach their goal of doing efficient mixed reality with the Quest 3, why are they building AR glasses with perhaps other technologies? They also announced that they changed their technical roadmap, reducing the number of features in their AR glasses because it would be too expensive, and they want to stay competitive against the Apple Vision Pro. So I really don't understand their strategy toward the adoption of mixed reality right now. And as you mentioned earlier, Fabien, is this just a bet on the Quest 3? Did they just add some mixed reality features to follow the Apple Vision Pro trend? Maybe in the first place it wasn't meant for that kind of interaction, and they did this to plant their flag on mixed reality, to be able to say there's a $500 headset that does basically the same as the Apple one, just to grab some market share. I'm really curious to know what you think about this rather odd strategy.

Not much reaction... Yeah, maybe let me clarify a bit what I meant by betting. I didn't mean it's their last move, or that they're putting all their cards on the table. What I meant is that I think they really believe in the Quest 3, in its success and its capabilities for mixed reality. That was my meaning of the word betting. And about the AR glasses, to me these are two different usages and two different technologies. I won't walk around outside every day with my Quest 3, but maybe I will go outside with AR glasses. So I think I'm a bit less confused; I don't really see the clash between the two. Of course, ideally, in 10 or 20 years, we'll have a marriage between these two technologies.

Don't you think they're misleading people with this? You know, normal people: they're selling the Quest 3 as the mixed reality / AR device.
And then they'll release another device saying, no, no, that one isn't AR, it's mixed reality, but this one does AR. At some point they're just breaking their own strategy and their own market, because they're dividing people. And as we know, mainstream users really understood what AR is through the Apple Vision Pro demonstration, and now you're sending different messages. I don't think it's a good strategy to make different devices and announce them before they're fully available. In some people's minds, the Quest 3 could be the same thing as the Apple Vision Pro. Yeah, I don't get it.

For me, it's what they have done for all the previous headsets: every time, they update the SDK and make it even better. I've seen this kind of video working with the Quest Pro, so I think it's already usable there. They're letting developers know they can use the Quest Pro to develop this kind of interaction already, and that it will be available on the Quest 3. I think that's what they meant. Now, as I said, I did not see the news about the AR glasses; I'd like to see that, so if you can share it, that would be great. And like Fabien said, there are two different ways to go. VR has its own use cases, which are great. But if, like Fabien said, you want to use augmented reality glasses, you want see-through, and you want other people to see your face, not the way the Vision Pro does it with a screen, so they can see what you're doing, what you're thinking, how you react to what they're saying. And also, you don't want to look like an asshole with a huge ski mask on your face. Yeah.

So, I agree. Actually, Guillaume, I completely understand what you just explained. On our side, we understand the difference between the AR glasses and the Quest 3.
But the public: many, many users, as you said, discovered VR and AR with the announcement of the Apple Vision Pro. So I understand what you mean: Meta is sending maybe two different messages that might confuse the general audience. That I understand, yeah.

Maybe three, with their new prototypes and the varifocal motors, all in a very short period of time. It's good that we're talking about Meta, but they're going everywhere, sending, as you said, different messages and different devices all around. I don't think they did this in the past. I know they were always working on different iterations of their prototypes back when they were Oculus, but at this point they're shooting in every direction at the same time. And we saw that with the Quest Pro: they released it quite quickly, and it's basically a commercial failure at this point, because they're not producing it anymore and they just cut the price in half. So, once again, I don't know if this is some kind of urgency on their part, trying to occupy the space as much as they can with lots of information and news about what they're doing and what they'll release. But at some point they're just flooding people's minds and people's understanding of what is going on. It's not a clear message like Apple's, which is: we have one device, it's not ready yet, you'll get it, and it's awesome. Meta is more like: we're doing a lot of stuff, it all does something, and we hope you'll buy it all, maybe, at some point. It's really interesting to see the different marketing approaches: a very calm, very focused one from Apple, and a frenzy of ideas and devices on the Meta side. And it's the same with content: they're doing Roblox on one side, trying to save Horizon on another, working with studios, and so on.
You see, it's buzzing all around on the Meta side. I understood what you meant, Fabien, by betting; it wasn't meant as a live-or-die headset release. But at some point the question will be asked, I guess, because they're still losing money, something like 3.2 billion last quarter just for the innovation division. So at some point they'll have to decide whether to pull the plug or not. Maybe this is their last push to see whether there's traction around what they'll release with the AR glasses, to see what people think and whether they can keep investing. So yeah.

The huge bet, for me, is betting that the VR headset, the way they do it, will be like a mobile phone, an iPhone, or a PlayStation, where you always buy the latest because the quality and performance are better. Here, I feel like most people who bought a Quest 2 are happy with what it does and already don't use it every day, so they don't care about getting a new one. So yeah, I'm a bit worried about this strategy of not making the effort to make one really big step between headset generations, so that people realize they have a really old-fashioned VR headset and there's a new one that's really different from the previous one they had. That's a huge bet. I hope they won't lose it. And just one thing that I saw: can you see the distortion right here, when he's moving his hand in the real world? I don't know if you saw this kind of distortion when you were using the Lynx R-1.

No, and this one is recorded with a Quest Pro, which is why you see that kind of distortion. So I hope the Quest 3, with its two color cameras, won't have this kind of distortion; the Lynx R-1 doesn't. Okay, nice to...
Well, once again, they're showcasing something that should be for the Quest 3, but demoing it with the Quest Pro, and I guess I'm not the only one to have seen this kind of distortion, which is not very flattering. Usually, in a technical demonstration, you show things looking better than they really are in the headset, and here it's the contrary.

They haven't announced or showcased anything on the Quest 3 yet, so we don't have a view of its quality. Like Fabien said, I think they're betting everything on this headset. They're waiting for the final product to make some nice videos about it and show everything at the announcement. For now they're sticking with what they have available, and also showcasing, like I said, what developers can already do with the Quest Pro.

Still, we don't have much information about the Quest 3 beyond one presentation with Mark Zuckerberg. So it's very odd that they're announcing a lot of things, but not the device that's supposed to be released by the end of the year, which is two or three months away. Is it really ready, or are they still working on it and it's getting delayed? Yeah, it's very odd. Okay, so do you have anything more?

Yeah, just one quick thing, to put the numbers in perspective. Meta seems to have lost around 20 billion over the past three years. But if you look at the large language model industry, it lost 25 billion over the last three years, and in the self-driving industry, it's 100 billion invested over the last years. So I find it interesting to put these numbers in perspective across industries and innovation, and how so much money is... Just wasted? I don't know; wasted, or hopefully invested, not wasted. Well, thank you for that information. So they're not losing that much money after all.
Well, to nuance that a bit, it's only Meta, so it doesn't include, you know, Lenovo and the others. So maybe it's a bit more than 20.

Lost In Immersion

Episode #{{podcast.number}} — {{podcast.title}}


Credits

Podcast hosted by Guillaume Brincin, Fabien Le Guillarm, and Sébastien Spas.
Lost In Immersion © {{ year }}