Welcome to episode 17 of Lost in Immersion, a weekly 45-minute stream about innovation. As VR and AR veterans, we'll discuss the latest news of the immersive industry. Let's go. So we are not on Wednesday anymore; we are now on Tuesday, which will be the new day for our stream, so please update your calendars. And Fabien, let's go. Cool, yeah, thanks. Okay, so my topic for today is a new research paper that came out last week, a few days ago. The title is Seeing the World Through Your Eyes. Basically, the idea is that by using the reflections inside the eyes, they try to reconstruct what you are actually seeing, just from a single image or from a few images. You can see some of the examples they are showcasing here. Of course, as you can see, it's a very, very rough reconstruction of the scene, but you can see that they have already managed some fairly impressive results. And you see a few more examples here. First, they do some position optimization and some texture optimization. As you can see, it's still at a very, very early stage. It's interesting to think about what this could become in the future, all the good things that might come from it, and also the privacy concerns that might arise if, very far in the future, anybody is able to reconstruct a scene just from an image of an eye. And you see, they tried it on some music videos as well, with not so good results, it seems. But I'm sure that as the technology improves, it will get better and better. Actually, I did a bit of research on the privacy concerns around this, and there have already been cases of stalkers who managed to find a celebrity just by looking at the reflection. There was no technology involved; it was just looking at the reflection in the eyes in a very high-definition picture of the star.
So, yeah, I'm very curious to know what you think about how this could be applied, what the use cases could be, and also, yeah, the more concerning usages of this. Yeah, Seb, maybe? Yes, I was trying to remember the sci-fi movie where I saw exactly that, with a perfect reconstruction of the 3D scene. It's always funny to see where we are now compared to what they show in the movies. And I agree with you on the privacy side. If they manage to get better results, it starts to be annoying if, for each picture we take, we have to remove the eyes or put something else over them to be safe on the privacy front. That starts to be a pain. But yeah, it's interesting to see new investigation into something like this. What about you, Guillaume? Well, I guess we are maybe 10 or 20 years away from most of the CSI episodes, where they were "enhancing" the pixels to find the killer through the rebound of the light in the different windows and so on. Yeah, and the reflection in the eyes. Do you know if they are using some kind of AI at some point for doing that? Or maybe it could be an improvement in their methods? That's a very good point. I didn't go that far into the technology, but I'm sure that generative AI can play into that. Yeah, I heard about what you said. I guess it was PewDiePie, but it was not in their eyes; it was a reflection in a window that let people find out where they were living, and they found their apartment that way from one of their videos or pictures. And indeed, they had to blur any kind of window in their stream after that, to be sure that no one could find out where they live. Yeah, of course, it's a privacy concern, but at some point, what can you do, besides just displaying a blank white picture or an avatar, for example? Maybe it could be one way of bringing the avatar mode into our video conversations. I don't know how you can avoid this.
The good part is that the technology is not there yet, and I really wonder what their motivation is behind it. Why do they want to see the reflection inside someone's eye, besides the spying thing? Did they explain their motivation, or is it just for the pure joy of science and to prove that they can do it, like, as I said, in the movies or the CSI episodes, for example? Yeah, in the paper, they don't explain any usage behind it. So they are just a bunch of stalkers dreaming of finding out where people live from their eye reflections. Okay, okay, why not? And they do seem to be using AI, because they are talking about radiance fields and training radiance fields, so that sounds like AI. So do you have anything more on this subject, or any more concerns? Yeah, mostly concerns. I don't know if there will be a return to sunglasses or something like that, but yeah, that's it for me. It should be even easier to see the reflection in our glasses at some point. Oh, yeah, sorry, just one thing about that. Back to the Apple device: there are cameras looking inward. So, you know, if that kind of technology is used in such a device, I wonder what this could become. You don't see anything, though, because your eyes are inside the headset, so I don't know if there is much to see. Oh, yeah, sorry, I meant in AR glasses, for example. Okay, so Seb, do you want to follow up? Sure. I said I wanted to talk about a small device that just came out. It was announced a couple of months ago, but it just shipped and was delivered to developers, so I got some feedback from one of them. The idea is a really simple headset where you put your iPhone — right now it's not working on Android yet — and there are lenses to relay the video to your eyes so you can see the world in front of you naturally. The device, as you can see, is working, and apparently it works quite okay.
We don't have stereo, only mono, because only one camera on the iPhone is used. But the controllers apparently work great, and so does the field of view that you get with the device, because it uses the wide camera — or you can plug a small lens attachment in front of your camera to get a wider field of view. And then you can interact like this. The only thing that is a bit annoying is that it's mono, not stereo, so you don't see in 3D, you see only in 2D. But apart from that, for a seated experience where you display something in front of you at a certain distance, it seems to work fine. And the quality of the controllers — they are very small. This is the attachment they sell if you have only an iPhone, not an iPhone Pro: you plug it over your camera and get a wider field of view. And they apparently managed to achieve a natural field of view, so when your hand is out of view and comes back into view, you can see there is no distortion of the environment in front of you. So I think that's interesting. Now, if you add the price of the iPhone and the price of the device, it starts to be more expensive than the Quest 3, but at least it exists right now, and the Quest 3 is not released yet. So, I wonder what your point of view is on that. It's very interesting — there was an era of Cardboard, you know, those cardboard builds where we put phones inside for VR. And now, I think in our first episode, Seb, you showed us a similar device to this one. It's interesting: we are in kind of a plastic, rather than cardboard, era for virtual reality. And there are a lot of people with iPhones already, so I don't know what the use case will be for this, but it's very interesting to see that there are cheaper options available. And the controllers seem to work with computer vision, I'm not sure. Yes, that's right. So I think that's right.
It's a mix between the sensors inside and computer vision. Okay. It's a clever idea; I'm actually interested in testing it, to see if it works well. I think the main issue right now is that they use the wide camera, and so they are not able to use ARKit on top of that. So right now they are stuck with controllers and a marker that you put on your table. And they say in the developer area of their website that they are working on getting ARKit working with the device, so you can have hand tracking and use the physical space in front of you for interaction. Yeah, I was doing some research on the site at the same time, because this kind of device reminds me of the Nolo VR, which let you add tracking to your cardboard or plastic headsets at some point. I know this is not the first and only initiative to make this kind of cheap device. And my concern is that these kinds of devices never really found their market, because at some point, I guess, people are seeking the real experience. I was just checking, and Nolo is now making complete VR and AR headsets. So we can see that the approach of just adding tracking or some VR or AR features to your cell phone is apparently not enough. The same goes for all the other competitors and companies that made this kind of device during the cardboard/Kickstarter era of VR, when a lot of companies were making cheap products to make VR and AR easier for people to discover. I guess we are past the period when people were trying to find any kind of device that would let them experience VR and AR. Now they know what it is, they know what it can do, and their expectations are higher than just a monoscopic experience that is not as comfortable as it could be with a Quest or any kind of AR or mixed reality headset. So maybe they are five years too late on the VR and AR timeline.
But maybe they can hop on the Apple Vision Pro train, and maybe some people will be curious to try this. Still, once again, I guess the experience is not as good as it should be, and it could put some people off — making them say, wow, is that all, or that AR is not as advanced as they would like us to believe. So it could be worse than just having people wait for the real devices, if I can put it that way. One thing that is interesting is that they found a way to keep the device in front of you, so you can see on the sides and keep a peripheral view of your environment, which for me is the best way to do this kind of headset right now. And did you order one to try? A friend in France bought it, tried it, and gave me his feedback. And for you — do you think this could be an option for trade shows or large venues, for you to build some applications with? I mean, to test it, see the result, and see what the use cases could be, because renting an iPhone is quite easy, in France at least, or in the US. And if the device works, then maybe there are use cases where the quality is enough and the price is right for the client. So I think we should buy one and see how it looks. Compared to the HoloKit — that was the one I talked about in the first podcast — this one works in real time; you don't have delay. So you can work together with someone else on something, and that's what could be interesting, because buying two standalone headsets would be more expensive than renting two iPhones and buying this device. One thing that is interesting, and maybe a new trend: we can see that there is nothing blocking the user's view on the sides. And as you said, with the Lynx, for example, maybe they found out that opening up the field of view, especially when you are doing AR or mixed reality, brings a better experience, even if the periphery is not covered by the augmentation.
It's funny to see that now we don't have this plastic part on the sides, and we are opening up the field of view for the user. So it's interesting to see this new approach as well. That's what Stan and Laurent said in this video: they iterated and found that it was more comfortable for the user to have his peripheral view always active, because right now we don't have a way to match the full field of view of the natural eyes. If you have a 1:1 ratio, it should be more comfortable, but if you have some kind of distortion, it would be awful. Fabien, do you have anything more to say about this? Maybe just a question: do we think this kind of device could actually hurt the perception of how immersive mixed reality can be, by offering an easy, cheap device as a first experience? On the other hand, at least it's there and gives some people the opportunity to try VR. So it's a balance between the two. OK, so I'll go with my subject now. Just let me... So I'll be talking about the latest news on the Apple Vision Pro. During my research, I found this website — I'll put the link in the description. It's someone who does a lot of breakdowns and teardowns of AR headsets, and right now they are trying to figure out what the Apple Vision Pro could be, and what kind of issues it could run into, ahead of the release of the headset itself. There are three different points he talks about that I would like to share with you. The first one is about the placement of the main cameras. They reflected on this and found that by having the main cameras placed below the user's real eyes, Apple could have some difficulties with the focal adjustment and convergence of the video rendering. And as he mentioned, everybody was stoked about the Apple Vision Pro demos — but as he also mentioned, you can barely spot any kind of defect or issue with a new VR or AR headset in a 30-minute demonstration.
So they are saying this placement could be an issue, especially when you are trying to grab a real object through the headset. They are really waiting to see what Apple can do about it, because Apple will have to apply some offsets and corrections to get the correct angle of vision. That's the first point; there is a lot of explanation about this. Next, they tried to calculate the field of view of the Apple Vision Pro, and given the device and the kind of lenses it uses, they think the field of view should be 120 degrees for the Apple Vision Pro versus 52 degrees for the HoloLens 2. However, the HoloLens 2's global field of view is 220 degrees, because it is a see-through device: you keep the peripheral vision that you don't have with the Apple Vision Pro, where the view is blocked by the headset itself. So on the AR scale, it's much, much better than the HoloLens 2; however, it's just a normal field of view for VR headsets — a Meta Quest Pro or Quest 3, for example. So not that impressive on this front; it's way less than the Pimax, for example. We'll see what can be done with that. The last point is about the weight distribution. They analyzed photos of people who tried the headset during the 30-minute demonstrations and found concerning marks from something pressing on the users' faces. By some calculations, they determined that the headset weighs around 450 grams — so barely half a kilo on the nose, because you don't have any kind of support or counterweight on the other side of the head. And apparently, in some footage, we can see a strap over the top of the users' heads. But I don't really understand the use of that strap, because if you want to move the weight from the front to the back of the head, you should have a strap at the back, like we have had on all kinds of VR headsets for quite some years now.
So I don't know why they put the strap there, as the weight obviously rests on the nose and the forehead of the wearer. And just as a reminder, the HoloLens and the Meta Quest Pro have this kind of balanced design, which makes them comfortable — as you mentioned, Seb, I guess one or two podcasts ago. Even though they are heavier, at 566 and 722 grams, their design seems better, and the thinking behind it seems better than with the Apple Vision Pro. So it's a very interesting, very nice website with a lot of information. Once again, it's deduction and guesswork, so we have to take it with some caution. But it's interesting to see that people are already reflecting on this headset, and that beyond all the hype, there are a lot of technical concerns behind the choices companies make when designing their headsets. And just to come back to the placement of the cameras: when you look at all the mixed reality headsets, the cameras are indeed placed where the user's eyes are. On this point, we can see that Apple preferred to develop their outside screen rather than place the cameras optimally. It's much the same with the design of the headset itself: we mentioned that we thought the Apple Vision Pro should have been in carbon fiber for better weight optimization, and they preferred to stick with metal and glass. So it's really interesting to see that Apple puts design first and the technical side second right now. We'll see if they change their mind at some point. So what do you think about this, guys? Fabien? Okay, yeah, thanks. So yeah, indeed, as you said, it's really interesting to see the choices they made: the outside screen instead of centering the cameras on the eyes; the weight balance, instead of having the battery at the back, for example, or a proper strap. It raises a lot of interesting questions about why they chose that.
Why did they make these questionable choices? Do they think the outside screen is such a critical feature that they are willing to sacrifice some depth perception? And is the object itself — the quality of the materials — so important that they are willing to sacrifice our necks? Those are really interesting questions, and I'm very much looking forward to testing it. And I don't know — do you know if Apple has said that the device that was demoed is actually the one that will go into production? No, there is no information about this, and that is why people are now thinking there may be some evolution of the headset at some point. Because, as we said, they don't have a real production line for this headset yet; it's still more or less handmade — maybe that's not the case, but it could be. And if you remember, a few weeks ago we had some questions about whether they would really release the headset in June as they intended; some people were talking about September. So maybe they are just buying some time to refine the product that will be released next year — because it's in a year that developers, professionals, and wealthy people will get their hands on the headset. I guess at this point they can still make some improvements to it. And if I remember correctly, on the first iterations of the Oculus/Meta Quest, there were some strap, weight, and other final adjustments to the headset before release. So I think this was a crash test, or test drive, for them to gauge the reaction of the public. During those big demonstrations they ran with the headset, they were writing down what people thought about it so they could make adjustments.
And apparently, the guy writing this blog says that sometimes Apple and Microsoft technicians and developers read what they are saying, and there is some kind of discussion between them. So maybe, at some point, they are listening to what more technical people are saying and can make some adjustments. Regarding the placement of the cameras, I don't know how they can improve that, because it would mean sacrificing their front display with the eyes — and as that is their main selling point, I don't know how they could drop it. The weight can be improved, of course, and a rear strap could be offered; they can do that without huge compromises, just by saying: if you find the headset too heavy, you can use the strap, but it's not mandatory. So yeah, they can do something about that, but on the camera placement, I can't see how — they can't change the whole design at this point, only make minor improvements. Same for the peripheral vision: I don't see how they could remove the mask. Oh, I just forgot something about the lenses. You can see there that Apple bought Limbak, the company making lenses — notably the Lynx lenses, which are called freeform multichannel lenses and have this very particular shape. Apparently, Apple would be using what they call super pancake lenses. However, he says Apple doesn't want to use the word pancake, so as not to sound like the other VR headsets, and presents it as a new technology; but they are talking about catadioptric lenses, which are basically a new way of making pancake lenses. They are just introducing new terms, but it's basically the same technology. There is no real breakthrough; it's just an evolution of the pancake lens, in line with the company they bought.
Just a question about this: if Apple bought the company that makes the Lynx lenses, I guess Lynx should be concerned, because at some point Apple is a competitor for them. I don't know if they will still be getting their lenses from them. It's really for the next generation of the headset, right? Not for this one? Yeah, for this one. At some point in the video, they showed the lenses, and it was a stack of three different lenses. Looking at that, they worked out that it was a new way of doing a pancake lens, which they are calling the super pancake lens. Apparently, there is no real improvement in the field of view; it's more about the quality of the image through these lenses. Seb, do you have a comment? I understand better now why Stan said he was happy with the announcement and everything he saw about the headset: he feels they made better decisions on the Lynx from the start — in terms of optics, placement of the cameras, weight balance, keeping the user's peripheral vision, and the tracking, which they got working really well with the sensors embedded in the headset. It's definitely interesting to see that Apple has moved in the same direction. If you compare the Vision Pro and the Lynx R1, I guess at this point the Lynx headset has more advantages than the Vision Pro across the spectrum. The only concern is computing power: we know the Apple headset has the M2 chipset plus the R1 coprocessor, while the Lynx R1 just has a Snapdragon, I guess. If you compare the resolution and some very specific aspects, they are of course behind the Apple one, but on the overall experience, price, and so on, I guess the Lynx really has a chance in the market. Once again, as I said last week, my main concern for them is whether they will be able to deliver the headsets to consumers.
They will have a large number of headsets to deliver in a very short period of time. It's a good problem, but it's still a problem, and we know that. I just hope they can get past it. Once again, maybe the best thing that could happen to them is to be bought by Apple at some point. It really depends on how Apple sees this company — whether as serious competition — and it also depends on the price of the company. They have a big competitor in front of them in the shape of Apple, but it's very, very motivating for them, I guess, to see that they have a chance in the market; they could blow up at some point and become a huge player in the VR world, like Varjo did — even if the Varjo timeline is longer than the Lynx one, because it took them about 10 years to get there. It's very interesting to see a small European company making these VR headsets. About the interactions of the Apple Vision Pro — the eye gaze and pinch interaction — I don't know if you saw that news, but there are now a couple of developers working on their apps to add that kind of interaction on the Quest Pro, for example, using the eye tracking and the finger tracking to do the same kind of interaction. It seems to work great, and it was implemented quite rapidly: we were talking about it two weeks ago, and now it's already there and implemented. About the eye tracking, it's very weird that they took it out of the Quest 3, because apparently a lot of new interactions are based on eye tracking. I guess it's a mistake by Meta to discard this technology in their Quest 3, because they are missing the train here. Especially if developers and applications are adapting to this kind of interaction, they are missing the train of this new market and these new kinds of interactions.
In this article and blog, the guy says that, on the specification comparison, the Quest Pro is just out of the picture right now, because it's not in the same league as the Lynx or the Apple Vision Pro. The Quest Pro is already obsolete at this point, so their new iteration is the Quest 3 — and the Quest 3 doesn't have eye tracking, so it's very, very weird. I guess they made a technical mistake on this. Fabien, anything more? Yeah, I was doing some research — two things. One is, I wonder if the choice of material is related to heat. Having two chips in the front of the device, maybe that causes some heat. I don't have any knowledge of materials and things like that, so I don't know if carbon fiber maybe has problems with heat. It's a solution — this is why the HoloLens 2 embeds some carbon fiber as well, because it's a good way of not feeling the heat; it dissipates it well. This is widely used in racing too: cars and motorcycles integrate carbon fiber and Kevlar parts into their structure for the weight and for heat distribution as well. So it was a real solution at some point; carbon fiber should have been the choice to make something lighter. Also, I saw another video retracing the journey of the HoloLens 2, and they chose carbon fiber for shock absorption as well, for when the headset falls on the ground — it deforms. Oh, sorry, are you still there? Yeah? Yes. Oh yeah, it's an issue. Well, the weight and the shock absorption are better with carbon fiber as well, so there's no good reason to choose metal and glass at this point. Well, I think they wanted high-quality, strong metal for the user, and to differentiate themselves from all the other brands that do plastic only. Yeah, it's really design first and the technological part in second place. Yeah, and one last thing: it's amazing how the term spatial computing has blown up.
It's now everywhere — I think every publication on blogs or social networks, instead of talking about the metaverse, is now using spatial computing instead. It's amazing to see the marketing power of Apple. It's the same with mixed reality: it's the new word used for all the video see-through headsets and so on. Nobody talks about augmented reality anymore. A new word to add to our vocabulary. And it's the same with XR — I guess XR has completely disappeared as well; it was not used by the bigger players, and then it just kind of disappeared. And we can wonder about the future of the WebXR and OpenXR initiatives: will they change their names once again, after OpenVR and WebVR? We'll see what people do. But I guess mixed reality was Microsoft's term — I don't know if they patented the name, but at some point it was the Microsoft initiative: their VR and AR headsets were all grouped under the mixed reality term. So I don't know why they are bringing it up again. Okay, so we are past our time, but we'll continue this kind of very interesting discussion next week. So see you next Tuesday — at 6 a.m. for me, and you can do the math for your own time zones. See you next time, and thanks again for watching.

Lost In Immersion


Credits

Podcast hosted by Guillaume Brincin, Fabien Le Guillarm, and Sébastien Spas.
Lost In Immersion © {{ year }}