Welcome to episode 46 of Lost in Immersion, your weekly 45-minute stream about innovation. As VR and AR veterans, we will discuss the latest news of the immersive industry. I really wonder what we could talk about this week. So, Fabien, surprise us.

Hello. Will we talk about the Meta Quest? No, no. The Apple Vision Pro has been out for a couple of days now, and it is actually in the hands of everyone who could buy it, who could afford it. So I went through a lot of reviews and tech reviews to see what the consensus, if there is one, is around the Vision Pro. All the major tech podcasts and websites have reviewed it. Let's start. Switch pages. We have the good. And I've picked up some images for fun of people using the Vision Pro in real situations, outside and even skating. Anyway, the good first is the overall quality: the quality of the build itself, the quality of the materials that are used, and the screens, which are very, very high resolution. And something that I didn't know before looking at the reviews is that there is 4K rendering in the Apple Vision Pro, thanks to the eye tracking, which seems to be very high quality. At least when you are in visionOS, you always have this 4K rendering. And you can actually see it if you look at videos that are recorded in the Vision Pro: the center area where the user is looking is very sharp, and the outside is pretty blurry. So that's one way they can achieve this kind of very high quality. The pass-through seems to be the best on the market from what I could read: very high quality cameras, and very high quality screens inside, as I said, making for a very nice pass-through. And I didn't see any comments about distortions. We'll see for ourselves when we can hopefully test it, but it seems like the distortions are pretty minimal. Spatial tracking is very good.
I don't know if there is really a difference with the Quest 3, which also has very nice spatial tracking, but it seems like you can put windows in a room, go to another room, put another window on the wall, go back into the previous room, and you'll see your windows still in place at the same position. And you can even see people walking, as you can see in the picture, or skateboarding, and the windows really stay in place. Except I saw a test of someone doing it in a subway, and the windows were moving around. So tracking might have issues in this kind of situation. But I saw somewhere, though I wasn't able to find someone who actually tested it, that there seems to be a travel mode that deactivates some of the sensors for this kind of usage. And then the ecosystem: we touched on it a bit last week, but if you are already an iPhone user, then, similar to when you are changing iPhones, you can just put your phone close to the Vision Pro and the synchronization happens. You can copy something on your Mac and paste it on your Vision Pro. So yeah, the Apple ecosystem really makes sense here. So that was the good. And then the, I didn't say the bad, I said the cons.

Can we say something about the good? Yeah, sure. Seb, if you want to, or I can start. Go ahead, Guillaume.

So yeah, about the latency and so on, so that people can play ping pong or skateboard or whatever: apparently 12 milliseconds, which is surprisingly fast. Apparently the dedicated chip of the Apple Vision Pro is doing its work. So very impressive on that side. And the other thing I would like to mention, which nobody seems to be surprised about, is that it works in broad daylight, which is a first, because no other headset can do so. I guess the Meta Quest 3 can, but not in broad daylight on a very sunny day. So very impressive on that side. And I'm very, very surprised that no one is pointing that out.
It means that they are not basing their technology only on infrared, like the HoloLens, for example, which has a lot of issues when you take it outside, especially for tracking and especially when you are looking toward the sun. I saw a video of someone checking the color correction in broad daylight while looking toward the sun. Of course, the camera adaptation takes more time and the color is not as great as it is in a controlled environment. But yeah, for me, it's a very, very surprising point. And of course, there is still the distortion question. We talked about it for weeks and months. And apparently, the field of view is not that big. Some reviewers compare it to the view you would have in a ski mask, goggles for skiing. People are not pointing that out because it feels like what you see in a ski mask, so they are used to it. But the field of view seems to be smaller than in a Quest 3 or other competitors. So I guess that's a trick; I don't know if they corrected the distortion problem that way. Hopefully, we will get a device to make our own tests about it. But yeah, that's pretty much it for me on the good side.

Yeah, I agree. The 12 milliseconds is quite impressive, given the camera exposure time and the time to gather the picture, update it, and re-display it on the screens. Yeah, it's amazing. And the great idea they had was to have a specific processor for that, compared to the other headsets. When you do a mixed reality experience on the Quest 3 and it starts to run low on performance, everything is impacted, even the color camera pass-through. So you get lag on your video, and it starts to be annoying for the user. You could get motion sickness quite rapidly, basically. Here, with this system, you always get 12 milliseconds. And that's what you need in augmented reality. So it's just perfect to have a constant frame rate around that. Plus, the hand tracking seems to work great.
The way they do the occlusion of the hands seems to work great, too, even though sometimes you can see some blur and some aliasing around the hands. Maybe that's something you wanted to talk about in the cons. But yeah, that still seems impressive to me. And if the color of the hands is right, then it's all good.

Yeah, actually, I don't think I put it in the cons. No, because I didn't see that many people complaining about it. Indeed, in some videos, you can see some holes around the hands when people are moving their hands quite fast. But overall, it seems to work quite well. So, moving on to the cons. The weight and the pressure on the face is something that everybody is talking about, especially if you are using only the standard strap. We talked about it a couple of weeks ago: it seems like the other strap that goes above the head is kind of mandatory if you want to use it for more than 30 or 45 minutes. So yeah, that seems to be quite a big one that everybody is talking about. And the next one I saw almost a consensus on is the outside screen, the EyeSight, reproducing the eyes on the outside. It seems like it doesn't work, basically. It doesn't give the impression that the wearer is actually looking at you in the eyes. The screen seems to be quite reflective as well, and not that bright, so the effect is kind of lost here. Then battery life, and the fact that you cannot hot-swap the battery, is also kind of an issue. You have to basically turn off the device, switch batteries and turn it on. I saw people having issues with the pass-through in low light, in the dark at night, but that kind of makes sense with the limitations of the technology, actually. It cannot be perfect. I saw quite a lot of people being excited by the OS itself, visionOS, and the native apps that you have: the pictures, the videos, FaceTime, et cetera.
But not many people really enjoying new applications from developers, except the standard movie apps. So it seems like, as of today, a couple of days after release, we don't have that killer app yet on the Vision Pro. We talked about the field of view, which seems to be smaller than the Quest 3's, but still acceptable, as you were mentioning, Guillaume: similar to what you can see in a ski mask, so not that small. And then the avatar that you can see in FaceTime, the one that you build when you are recording your face with the headset, seems to be kind of uncanny at first. It seems like it takes people a bit of time to get used to these avatars. That's something I saw a lot of people complaining about. And finally, the price. We all knew from the start that it's an expensive device. So yeah, what do you think, Seb?

Yeah, I agree on all the points that you went through. For the price, of course, it's expensive, but you cannot compare it to the Quest 3. I think it's really higher quality in terms of pass-through. So it's hard to compare when it sits in between the Varjo headsets and the Quest 3, and the Varjo is way more expensive. The applications, yeah, I agree with you. There don't seem to be a lot of augmented reality applications. It's mostly virtual stuff that you place in your environment, but you do not interact a lot with it. It's mostly a video playing or windows that you lay out in your space. One thing I think is good is that they now showcase that the cable can be removed from the battery. So if you have an issue with the cable, at least this part can be replaced, or you can change the battery easily. But like you said, it's an issue that you cannot run the headset at all without the battery, even just while swapping one. Like you said, a hot swap is not feasible. And that's quite annoying when you have a minute of loading the headset back up after you stop it. I also saw that there is no button to cut the power.
It's the battery that you remove that stops the headset; otherwise, it's just in standby mode. So it's an interesting way of doing things. I think it's the first device I've seen working this way, without a power button. But I mean, that makes sense in their design: the way they thought of it from the beginning, it's with the battery or not usable at all. But that's something that should improve in the next version, I hope.

Yeah, I would be really curious to see. Hopefully, in the near future, we can test the Varjo to see where the quality of each headset sits between all of them. That could be interesting. Yeah, I agree: in terms of pass-through quality, we cannot compare with the Quest 3. So what I wonder is, is the quality good enough? I mean, so good that we kind of forget about the smaller field of view? Maybe that's a really big difference. Yeah, because it was shocking when you put a headset on someone new, the Quest 3 or the HTC Vive XR Elite: the first thing they say is, oh, that's not the reality I'm seeing, it's a video with lower quality. And here, with the Vision Pro, everyone is saying that it's almost real-life coloring and real-life quality. You can still see that it's a screen and cameras, but it's way better than what you can get with the Quest 3. Yeah, okay.

And one note about the battery as well. I forgot to put it; I think maybe it belongs in the good. If the battery is plugged in and you are seated, you can work as long as you want. And they did it well: they placed the cable that recharges the battery on the same side as the cable that goes to the headset, so that if you put it in your pocket, both cables go in the same direction. They don't go one from one side and one from the other, which would make it harder to put in a pocket. Okay. One thing I saw on the cons is the keyboard, the way the keyboard works inside the headset.
You either use your hands and type in the air, but you have to really use the tip of your finger, and you cannot type naturally like this; you type like on an old typewriter. Which is okay, but it takes longer to type. Or you have to look at each letter and then pinch. At least that way you get a kind of natural force feedback with your finger when you do that and look at each letter. It seems to be pretty fast after a while, once you get used to it, but it still takes more time than typing on your keyboard. Like you said, the ecosystem is important here, so having the ability to use your Mac keyboard or your Apple keyboard is a plus.

Yeah, that reminds me, I have one note and one question for you. The note is: yeah, there is almost no AR. It's all spatial computing. And the only AR that I saw in the review videos is when someone is looking at a MacBook: they see the Connect button on top of it. So it seems to be tracked to the MacBook. I don't know how they do it, if there is a setup to do beforehand or whatever, or if the user has to place the MacBook first. I'm not sure. I don't know. I think they know their own devices exactly, so tracking them is not an issue for them. Yeah, but, you know, with the difference of... I don't know. And also with the keyboard, if you use a Bluetooth keyboard. And I forgot the question. Okay. But yeah, someone was also saying that it takes a bit of time to get the fact that you always have to be looking at a button to click it. You cannot just look at it, then look at something else and click, thinking that the previous button you were looking at will trigger. That takes a while to get used to. I thought that was interesting: you need to learn new behaviors to use the headset. Okay. Just about the interaction, I think we are in the minority... How do you say that?
Well, you know, back in the day, everyone wanted to do the Minority Report interface. And well, it doesn't work, because we are not meant to hold our hands in the air and do stuff: it triggers muscle fatigue very fast. So we can't do that. And I guess we are running into the same kind of problem, meaning that on paper, it could be a great idea not to have any controller and to just use your eyes and your hands. Somebody is touching a mug. Can you stop that, please? But yeah. What I'm saying is that, of course, it seems to be a good thing: complete liberty, freedom and whatever. But once you are doing it for a long period of time, I can understand why some reviewers were frustrated. Someone just said, please give me a controller, because I'm fed up with using my eyes, and it's just tiring. So, yeah. We understand that our heads and our eyes do not work that way. At some point, yes, but not all the time. So I'm maybe predicting that at some point, they will do exactly the same as Microsoft with the clicker, you know, this small remote that was just meant to click, instead of doing the pinch and all these gestures that are very tiring in the long term. So, we'll see. But yeah, I predict that they will do this at some point. So, were you on the battery items? Or did you already go through all the items, Fabien? Yeah, all of them, yeah. Okay.

So, just a quick word on the avatars, or the Personas, as they are calling them. I don't know if you talked about it, because I had an emergency to address. But everybody was freaked out about the Personas. Oh yeah, it's ugly, it's awful, and whatever. I didn't think it was that bad, given how quickly they are scanned. If you watch more and more reviews, in the end, people are not that annoyed. I guess it was just trash talk about something. But of course, it's not perfect. People were complaining that their hair was not moving and so on. Of course, it won't.
Because you are just doing a 12 or 15 second scan of your face. So, of course, it's all messed up. But given that fact, I think the rendering of your emotions, and especially the mouth movements and expressions, given the headset and all of that, is pretty impressive. And I guess they will do a better scan in the future, maybe by scanning your face for a longer period of time, because 15 seconds is way too fast to get something with lots of detail. So, yeah. No, I'm not that disappointed by the Personas themselves. Where I'm very disappointed is the available applications. We know that developers have had access to the Apple Vision Pro since the announcement, so it's been one year-ish, and nothing is impressive at all. It's just very, very simple AR apps or games or concepts. Nobody found something interesting to do. So maybe, as we have seen with some Apple devices, developers don't have access to a lot of features of the headset itself, like the environment mapping or the 3D scans of the room, for example. We know that at the beginning Apple sometimes doesn't open these up, because they want to keep the exclusivity of some technical features of their headsets, or because there are some confidentiality or security issues. So, I don't know. But I'm very, very disappointed that we only have watching TV and spatial computing, meaning we can work, and not many applications doing augmented reality. So, yeah.

Yeah, I completely agree on that. About the uncanny avatars, I think the issue is that they are close to reality but still miss a bit to be nice to look at. I wonder if an option to switch to a more 2D avatar, but still with your real emotions, would have been a better solution for now, until we pass from the uncanny to the okay, it's almost real. Because I think that's really the issue here: it's close to reality but not perfect, and strange to look at.
But if you looked at the 2D one showing the same emotions, I think that would be acceptable, at least for now, for some people to have the choice between the two. And one thing we did not mention is the voice recognition, which also seems to be quite impressive. So that's another way to interact. When you search, for example, in your browser, you can directly say the web page that you want the Vision Pro to open, and that really seems to work great. Yeah. And I didn't mention the sound either: it seems like the spatial audio speakers work very well too.

And just as a personal guess and conclusion on this, if I can switch pages. Yes. So, looking at the reviews, and again, it's my guess, so let me know if you agree with it, I think I understand better the compromises that Apple has made on the quality. It's like they wanted to start this new venture into a headset with something that has the highest possible quality with the current technology. And this means having the external battery, having a very heavy headset and a very expensive headset. But that is what they wanted to have, so that when people put it on, it's like, wow, it's as good as it can be with the current state of technology. So, yeah, that's my guess. I don't know if you agree or not. And I put some nice pictures that I found on Twitter.

I think, like you said, the narrow field of view makes sure the quality of the video that you see inside the headset is not annoying, or at least not the first thing you complain about, like in the other headsets that are currently on the market. So, yeah, that meant a higher price, because of the higher quality of the hardware, cameras and screens, and a chipset dedicated to this part of the experience, which is very important for user comfort.

Yeah, I guess that sums it up, meaning that they did what they could with the technology that is available right now.
And even more, because they invented the dedicated chip, which is a very big step forward with the 12 millisecond latency. They provided the best AR experience you can have right now. And yeah, we can hope that it will be better in the future. But I guess they already know their issues, because when they announced the 2025 headset for consumers, they already said that the front display will disappear. So they absolutely know that it's not working, or that it doesn't give that much compared to the technical issues it's probably causing them. So, yeah, it was just a marketing effect, I guess, for the first version. We know that when they release the first iteration of a device, they always want to have this wow effect or whatever. But I guess the second version will be way more efficient for usage. And of course, the price will be lower, and hopefully the technology will be better as well in one or two years. So we'll see. But yeah, I guess they delivered what they wanted. People seem to be impressed nonetheless, so I guess it's a win for them as well.

And all those people wearing it in public spaces, I don't know if it's just for the buzz or if they are really using it as is. If it is for them, we'll see in the upcoming weeks or months if people are still walking around with these headsets on. But it's interesting to see that it's now imprinted in the mainstream mind, because people are seeing others wearing these headsets. And I'm glad that they are not afraid of doing so, even if it's just for the buzz. It's really interesting that people will see that in the street, and they now know that it's coming: they will have something on their face at some point. Yeah, I totally agree. And I think that's it for that part. Yeah, I just had one thing that I think is true: there is no killer app, but there is a killer ecosystem around this headset compared to the other ones.
So, I think that's one difference that we can note compared to the other headsets on the market. It's the bare minimum for Apple, I guess. I mean, it's the community that they have. Yeah, it's a nice move from them.

On my side, I wanted to talk about an app that is coming to the Vision Pro, which is Polycam. This app has existed for a long time now. It allows you to take pictures of real objects and generate 3D models from them. We have talked about this app for a long time, but now they are bringing it to the Vision Pro. So you will be able to check out your models, or ones that have been made publicly available on the platform, and then drag them into your world and place them in your space. So, yeah, we talked about that, but one thing that is missing on the market right now, for mixed reality experiences, is a lot of assets to work with. And this opens up a nice interactive experience, a nice opportunity for users to create their own models and experiment with them. So, yeah, what do you think about that, guys?

Yeah, I mean, it's really cool. It's too bad that they don't have the ability to capture from the Vision Pro as well, but I guess it will come at some point with all the cameras. I mean, they are capturing spatial videos, so it should be possible as well. Yeah. Cool.

Yeah, I always wonder what the real use case is here, just seeing your 3D models, or the library of 3D models, and placing them in your space. We've already seen that with apps like Augment, for example, which provide this kind of database of 3D models. Here you can see you can do some interior design, but on the practical side, it's always minimal. So, I guess Fabien's point is a good one: you should be able to 3D scan, then import it directly into the database and check that everything works.
But there, I guess you don't have any export, meaning that if you are doing interior design and laying out your room on your table, I guess you can't export it as a new model. It's just playing with the 3D assets, and it's very, very limiting that way, and I hope they will find a better use case for this. And once again, I'm very, very surprised that Luma AI (but maybe it's coming), a company that is very, very advanced on the 3D scanning side with Gaussian splatting and so on, didn't announce that they will be compatible with the Apple Vision Pro. So, maybe it's not finished yet, or, as I said, maybe they don't have access to all the cameras and sensors of the Apple Vision Pro, so that they can do what they are doing right now with a simple smartphone or tablet.

Yeah, I agree with you that it's missing scenarios, things you can do to interact with the environment that you created and save as an app or something that you can launch, or even create augments, so that when you go to a specific place in your environment, you can see again the 3D media that you have placed there, and also save that to share it with someone else. But I guess that's coming; this is the first step toward it. Yeah, I think it would be nice: you go on a trip or on holiday, you capture a few things with your phone, and when you're back home, you can showcase what you captured to your friends on the Vision Pro. Could be nice. Okay. Anything more, Seb? No, I think that's it. You can start your own subject.

Oh, it will be very quick, because we don't have much time. But yeah, just to wrap up about Meta: they earned money for the first time with their VR and immersive branch. They made about 1 billion dollars last quarter, so very impressive. And the other news is that Horizon, you know, the metaverse made by Meta that we buried months ago, is apparently now in the top 10 most used applications on the Quest.
So people are saying that it is rising from the dead, and maybe it will become something in the upcoming months. But we know that they've also made a partnership with Roblox, so I don't know if they pulled the trigger too soon by making this kind of partnership; now they will have two applications that are competitors to each other. Or maybe they will share the community. But yeah, I don't really know what it's going to be, but it's very good news that Meta's metaverse is not dead yet. Apparently, it's becoming something. So I guess they are still working on it, making improvements, and people are responding to that.

Yeah, it's been a while since I've looked at it, but last time I checked, Horizon was not available in Japan, so I wasn't able to test it. But I tried Roblox, and it was a pretty bad experience. It seemed to me like they just took the mobile and desktop app and put it in VR, so the user experience is really not what we would expect from a VR experience, totally different from what you can get in VRChat, which is intuitive. So I don't know. I need to test it again. It's been a couple of weeks, so I don't know if they've made updates. Yeah, that's maybe a hint as to why Horizon is rising up. Right. Yeah, nice to see that one metaverse is coming back from the dead, like you said.

Okay, so I guess that's it for today. We'll see you guys next week, and hopefully we'll talk about the Apple Vision Pro killer app that will show up from somewhere. Otherwise, we'll find some other subjects from the immersive world. So see you guys, have a nice week, and goodbye. Thanks. Bye. Bye.


Credits

Podcast hosted by Guillaume Brincin, Fabien Le Guillarm, and Sébastien Spas.