Welcome to episode 31 of Lost in Immersion, your weekly 45-minute stream about innovation. As VR and AR veterans, we discuss the latest news from the immersive industry. Today is a special episode dedicated to Meta Connect, so that will be our only topic. To start, let's go to Fabien — what do you have to say about Meta Connect?

Yeah, thanks. Hello. There's a lot to cover: announcements about the new Quest device, the Ray-Ban smart glasses, and news about Meta AI as well. Maybe we can start with the Quest 3. It's a mixed reality device, with a $500 price tag for the 128-gigabyte model, and the feature at the core of the presentation is mixed reality. I was actually fortunate enough to visit a retail store over the weekend where they had a Quest 3 tethered to a display, so I could put it on my head and try it hands-on. Obviously, the first thing I wanted to test was the mixed reality pass-through: is the video pass-through as good as they claim it to be? I should preface this by saying it was a retail store with a lot of neon lighting, so maybe not the best environment to test this kind of pass-through video. First impression: I think it's actually very good. The video is clear, and the very large field of view really helps immersion in that video. The colors were a bit off, but not by much. As for scale — we've mentioned over past episodes that one of the main issues with mixed reality pass-through devices is distortion — I could see distortions around moving objects, especially around my hands when I moved them in front of the cameras. Other than that, I think it's a really big improvement compared to the Quest 2.
Especially the field of view and, again, the mixed reality. Very quickly, that's my feedback on the Quest 3, and I'm really curious to hear what you think of the release.

Yeah, I'm glad you had the chance to test it and give feedback on the video quality — it's good to know. I just released an experience for Europa-Park with Vive XR Elites, whose video quality is a bit below what you'd want, so it's great that Meta is bringing that to the table and raising the bar. The screens seem quite impressive when you compare the Quest 2 and Quest 3 panel sizes — a huge improvement. They kept LCD screens, not OLED, so the darks aren't that dark, but from the user tests I've seen, people don't complain much about it. They also clearly did a lot of work on a developer toolkit for building this kind of experience, which they showcased during the conference: you can scan your room, see the mesh being displayed, and occlusion is handled with real objects in the scene. All of that comes in a toolkit they will provide, which will help developers build nicer experiences more quickly. It's good to see they went that far. What about you, Guillaume?

Yeah. First of all, a comment about the communication strategy around the Quest 3. Apparently it was a well-kept secret, yet ten minutes after the official announcement there were dozens of reviews popping up everywhere — meaning a lot of people had the Meta Quest 3 in hand before the announcement, and they still succeeded in keeping the secret. That's their strategy; I don't have much to say about it, but we know a lot of people had the device early.
Since I haven't had the chance to try it yet, my feedback is based on the reviews from those who had the Meta Quest in hand before the announcement, so I'm very cautious: these are part review, part advertisement, because those reviewers don't want to lose early access to future devices. So of course everything is good, of course the VR is great. Like Fabien, my focus was on the mixed reality portion of the reviews, and for most of them I found it very surprising how short that part was — "oh, mixed reality? Yeah, it works, it's great." So I dug a bit deeper and found two or three reviewers who were more precise. What they're saying is: the video is great, as you mentioned; the colors, not so good — you confirmed that as well. On distortion, however, some were less enthusiastic than you, Fabien: middle-to-far distances look great, but once you bring something close to your eyes, you see the distortion. With your phone, for instance, you can see the phone and notifications, but reading specific text becomes very difficult. Some reviewers said the phone bends as if it were flexible when it's close to you. You still have blurry areas around your hands, like on the Quest Pro, but much better than before. And one thing I found very surprising — I don't know if it's accurate — is that they said occlusion was not supported, meaning that if augmented reality objects are behind real physical objects, they're not hidden accordingly. Which is very surprising, because we know the iPhone can do that.
Especially with the latest — I call it the Tamagotchi app; you know what I mean, we saw the live demo with Fabien in the previous episode. And given that the headset has a 3D scan of the environment where you're placing the assets, I find it very surprising that there's no occlusion. So that's something to check — I won't take it at face value, because it seems weird; maybe there will be a software update at some point. The last thing — we knew this already, but a lot of people are saying it, including John Carmack, the former CTO of Oculus: it's a great piece of hardware, but the software side is still a bit behind. For mixed reality in particular, apparently only one or two applications ship with the headset right now, and they're not very convincing. So the main question is what developers and companies will do with the feature, because it's not obvious that mixed reality will be the game changer that grows their community. I don't know if you've heard the same or share that feeling.

I can comment very quickly and agree with the reviews you saw about distortion on very close objects. I bent down and tried to read the Quest 3 specifications printed on the display, and there were — not floating, but waves in the video when I got very close to the object.

It will be interesting to see whether the Lynx R1 behaves the same way. We know Seb has had one in his hands at some point, so I guess we'll do a comparison, a more precise review, in the next episode. Now that we have the two headsets in hand, we can do a very precise comparison between them — especially which one is best for mixed reality use.

Yes, and about the occlusion —
I saw in the video that they showed how to scan the room, and they explained it was for occlusion. So maybe it's just not implemented in the default environment that launches with the headset; maybe it needs to be developed and implemented in each game itself. That's something to look into.

My first thought was that maybe their 3D scan of the environment isn't precise enough yet, so they deactivated the feature. If your mesh is too low-poly, occlusion won't work that well, especially on complex physical objects. We'll see. Once the hardware capability is there — 3D scanning and so on — we know developers can improve on it very quickly. So once again, we'll see what people do with it, since Meta doesn't seem willing to do it itself. Second topic — back to Fabien; I guess you have something to share.

So, it's a feature I found very interesting, and I'm really curious whether users will perceive it as a gadget that's cool for a day and then forgotten, or whether it will really become something ubiquitous in mixed reality. Meta calls these Augments. Their thinking is that the same way you decorate your actual room with flowers, frames, or pictures, you can decorate your mixed reality room with virtual objects — Instagram Reels, pictures, and things like that. I guess developers will be able to create these widgets and offer them to users. I found the idea very interesting, especially because you can place objects and they stay there — provided the 3D scan works correctly. So I'm really curious to see how users receive this feature, and I'm also curious what you think about it. Seb?

Yeah, I like the idea and the concept.
I wonder how it operates alongside all the other apps, and how big your room-scale space can be. Is it possible to put different things in different rooms of your home — games in one room only, decorations in your living room, even decorations on your desk? You could also place your desktop applications and views and have your work environment and other environments pre-configured, so depending on where the headset is, you're already in the right configuration. That could be interesting. And I wonder how optimized it is: if you start placing a lot of things, how do they control the rendering cost of all that and make sure the headset doesn't run low on performance and fail to handle the interactions?

Yeah, I agree with that. And this is not a new feature — you already had it on the HoloLens 1. It will be very interesting to see, as Seb mentioned, one, how well the 3D scan of your room works, and two, whether it's as capable as the HoloLens was back in the day, when it recognized your environment via Wi-Fi and adjusted the 3D augmentations you had placed all around. I remember it being very efficient. So we'll see whether Meta matches that or takes a step back. One thing I didn't mention about Meta Quest 3 mixed reality: apparently the transition between the real world and the virtual one is very well done, which is an interesting touch. And the last thing I want to say is that, once again, these kinds of features assume you'll keep your headset on for quite a long time if you want to benefit from them.
And given the reviews, especially about distortion, I'm not sure we're there yet, but it's a good proof of concept, introducing ideas that will no doubt be used with the Apple Vision Pro at some point — this kind of AR-augmented daily life, if I can call it that. It completely matches the vision we've had for quite a few years now, especially those concept AR videos from 2016 or 2017 — completely fake, imaginary renders. I'm sure you remember: someone puts on glasses and their whole daily routine is augmented, in the kitchen, in transit, at work. We're getting there. The hardware isn't there yet, but it's very interesting to see these concepts finally introduced to the general public. Fabien, sorry — any last thoughts on this?

No — I think what you said is very accurate.

So Seb, this is your time. Sure. I wanted to talk about the AI announcements Meta made at Connect, where they showed that AI experiences will be available across the whole ecosystem — like the ones where you can talk to avatars with specific knowledge about a topic and get answers from them. These were built with real people, real actors, like Snoop Dogg, for example. Fabien looked at the account a couple of minutes ago, and it was already chatting as a Dungeon Master — I think it covers everything Dungeons & Dragons related. They also announced AI image editing: segmenting things, removing the background, editing your picture before you share it on Facebook or WhatsApp. And they announced AI assistants integrated into WhatsApp, Messenger, and Instagram, but also into the upcoming Ray-Ban Meta smart glasses and the Quest 3. So the ability to generate Augments with an AI assistant is coming soon as well.
So users will be able to generate their own pictures, their own short videos or GIFs — the kind of 3D GIFs we saw in the Augments video they presented. And developers can also use their SDK in real games, so users can interact with avatars that have their own knowledge of the game and can help you solve things. That's also quite big news — something we were expecting, but it's all coming at the same time, which means they're really focused on it now, and it's coming to the Quest 3 soon. For mixed reality experiences, I think that can help a lot. What are your thoughts, guys?

I'm really curious to test it and see whether it behaves as well as they say. Something people sometimes forget is that Meta is maybe one of the biggest players in AI. Of course, we hear about OpenAI and Microsoft, and Anthropic, which received a huge investment from Amazon just last week — but Meta already has AI everywhere, and they're pushing with their open-source, well, kind-of-open-source models too. So it's not a surprise; the surprise is more about the usage, the applications. I wasn't expecting these AI influencers they created. What do you think, Guillaume?

Yeah, I completely agree. Indeed, Facebook — Meta — was not the noisiest player in AI. We knew they were working on things fairly quietly, and of course they have access to huge databases for training their AI. However, I'm not sure about the usage here. We'll have to try it, because, like Seb, at some point I thought: okay, this is cool, but what can we do with it? It took us a bit by surprise; we weren't prepared for it. We'll have to digest the news and see what people do with it — whether it's welcomed or not. But for me, maybe at my age, I don't really see the point. Maybe I'm too old for this stuff.
I don't know — I'm just not sure about it. One very interesting point will be to test the intelligence, the smartness, of these AIs — whether they're as flexible and adaptable as ChatGPT. We know ChatGPT can do role-playing as well, though maybe it's not as good at these particular characters. And the association between AI and famous people is very interesting to watch. I think it can be a bit dangerous at some point, because we know AI can't be fully controlled. We saw with ChatGPT that it can lie, it can make mistakes, and it can even be coaxed into giving out sensitive information if you rephrase your questions. Putting those words in the mouth of an AI representation of a real person could get tricky, or messy. Only the future can tell us what will happen, but it's a very bold move by Meta to introduce these virtual copies of real human influencers and famous people.

And there's also the psychology side of things. Will some users become friends with, or attached to, these AIs? Addicted to them — feeling that they have a friend, that Snoop Dogg is their friend, talking to the avatar as if it were a real person and forgetting that it's not.

Sorry — I've seen that AI virtual girlfriends already exist. There are services that do exactly that.

Yeah, two things about this. I know a lot of influencers already complain about this effect: people who post daily videos, for example — some viewers watch them every day and it becomes part of their lives, and when they meet their companion, I don't know the right term, in real life, they share personal details: "Hey, you know, two weeks ago we did this," as if the influencer had been there. It's a very interesting effect on the human mind — this kind of daily, repetitive interaction, or non-interaction, which for some people at some point becomes interaction. This will only amplify those effects.
And yeah, maybe we'll see some very interesting facts about what the human brain can do. To complement what Fabien said about AI girlfriends: there are lots of Instagram accounts run by fake people — AI-generated videos, AI-generated pictures — and even on a paid platform, which I won't name, people are paying to see adult content made by AI. And when the journalists or influencers who created those fake accounts contacted the people paying for it and told them, "You know, this is fake, it's not a real person," people just didn't care. They said, "It's fine with me — if I can interact with it, I'm very happy to." It's very strange to see that once a bond is created, whether it's artificial doesn't matter anymore. Very, very interesting, and maybe a bit scary at some point, because people can attach very easily to non-physical, non-real people — and, as you mentioned, create a life companion that isn't real.

Yeah, really weird for me too. We're not at that age, but I fear for the younger generation who'll be keen to use this kind of tool — that they'll miss out on real exchanges with other people and focus only on asking questions, maybe even dumb questions, to these avatars and getting dumb answers back. So I really don't like this announcement. But all the rest — the ability to add interactions, add 3D models, generate pictures, all the tools to enhance the experience inside the headset — that's a good step forward that I'm looking forward to trying. And speaking of relations between people: they showcased something they're working on, not yet ready for public use, where they use specific cameras to generate this kind of 3D model.
But using the Quest Pro and its eye tracking, they showed that two people can stand face to face and see each other's really realistic 3D faces inside the headset — see the movements and talk to each other as if they were actually in front of each other. The progress in that direction is quite impressive compared to the avatars we've had so far; they really did an amazing job. I really wonder what kind of 3D camera they used to get point-cloud 3D models that are so well defined and seem so accurate. Guillaume, I think you told us before that they used a pre-scan, which seems obvious, because the cameras inside the Quest Pro aren't of that quality. So unless they upgraded the cameras — and this isn't a headset you can buy commercially — they must have pre-scanned the users' faces with a dedicated camera. Either way, the improvement in quality is quite amazing to me. What did you think, guys?

Wait — I didn't like the title of the video. "Interview in the metaverse"... okay. Jokes aside, even if it's not available yet, it's very impressive to see this level of quality. Especially since they were not in the same location — so accounting for the delay and everything, having this run in real time is super impressive.

Yeah. Just so you know, this kind of feature won't be available until 2025. They're targeting the Quest Pro 2 — which won't be its actual name; I can't remember whether it will be called Quest 3, or 5, or whatever. Well, it's very interesting to see the progress of 3D scanning. I think we can link this to Apple's volumetric video announcement as well: two big players are now set on bringing us this kind of interaction, meaning you can talk in real time to a 3D avatar or representation of a person, which is a big step forward.
Because that kind of data is massive, it brings new issues to the table, especially around streaming speed and efficiency for this kind of interaction. As for quality, it's very good — really close to what we see in movies when they de-age an older actor, as in the latest Indiana Jones film, for example. But I still think we're in that weird uncanny-valley zone: the face is very realistic, yet you can't pinpoint what's missing to make it fully convincing. There's a waxy quality to the face, or some micro-expressions that aren't there yet. It's very difficult to say what's lacking at this point, because the skin imperfections and the coloring are very well done. Something is just off when you look at it, and it can make you a bit uncomfortable watching the whole interview — a very strange feeling. But it's very interesting to see that all these players are betting on this technology becoming real in the near future.

Okay, so, last topic. I'll talk about the Ray-Ban glasses. We knew Meta and Ray-Ban have a partnership to create augmented — or rather assisted — reality glasses, to use the right term. There isn't much information about them yet. We know they're $300, so not very expensive. But the most interesting part of the announcement is that they integrate AI — I think it's a first for smart glasses to directly embed an AI assistant. Very interesting to see. However, the way they showed it working took me straight back to the announcement video for Google Glass, because those renderings are obviously completely fake: you're wearing sunglasses, so the image won't be that clear, and the messages are shown at the top, at the bottom, and in the middle of the view.
I'm quite sure you can't display that much information across such a large field of view, so I'm very curious to see the final result and whether the glasses can actually show this kind of display. It's reminiscent, once again, of the fake concept videos from a few years back — smart glasses plus AI making you very, very smart on a date, able to answer every question, raise interesting topics, and be the super boyfriend or girlfriend with all the answers. Still, they showcased some use cases, and for me this is the most interesting device. But I really want to know how things are actually displayed in the glasses and how it works. Do you activate the microphone manually, or is it always listening, triggered by a wake word like Siri or Alexa? I don't know — what do you think?

Yeah, I think they have the "Hey Meta" wake word — I've seen that in the video. I'm not sure that's exactly right, but I think they have it. I agree with what you said; not much to add. I'm really curious about the streaming feature, though. If I'm correct, on the previous version there was a small light when the camera was active, to notify other people that it was recording. I wonder how people will react to someone wearing these glasses — privacy concerns and things like that — especially because they don't look like Google Glass. With Glass you could definitely see it was a special device; here, it looks like a normal pair of glasses. So yeah, I'm a bit worried about that, I think. Yeah, Seb?

Yeah. It seems they've integrated the system into several different kinds of frames, so it's not just one version — it's their own design based on a famous Ray-Ban model. And they do seem to have a light embedded in it.
So there is a way to signal to people around you that you're filming, with that light on one of the temples. They also have a touch interface on the side of the temples: you can tap, and maybe swipe from what I've seen, to go through the menu and interact with the glasses — on top of voice recognition. And the streaming and camera quality seem amazing from what I've seen: good quality, the kind you'd expect from your phone. But in terms of information displayed inside the glasses, I'm not sure there's anything. It looks more like information is added on top of your phone — pictures that you edit on your phone. I haven't seen anything actually displayed inside the glasses. Am I wrong? Did you see anything like that? Even the screen you're sharing with the text looks like a portrait view — something you'd see on a phone, not inside the glasses. I don't know what you think — whether you've seen footage shot through the glasses showing something displayed inside. From the design and the size of them, to me only the temples are equipped with anything, handling the camera, the Wi-Fi, and sharing your video to the phone.

Yeah, I really don't know whether anything is displayed in the glasses. If nothing is, the use cases are of course very limited, because in every single presentation video people are looking around and doing things, not just staring at their phones for the information they need. So I really don't know — it's very unclear exactly what it does. But you're completely right: at this price tag —
— and given the form factor of the glasses, I don't really see how they could display much, or very detailed, information inside them. So we'll see what people say once more precise reviews are out, and we'll come back to this at some point. Okay — do you have anything more to add about this Meta Connect event? No? Then that's a wrap for today, and we'll see you next week for another episode of Lost in Immersion.

Credits

Podcast hosted by Guillaume Brincin, Fabien Le Guillarm, and Sébastien Spas.