Welcome to episode 63 of Lost in Immersion, your weekly 45-minute stream about innovation. As VR and AR veterans, we will discuss the latest news of the immersive industry. Hello, guys. Hello. Hello. So, Fabien, if you want to start as usual, please. Yeah, cool, thanks. So, I have a few updates today. The first one I want to discuss is something I tried right before the podcast: I saw that the visionOS 2 beta was available, so I installed it and tested it very quickly. The first thing that happens once the update is installed is that you have to rescan your face, because there is an update to the persona. To be honest, I didn't see a huge difference between the previous persona and the new one. I think it still looks pretty cool, just, as usual, a bit uncanny. The facial expressions and the way it reacts are really, really good, with absolutely no delay. I realize that in the video it's pretty small, so it may be hard to see, but the quality is really good. So, yeah, that's it for this one. I don't know if I mentioned it, but the scanning process is very fast. You hold the Vision Pro in front of you, and there are a few prompts: close your eyes, raise the eyebrows, smile, smile with the teeth showing, look right, left, up, down, and that's it. And you show the hands once you... no, before, actually: before you scan the face, you show the hands. The whole process takes, I don't know, two to four minutes max, so it's actually pretty impressive. So, that's the first update. The other one is that there are new gestures to navigate. This gesture that you can see here is a flip of the hand and a pinch, actually pretty similar to what is on the Quest: palm up to bring up the menu.
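As an aside, here is a minimal geometric sketch of how a runtime might recognize that palm-up pinch from hand-tracking data. Everything here is illustrative, not Apple's actual implementation: the joint names, thresholds, and the NumPy math are our own assumptions, and it presumes you already receive 3D joint positions from some tracking API.

```python
import numpy as np

PINCH_THRESHOLD_M = 0.02  # thumb-to-index distance treated as a pinch (~2 cm)

def palm_up_pinch(wrist, index_knuckle, pinky_knuckle, thumb_tip, index_tip,
                  up=(0.0, 1.0, 0.0), min_up_dot=0.7):
    """Detect a 'palm facing up + pinch' pose from 3D joint positions
    (metres, world space). Joint names, thresholds, and the maths are
    illustrative only, not the actual visionOS implementation."""
    wrist, idx, pky = map(np.asarray, (wrist, index_knuckle, pinky_knuckle))
    t_tip, i_tip = np.asarray(thumb_tip), np.asarray(index_tip)
    # Palm normal: cross product of two vectors spanning the palm plane.
    # For the other hand (or flipped winding) the normal must be negated.
    normal = np.cross(idx - wrist, pky - wrist)
    normal = normal / np.linalg.norm(normal)
    palm_up = float(np.dot(normal, np.asarray(up))) > min_up_dot
    pinching = float(np.linalg.norm(t_tip - i_tip)) < PINCH_THRESHOLD_M
    return bool(palm_up and pinching)
```

A real system would also smooth over several frames to avoid triggering on transient poses, which may be part of why the gesture feels deliberate rather than accidental.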
And if you flip the hand the other way, you have access to the control center and to the sound volume. I had to get a bit used to it, but it's very reactive. Thumbs up. I think when we discussed the Vision Pro update a few weeks back, we were wondering if it would collide with the standard hand interactions in other apps. That's something I didn't have time to test, because I only tried it just now, but it seems pretty natural to do, and maybe it's deactivated when an app is full screen. I need to try that. So, that's the second update. For now I'm pretty convinced by it: it's indeed much, much easier than reaching for the crown, and much easier than looking up to reach the control center. So, in my opinion, it's an improvement. And the last one is that they use AI to transform a flat picture into a spatial picture. I tested that as well; it's available. The icon here is a bit different than on a native spatial picture. It's very quick; as you can see, just a few seconds to transform the picture, and the results are actually pretty good. I could see the depth in the image. I tested another picture, which I don't have here in the recording, with something much closer to the camera, and the depth was really present; you could really see the 3D in the video. So again, just as a very quick test, it works quite well. Those are the three things I tested just now. Maybe we can discuss that first. Seb, do you have any comments or questions? No. I would be keen to test this 2D-picture-to-3D feature in the Vision Pro. I saw that there might be similar things going on on the Quest 3, so I may try it on the Quest 3 on my end. But it's quite nice. I was wondering, when you tried something close to the camera, was it blurry behind when you were rotating and trying to look behind the thing that was closer to the camera?
Or do they reconstruct the background and use AI to compensate for the fact that you don't have the information about what is behind the object? Yeah, that's a very good question. I don't know; I will need to try it again and look into the details. But it wasn't shocking. It was actually pretty natural moving around. Regarding the persona, I don't know if there is any improvement; like you said, on the video it's hard to see. I also saw that they released a lot of advice and videos on how to implement different things in the scene correctly, like capturing the real lighting environment, so your objects get an environment map applied to them and look more realistic. Also a way to transition between the virtual environments displayed in the headset, so that the environment map matches the level of immersion you are in: fully immersed in the environment, in your room, or even in between, where the map seems to take both into account. And there are things like translucent materials, things to really improve the quality of the experience. So it's great that they are providing advice to developers on how to use their headset and make better experiences, and that they keep releasing updates. And now, like you said, for the interaction gesture that triggers things, I really wonder how it will work inside an app. Does it need to be implemented by the developer if they want this functionality available to the user while using the app they developed, or is it only for the menu when you are outside an app? I would be keen to have an answer to that, because I guess it will mess things up if it's available during a game or something. This kind of gesture that you were showing is one you might do inside a game, so it would definitely bother you. If you have the chance to test that, Fab, and give feedback, that would be great. Yeah, I will test that.
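On the question of what sits behind a foreground object: a common pipeline for this kind of 2D-to-spatial conversion is to estimate a per-pixel depth map from the single photo, forward-warp pixels into a second eye's view, and then inpaint the holes that open up behind near objects. We don't know what Apple actually does; the function below is our own toy illustration of the warping step, showing exactly where the missing-background holes appear.

```python
import numpy as np

def synthesize_right_eye(rgb, inv_depth, max_disparity_px=12):
    """Forward-warp an H x W x 3 image into a hypothetical right-eye view.
    `inv_depth` is an H x W inverse-depth map in [0, 1] (1 = nearest),
    as produced by any monocular depth estimator. Near pixels shift more;
    target pixels that nothing lands on are returned as holes, which is
    exactly the region an inpainting model would have to hallucinate."""
    h, w, _ = rgb.shape
    out = np.zeros_like(rgb)
    best = np.full((h, w), -1.0)  # inverse depth of the pixel written so far
    disparity = np.rint(inv_depth * max_disparity_px).astype(int)
    for y in range(h):
        for x in range(w):
            nx = x - disparity[y, x]  # shift left for the right eye
            # When several sources land on one target, the nearest one wins.
            if 0 <= nx < w and inv_depth[y, x] > best[y, nx]:
                out[y, nx] = rgb[y, x]
                best[y, nx] = inv_depth[y, x]
    holes = best < 0  # disoccluded pixels: no information behind the object
    return out, holes
```

The `holes` mask is why Seb's question matters: the closer an object is to the camera, the larger the disoccluded band behind it, and the more the system has to invent.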
My guess is that it will not be detected when an app is full screen, but maybe it will be available in the standard spatial apps. That would make sense. I will try that, yeah. Guillaume, any comments? Everything that Seb said; not much more to add. Okay, cool. And the next one: AWE was last week, one of the biggest conferences about our field, immersive experiences, AR, VR, and so on. And I saw these glasses, standalone glasses made by NTT, a Japanese company, one of the major telecom companies here in Japan. I'm actually really curious, and I didn't find any explanation, about how the display is working. You can see here a picture of the display, and here how it looks, taken from a smartphone. And in the audio they say that you cannot see it the way you really see it with your eyes. So I'm curious if you guys know how that works. I don't. Guillaume? No, I hadn't seen these glasses. It's very interesting that no one talked about this much, and I also don't know how it can work, especially the curved line at the bottom. You can imagine that they are playing with the capability of the eye to reconstruct an image even if it's striped into these different horizontal lines, but no, I don't have much more information than you. You just have to read the Japanese part on the right. I used the translation; he says, similarly, that he cannot figure out how to record it properly so we can actually see what the quality is. So this is maybe why you have the grain here. Yeah. What's also interesting, apart from the display, which I don't really know how it works, is that they are fully autonomous glasses. He says here that the tracking seems very similar. So I will keep looking at this, and maybe the price. If we are able to test things like that: there is an event in two weeks in Tokyo where last year Meta was present, and I tried that. So I am hopeful that NTT will have a booth. That's a lot of hope.
You are the man for the situation if you want to try this; you are the closest one. Yeah. And the last thing: there were a lot of updates during AWE. One thing I found interesting from 8th Wall is the studio. Until now, even if they had an editor, everything was done from code; now they have a studio that looks like any other studio, like PlayCanvas or tools like that. One thing I really liked: you can connect your device, it shows a QR code, and you can see the changes; you save the changes and it reloads the page. I think that's really good for testing in real time on an actual device. Yeah, nothing much to say: it will allow more and more people to have access to creating apps, increasing the speed to market as well. It's pretty cool. I don't know if you have anything to say on the studio, otherwise I will let you go on. The studio seems nice. It's great to have a real tool to implement and test what you are implementing; it's very important to always try it to make sure it's working correctly. One thing: do they still keep their license fee? Has it changed at all, do you know? It changed: it's free to test. You can't share it; it needs to be linked to your 8th Wall account. You have a demo license you can share, and if it's a commercial project, you have to buy the commercial license. I had the same remark: the biggest update we wanted was the price; if they could make it cheaper for smaller studios, it would be great. It's shaped for big commercial applications, and if you have smaller projects, the pricing is incompatible with using 8th Wall. Cool. Over to you, Seb. On my side, I wanted to talk about Lens Studio: the new version, version 5, was released last week. They added support for AI: AI to generate different environments, different 3D objects, and to texture an object. For example, you can train a model on an environment and apply it to your scenario, adding a filter where you can have a panda face.
Here they are showcasing how to inpaint a picture so you look like a character from an animated movie, or a style based on the picture you provide. For 3D generation, you type your text or you provide an image, you say what you want, and it generates a 3D model for you. You can also generate a texture and apply it inside the studio. They also released a way to display Gaussian splatting: you can import it inside the app. Here is a test: the football helmet was scanned with a phone, exported to a Gaussian splat, and used inside the studio. Here are a couple of examples from the same guy: he scanned another hairstyle and applied it to his own face, and another example with a sunflower. So, quite an impressive release; it does a lot of things. We will see how well they actually work, but it is a nice improvement inside the tool, and a great improvement for generating new content. Guillaume, any comments? It is nice to see the results, but we know that to achieve these results you have to spend time cleaning your initial splat so it doesn't have any splat noise. I get that the integration is on the way across the whole bunch of tools; we know some studios have done it for quite some months now, and we can see that every studio is doing it. I am very curious to know why some software that is very efficient at working with this, one of the best software packages, hasn't done it yet, when it seems to be quite fast to do in other software; I am still waiting for this feature there. The results are great, once again. Do you have some information about the licensing fee? I guess that's the real subject here when you are working with it, because it is very interesting to work with. It is in beta right now; I need to dig into that to see how much it will cost to use this kind of feature. For the AI, I guess there should be a fee somewhere, but I don't think they have released that. I will dig into it. Fabien? Yeah. It is really cool.
To see how all the major social networks, Instagram, TikTok, and Snapchat, have their AI now. And Snapchat has always been ahead with this; the name is slipping my mind right now. All the features and the quality have always been at the top compared to the others. As you were saying, it is a lot of new updates in just one release. So it is really cool to test it and see how far we can push it. I know they also have a web SDK, so that opens up other opportunities outside of Snapchat itself, which is really cool. There is still this limitation when you release a lens for Snapchat: your lens needs to stay within 8 megabytes, so quite small. But they are opening up this kind of content, and it is all generated online, so I don't know if anything inside the lens ends up outside of the 8 megabytes, and you can do much more than what you could do before. There are a lot of things we need to try to see what is opened up now. And that is it for that subject. The next one was also presented at the Augmented World Expo in Long Beach last week. It is kind of a better version of what we see in the Vision Pro, released by Ultra Reality. This one is a bit strange because it is only visible from one angle, but these glasses are quite impressive in the way they are recreating the face behind the mask. It could be fake; I don't know if there is a real head behind the mannequin that they used. But yeah, much more impressive than what we have right now with the Vision Pro. It is nice to see that some companies are trying to move forward on this part, because right now, as it is in the Vision Pro, it is not usable; it is a gadget. So yeah. Guillaume? It is very interesting to see that companies are stepping into this kind of feature. As we already know, it will probably be abandoned in the future Vision Pro release, next year or the year after, because it is taking a lot of power and it is not that efficient.
I guess people are not very fond of it. Also, why persist in this direction when we obviously know by now that it is not a cool feature? It was at the beginning, but now people are over it. We will see the future of this company, whether they can stay. Or maybe they are targeting the robotics side, because we know it is very complicated to recreate a human face on mannequin heads, so maybe this is a solution for them: just have a display of a human face, or another kind of face, and it would be an easier integration than recreating a synthetic head. But we are very curious to know their future. I don't have a lot to say about that; it looks really sci-fi to me. There are a lot of science fiction movies, for better or worse, where you see faces displayed like that, and it is funny. I didn't think about the robotic applications, but maybe it is a good one, to have a real face to talk to; it could change things a bit. The next one is HaptX, which revealed their new haptic gloves, which are much more precise than the previous version, with a lot more actuators and force feedback in the gloves. However, when I see the video, it still seems very heavy and big on you, and it still seems to be using an air compressor in a backpack. So we will have to try it, but it is still really big, and only for industrial use cases where you need a precise gesture to be trained. Or to showcase that with this they are able to replicate the same gesture on a robot, maybe to teach the robot how to do some interaction, using AI to train that model and have the robot make a specific gesture. That could be useful for doing operations at a distance, in a field where it is quite dangerous for a human being to be, using this to control the robot with really delicate gestures. Other than that, I don't see it being used by a standard user right now. Do you have any thoughts about that? Guillaume? It seems really natural and lightweight.
They are trying to showcase that it is not that heavy, but you can obviously see that people are struggling to move, not just their hands but the whole arms and so on, and you can see that the backpack is not very light. Once again, I believe the haptic feeling should be great, but the whole equipment is clearly a no-go; it is really dedicated to some very specific industrial tasks. We are still not at the smaller form factor of haptic devices; we are still on the huge ones. Very good. Any thoughts? I don't have much to say; I think you said it all. I think I saw a Vive tracker, am I correct? It seems like they are also using different types of motion capture, as well as the force feedback. It was interesting to see, compared to the previous version: the previous one used something very big on the finger, so when you were touching, even pinching was weird; at maximum you were able to do this, and not really close your fingers, because the plastic was really big on your finger. Here they seem to have made that really thin on the finger, with a lot of sensation in each finger. I really would like to try it. I read some posts where people were saying that even doing a syringe gesture, the feeling of handling the syringe was quite amazing, even though it was quite a small object. They were saying that when you put the finger next to it and missed the spot, you were not getting any feedback, but you did when you were on the spot. I would like to try it; force feedback changes a lot the experience you feel in virtual reality. And the last one is a more fun one: this company is releasing another board for VR, for gaming, for you to move yourself as if you were on a board. I'm not sure how it feels; you won't get the acceleration, and I don't know how the body will react to it. I would like to try it, to see how it performs and whether it gives me motion sickness; I'm quite susceptible to this kind of experience. That's interesting.
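That "no feedback until you are exactly on the spot" observation matches how simple penalty-based haptic rendering works: force is generated only once the fingertip actually penetrates the virtual surface, and is exactly zero a millimetre away from it. A toy sketch below; the spring model, stiffness value, and sphere shape are our own illustration, not the vendor's actual rendering model.

```python
import numpy as np

def contact_force(finger_tip, center, radius, stiffness=800.0):
    """Penalty-based haptic contact: return the force vector (newtons) to
    render on a fingertip against a sphere of `radius` metres. Outside the
    surface the force is exactly zero, which is why grazing *next to* a
    small object gives no feedback at all. `stiffness` (N/m) is a demo value."""
    offset = np.asarray(finger_tip, float) - np.asarray(center, float)
    dist = np.linalg.norm(offset)
    penetration = radius - dist
    if penetration <= 0.0 or dist == 0.0:
        return np.zeros(3)  # no contact: render nothing
    # Spring force along the outward surface normal (Hooke's law).
    return stiffness * penetration * (offset / dist)
```

Real glove runtimes layer friction, damping, and per-actuator mapping on top of this, but the hard on/off contact boundary is the part testers notice with small objects.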
To see another way of doing motion, instead of having a treadmill: it's way smaller, and for some kinds of games I think that could be fun. What are your thoughts? It's a great way to break your knees if you are bending too far forward, I guess. They should have thought about this. Once again, with this knee strap it's not very comfortable, and I'm not sure about the initial position; it doesn't seem very natural. But I'm always surprised by the imagination and creativity of those companies. We always think we have reached the top of what we can do with the different treadmills and sliding boards, the sliding boots, the rolling ones, and it's always a great thing to see new devices. It should create new traction, or give ideas to other companies, to find the best product for this kind of locomotion. It's not natural, though, the guy leaning forward and taking it on his knees. Yeah, having tried a treadmill, it's not natural. OK, so basically the person is on the stand with the knees locked. I could see the video, but I can't see it very well. The thing is static? It's static, but it turns? No, it turns. OK, interesting. The demo with the pickle... Have you seen these shoes where you slide with the shoes? It also could be really easy to fall down. Maybe it's pretty good if you don't dig into it. On all of them you need to change the way you walk; it doesn't seem natural. Like Guillaume said, I don't think we have a solution that is perfect; all of them are trying different ways of simulating walking, and we'll see which direction goes further. And that's it for me. Guillaume? At AWE, once again, Sony unveiled their new enterprise XR headset, the one they showcased at the beginning of the year. Just as a reminder, it's a standalone one.
Fabien? Yeah, I guess it's a difficult situation for them. One thing that I noticed in the video about the professional one, and I hadn't thought about it before, is that the ring controller is actually quite nice: when someone is working, let's say on a keyboard, and wants to do something in mixed reality, they don't have to grab another device; they can act really quickly without going away from the keyboard. So I found that quite interesting, but if it's the only real plus of this headset... Yeah, I really hope, as you said, that maybe it was a beta version, or that the environment of the trade show was not the best for the ring. But as you mentioned, and we spoke about it last week, the Logitech stylus is like 130...
130, yeah, 130 US, so it's pretty cheap, and if the precision is as good as they claim, they can grab part of the market as well. So yeah, again, I think maybe we say it every week since we started the podcast, but the VR and MR industries are moving quite fast; a lot of things are happening right now. Sony's overall pace seems to be too slow for the market, because they announced the headset in January and it is already old in VR terms, meaning the technology is not as advanced compared to Apple or Meta. You have to keep the pace, and not everyone can. And one of the thoughts I had looking at this video is about Lynx, which we completely forgot; we don't have any news about them anymore, so I don't know where they are. Unfortunately it confirms our prediction: they lost their slot. They could have released their headset just before the Apple Vision Pro and it could have been a success; now it's too late for them as well. Yeah, and what I find interesting is that they are the first ones releasing a headset with the new XR2+ Gen 2, which is a step above the Quest 3's chipset, so I would like to have some feedback on the pass-through quality. They also changed the design: the cameras seem to be right in front of the eyes, with the display right in front of the eyes as well, so maybe there is a lot of improvement there that has not been discussed, because everyone is focusing on the fact that they are using different controllers. And like you said, maybe the controllers were not working right, because in this kind of environment you can have lighting, Bluetooth, and Wi-Fi issues, and a lot of things that bother the headset. Yeah, those are the worst conditions you can have; we all know this from experience, trade shows are a nightmare, especially a technology show where there is technology everywhere around your booth. But yeah, I'm very keen to know about the pass-through quality,
because that's still the key point for me and for mixed reality experiences, which is what I like to develop the most with this kind of headset. In terms of quality of the screen, from the feedback I found online it seems like what they are displaying is very sharp. I didn't hear about the field of view, like you said, Guillaume, so I don't know if it's a bad point. It really depends on the way you wear the headset: this one is meant to be worn more in front of your eyes, so maybe there is a way to push it back a bit, or adjust it differently, to bring it closer to your eyes and get a bigger field of view. I also quite like the form factor, the fact that while developing you can flip it up and talk to someone else, then flip it back and go back to your experience. That's really something that is not implemented in most headsets and is really great; on the HoloLens 2, for example, it's just perfect to use. About the flipping, if you watch closely the different people using that feature, apparently it doesn't flip high enough, so you have to bend your head back like this to be able to see your screen. Once again, I really like the flip feature, because it was on the Microsoft Mixed Reality headsets as well and it works really well when you are developing, but if the headset can't flip high enough and you have to bend, it's a complete failure for me at this point. Maybe it's because people were not wearing it the way they should, and the flip was not in the best position for them, but we'll see with the final reviews when it's released. Yeah, the head strap is also a bit strange to me, the way they placed it; they seem to have kept the design they had for the PSVR. I'm not sure it's the most comfortable way of doing things, only having something around the head and nothing on top. And also the balance; however, I had some feedback about that, and the balance of the headset seems good: the battery at the back seems to counterweight the front part, so that's great. To be tested. Yes. That's it for me. Do you have anything more to add this week? No? Okay, so that's it for today. See you guys next week for another episode of Lost in Immersion. Thanks. See you next week, have a good week.