Welcome to episode 43 of Lost in Immersion, your weekly 45-minute stream about innovation. As VR and AR veterans, we will discuss the latest news of the immersive industry. So, hello guys. We are back to our classic configuration. Fabien, if you want to start, please. Yeah, sure. Thanks. Hello. So, today I want to talk about a new accessory. Well, it's a bit more than an accessory for the HTC Vive XR Elite. This new device, which plugs into the headset, adds eye tracking and face tracking to the Vive XR Elite. So, it has this form here, and as you can see, it's an add-on that gets plugged directly inside the headset. There are a lot of things that are very interesting with this new device. The first one is that, because it's plugged directly into the headset, they have what they call auto-IPD adjustment. The cameras inside track the eyes, detect the IPD, and automatically adjust the IPD on the headset. So, that's very good news: if the headset is used by multiple people, the device can automatically adjust between users. That's a very good feature for location-based entertainment, or any setting where the headset is shared. And then, because you have face and eye tracking, these movements can be shown on your avatar, with all the good implications of that: you can have better communication between two people. We know that a lot of communication also happens through the small expressions on the face, the eye movements, the small movements of the lower face, the smile, and so on. That can give the avatar much better expressions. So, as you can see, full face motion capture, which is great. And another nice feature that gets enabled, and I don't really know if this will need to be implemented by the application itself or if it's a feature of the headset, is that with eye tracking, foveated rendering can be enabled.
So, rendering in high quality only the part the user is looking at, to improve quality and performance. The bottom part here has one camera for face tracking, and it's foldable, so it's easier to transport and store. And it's not that expensive, around 200 euros. I was actually a bit surprised by the price; I thought it would be more expensive. So, it's good news, and I'm really curious to see how it gets adopted. It's a really, really huge upgrade. HTC chose to do that instead of releasing a complete, full new headset. So, it's an interesting strategy to have this kind of modular design for the headset. So, yeah, I'm curious to know what you think. Start with you, as usual, Seb. Yes. Hi, guys. Yeah, like you said, I think it's really nice that they made a modular headset where you can swap the battery and change the device that you attach on top of it. I don't like the design, though. It looks weird to wear that on your face, but it seems to be a nice upgrade. And like you said, the fact that it tracks the eyes is very nice. I would have liked an add-on for tracking the hands much better, too. Right now, the XR Elite only uses the front cameras, and the field of view is quite small, so you really have to put your hands quite high up before the tracking starts working. But like you said, the price seems to be correct for this kind of add-on. And yeah, it's a nice improvement to get more feedback on what the user wearing the headset is looking at and how they are feeling, by mimicking their facial expressions on the avatar. And like you said, I would like to know if foveated rendering needs to be implemented in the code, or if it's directly embedded in the way the 3D is rendered on the headset. But I guess it needs to be implemented. Do you have any clues on that, Guillaume? And yeah, I'll let you give your impression. Yeah, no, I don't have any information about this.
So, I have three things to say about this. First, I would like to thank HTC for this great cosplay of Squidward Tentacles from SpongeBob. Indeed, well, everyone can say what they want about the design, but yeah, it's a bit weird to wear. Second thing is, I guess it proves that eye tracking is something that we need at this point. This is a technology that all headsets should have, because of all the uses it enables. And the last part, my main question is: why did they do this complementary device? Is it because they didn't have the capability to do it when they released the HTC Vive XR Elite? Or is it maybe a strategic move: first saying that eye tracking is not that important, like Meta did with the Quest 3, and then backtracking a little and saying, yeah, well, eye tracking is something that is needed, especially for professionals? So do you think this is an oopsie move, like, oops, we forgot this, we need to correct it without releasing a new headset? Or is it really part of their main roadmap, meaning that they planned from the start to create a new device that you have to buy afterwards? Because when you see this, yeah, I guess the design is still a bit bulky, and it adds weight to the headset itself. So I guess the weight distribution, especially for this headset with the battery at the back, may become a little unbalanced because of this addition. So they are kind of destroying all the work they did on their headset. I would like to have your comments about this. What do you think? Yeah, I can talk about that. I have the headset and I use it for Europa Park. The one that you receive comes with a small foam piece that is only attached magnetically, and it's very fragile, it breaks every time. So it was clearly meant to be modular: you can already swap it for a piece of plastic, so you can have something more like a Lynx headset, without foam, without a border.
For me, that's really quite nice when you're outside. And this new one seems to be way more comfortable to use and more firmly attached to the device: when it's plugged in, it doesn't move. That's a really good add-on. So I think they planned it from the start. It was made to be modular, and that's the way they now foresee the headset being used: a better cushion, better things to plug onto your headset. That's my feeling. Yeah, I think I agree with all you said. And I will add that maybe it's also the announcement and upcoming release of the Apple Vision Pro, which has eye tracking, and where eye tracking is the core of the UI and the user experience. Because it's Apple, and if there is one company that can drive such adoption, it's Apple, the other players maybe thought: oh, wow, UIs using eye tracking and interaction with virtual elements using eye tracking will become popular thanks to Apple, and we have to catch up. I don't know. That's another option. Okay. So do you have anything more to add, Fabien? No, that's it. Thanks. Okay, thanks Fabien. Seb, it's your turn. Yes. On my side, I wanted to talk about the announcement from Canon, which showed a concept of a mixed reality headset at CES. It's a small device, not the integrated headset you would expect; it's more about displaying what you can see in mixed reality through those glasses. And knowing the expertise they have in building this kind of display and optics for their cameras, they seem to have done a nice job. I don't see it in this video, but they showed people filming what they see inside the headset, and you can see a 3D camera being displayed on top of a QR code. And, yeah, the picture seems to be very clear and very wide. So I'm keen to look for more feedback from people who used it at the show, and see how they feel about the rendering through those glasses. And it seems to be really thin compared to other systems.
So it seems that they have found a new way of displaying the pass-through with a very high-quality picture. I don't know if you saw that news, or if you have any feedback on it. Yeah, I saw some people showcasing what they saw inside those glasses. Indeed, it's not really easy to measure the quality of the rendering from that. But, yeah, we can obviously see that the system is very, very small. However, this is basically old-school augmented reality using QR codes. So it doesn't have all the sensors used for inside-out tracking, which usually take up a lot of space on headsets. So we can't say that new headsets will reach this size, because it's lacking a lot of technical hardware. But, yeah, sure, of course, this kind of improvement comes from lens and optics manufacturers. So it's very interesting to see Canon back in the game, because they were in the game in 2015, 2016, the golden age of AR. And I guess Epson will be back as well, and Leica for sure. So we'll see some improvement, we hope, on the lenses. But I saw in the news that they would like to sell this kind of device by 2025. So it will be very interesting to see whether they become a manufacturer of VR/AR glasses themselves, or whether they partner with other manufacturers. So it's something we'll have to follow up on. And at least the cameras are in front of the eyes on this one. Yes. For me, I think this kind of format is quite interesting for museums or theme parks, where you can have it plugged in and locked to a totem or a piece of furniture, and you can have an experience through that. It could be interesting to keep that kind of format, and maybe add more cameras so it tracks the surroundings, like you said.
But yeah, this kind of goggles that you hold like this could be interesting for theme parks, because you won't need to clean everything, the head straps and stuff like that. Yeah, the form factor is very neat, very nice. Another news item, also announced at CES: e-ink reader glasses onto which you can load your books and read the content directly through your glasses. You have a small device in your hand to scroll and change the pages. I found it interesting that it uses e-ink, so it shouldn't consume a lot of energy and you can use it for a while. And it removes the need to have your book in front of you like this, or your e-ink reader in front of you, and to hold it. So it's nice to see other technologies starting to use this kind of glasses. I guess it's missing some AI technology, so you could talk directly to the glasses to load your book or change the page just by asking. But yeah, it seems to be nice. I would like to see how it looks inside the glasses, just to know what the state of the art in e-ink is. We see that e-ink readers are now starting to have colors. Yeah, I'd love to see comic books displayed this way, and nicely enough to enjoy them. So yeah, Guillaume, do you have any feedback on that on your side? Well, I'm a bit surprised by the size of this device, because we know e-ink is a compact technology. So maybe they added some features that we're not aware of, because it has basically the same size and bulky design as regular glasses that have more capability. But yeah, basically it's a stereoscopic reader. So we'll have to see whether there is some kind of 3D effect on the text or not. But yeah, it's a nice way of using these glasses. And once again, the prediction for 2024 about the era of assisted reality seems to be coming true, because we are in January and this is already something like the fourth or fifth pair of smart glasses announced.
So it's very interesting to see that, slowly but surely, the device they want us to use daily is glasses. Yeah, I'm really curious to see how the lenses distort or maybe alter the sharpness of the text. We know that text in headsets is generally not really sharp, so I'm worried about that on such glasses. I don't know. Hopefully they have a good calibration of the lenses to avoid that. It's a good point in terms of prescription, too. I don't know if they have planned anything so you can wear your own glasses behind these glasses. And in terms of sharpness, when you look at the text they display in the video, it's kind of pixelated. They say they designed the font and the rendering on purpose to be easy to read, but it doesn't look that easy on my eyes when I see it this way. So yeah, that's why I'm keen to try it and see what the state of the art is right now in this technology. Maybe at Mobile World Congress in February I will get to try them. Yeah, so the next news was the Rabbit R1. A small new AI device that you carry, and you press a button to ask the device to do what you want, like ordering a pizza for you or calling someone. So it's kind of a phone without being a phone. You can't touch the screen; it's just a display. You have to talk to it, like using Siri on your Apple device, to ask questions to an AI assistant that will do the task for you, like transferring money or things like that. So it's interesting, but I don't see the use case done this way. Maybe with glasses it would make sense, along with being able to interact with the device to confirm, or to type a code or something like that, to secure the interaction a bit more. But here, on a device where you can only interact through a button, I don't see the use case. I don't know if you have any thoughts about that. Yeah, let's start with you, Guillaume, again.
Yeah, despite the fact that it's a fun, cool, great-looking device, I'm exactly like you. I don't see the point, because you can see the size: it's basically two-thirds or maybe half of a smartphone. Everybody has a smartphone right now. So I don't see what it brings to the table, because everything it proposes is already done better with a smartphone. I can't really see the point. That said, there is some kind of trend: there is a small pocket console, the Playdate, I don't know if you know it. It's about the same size, with some new ways of interacting, a small resolution, and a very high number of hours of use. So I don't know if there is a trend somewhere, for designers or some specific community, to have those pocket devices and collect them, because it's about the same form factor. So I don't know. They may sell some thanks to the buzz, because it was one of the main devices that popped up during CES. So maybe they have some traction here. But yeah, I can't really see the point of this kind of device right now. Yeah, same. I already have a phone. I can do all of this with my phone. So yeah, I don't need that. Or maybe there is something we cannot do with a smartphone that would make it a game changer, but it's not presented here, so I guess they haven't thought of it yet. All right. And the latest news was about Stability AI, which released the new Stable Zero123 model for 3D generation from a single image, which starts to be, I think, really usable. Here you can see what the previous model was doing with this kind of picture as an input image, and now what you can get with this new Stable Zero123 model. So yeah, I think I will try it soon, because it seems, to me, to start being really usable. Like here, for the furniture: you provide only one picture and you get that result. Yeah.
It starts, for me, to be usable, compared to the previous version where you had artifacts everywhere. So yeah, I don't know if you saw it too, and if you have feedback on that. No, I didn't see that. I guess we'll have to try it, because since I used Pika, which is an AI that generates videos from text or images, I'm a bit cautious: the results you actually get never match what they showcase. So I'm always a bit disappointed when I try these, because I spend a lot of time trying to find the right prompts and I never get the right results. So maybe at some point they will provide some assistance, or maybe a new kind of service. That could be some kind of business for AI solution providers like this, because I guess they know what you should do to get the best result. Yeah, that's my question. So I guess I'll try this as well, because it's very interesting to generate assets, and it seems to be way better than Luma AI, too. So yeah, give it a try. I might be wrong, but I think what you get out of Stable Zero123 is not a 3D model: you can get views of a 3D model, but as images or videos. I might be wrong about that, so we need to check. I don't think you can actually download a GLB model, which is something you can do with Luma, so there might be a difference there. Still, there are many use cases where we don't actually need to download a 3D model; if you want to generate a new view of some object, an output that is a video or an image might be all you need. On that same topic, I saw that Luma AI raised, I forgot the number, but a lot of millions, to improve their 3D generation model. So I think there will be a lot of improvement on that this year. Okay, great. So I guess we said what we had to say. So, on to my topic here. And here we go. I would like to talk about leaked videos. Leaked videos. Okay. So, yeah, I have two things to say.
The first one is, I really like the idea: objects that you can interact with. That's a really, really nice feature. And I guess the Apple Vision Pro will have a similar feature. So, yeah. The usefulness of this is also something to debate; I hope people will be able to release augments that are really, really useful. But anyway. The second thing is, Meta Spark is the software we use to create filters for Instagram, so it's really interesting to see that they are leveraging it to create the augments for the Quest platform. My personal experience with Meta Spark Studio is a bit difficult, I would say. It's a bit hard to handle sometimes, and the features are quite limited. But hopefully they did a lot of work on Meta Spark. And actually, I have a third thing that just popped up: there are a lot of creators working on Instagram to create filters. Is Meta hoping to create a new creator economy around these augments on the Quest platform? I don't know. That could be a nice bet for them. Just excuse me, Seb. Do you think Meta believes those augments could be, you know, we are always talking about the killer app, the application that everybody should use on a daily basis with their VR headset. Do you think they are trying to push this as that killer app? Maybe. Maybe, yeah. That's an interesting thought. And it could be; this has a lot of potential. So, we'll see. But still, I think we say the same thing almost every episode: it needs to have a purpose. It needs to be useful. There needs to be more than just something that I can put on the wall, and when I get back tomorrow, it will still be on the wall. Yeah. Good game. Let's go. I already caught you once. I was going to say, yeah, I really like the idea too.
I think, for me, it's kind of like the way Windows tried to change the desktop experience, where you had these drawers in a kind of 3D environment and you could activate your folders by clicking on the drawers of a 3D desk. Here you can place 3D objects directly. And if it's used this way, I think it completely changes the way you use the OS, the panels and stuff like that. You can have a small 3D character from your game on your table or on your shelves, and you can tap it with your finger to launch the game, instead of going through the menus, windows, and panels that are usually displayed. So, really, things that are placed in your environment. And one thing that could be useful, too, is being able to see it through your phone, so you have the same experience in mixed reality and can access the same information. Like, they were showing the weather and a couple of pieces of information laid out on one wall; if you could have that on your phone as well, when you are not wearing the headset, that would be a nice continuity of experience. Like a mixed reality layer that you can access through different kinds of devices. I think that could be a nice add-on. It's a metaverse. Yeah. And with an AI assistant and stuff like that, that you can see through your phone but also through the mixed reality headset. Another thing: even if you change your headset over time and buy a new one, if you can keep your layout in your space, that would be nice. In your space, or in your work environment too. Yeah. Well, I guess once again we are heading towards the Apple vision, meaning the vision of Apple, sorry. The spatial computing one, with those 3D objects being part of our everyday life or daily routine or whatever. Meaning that at some point we will have to wear a headset or glasses for a very long time during the day. So, I guess there isn't much room for doubt here: all these big manufacturers want us to adopt these new devices.
I guess this is the way computers and cell phones will merge into this kind of immersive world. So, as you said, Seb, every brick is becoming more and more mature, and they will be added up at some point to create what I was joking about. But yeah, it's basically the vision of the Metaverse, where we have a digital layer on top of the real one, with all the information, the AI, the virtual identity and so on. So, we are just continuing the work that was done during the Metaverse years, which were last year and the year before. Still very interesting to see. I'm very curious, as you mentioned, Fabien, whether this is a strategy for them: the same way you have Instagram filters with a whole economy behind them, are they trying to do the same with the Meta Quest 3? I don't think that having a weather forecast or news pinned somewhere in your living space will be enough to make people come back to their headsets, instead of checking their mobile device when they wake up, just to put on a headset and see the news. I don't think this is the application that will bring people to use the Meta Quest more often. But we can trust developers to find something that will be game-changing. I forgot to mention that I think it also needs to be connected, so you can interact with your LEDs, your lights, and the everyday electronic devices in your space. So, maybe at this point Meta would be right, meaning that the use case is here but the device is not. Because, you know, the Meta Quest 3 is still not that easy to use, and the battery life is not great. So, maybe they just reversed the problem their way, meaning that they have the application now and the device is not there yet. So, we'll see. It's very interesting to see this strategy. Which will be the same for the Apple Vision Pro, by the way, I guess. Okay, so anything more to add? No? So, I guess that's it for today.
So, thank you guys once again for this episode, and we'll see you next week for another session of our discussion. See you guys, have a nice day or evening. Thank you.

Lost In Immersion

Credits

Podcast hosted by Guillaume Brincin, Fabien Le Guillarm, and Sébastien Spas.
Lost In Immersion © {{ year }}