Welcome to episode 13 of Lost in Immersion, your weekly 45-minute stream about innovation. As VR and AR veterans, we will discuss the latest news of the immersive industry. So let's go, Fabien, if you want to start.

Yeah, thanks. Today I want to talk about this laptop that you can see here, which was announced last week. It's claimed to be, it actually is, the world's first AR laptop. What they mean by that is that there is no screen on the laptop; instead of a screen, they attach AR glasses. They are using Nreal glasses, which have been out for quite a while now, I think maybe two years or a bit more, and they are using the Light version. The concept is that instead of having a screen, you see all the windows of your laptop laid out like this in front of you in AR. This gives you a bigger, larger screen without the bulky physical screen in front of you. So I think it's very interesting. And with the Apple headset maybe coming up in the next few weeks, there are rumors that Apple's intent is also to offer this kind of experience: using the available physical space to lay out windows and get a bigger, larger working space. So I think it's very interesting, and it might be a trend that leads to the end of screens. The thing I'm wondering, and I'm curious to know what you think, is that this is just the beginning, and the performance of these AR glasses is maybe not yet as good as we expect. Staying all day long with these glasses on your head, I'm not sure we are ready for that yet. So I'm really curious. Seb, I know you tested the Nreal. I don't know if you did, Guillaume. But maybe we can start with you, Seb, to hear your thoughts and feedback on this one.

Yeah. I don't know if they have updated the Nreal Light device since, but I tested it three years ago now.
And the experience I had with augmented reality scenes, mixed reality scenes, in my environment, was that everything was displayed in a very small, horizontal window in front of my eyes. Looking at a dinosaur in my living room, for example, was very hard, because I had to really look up and down to see what was in front of me. Also, the tracking was only using the ARCore technology, doing SLAM on the environment, so it was sometimes a bit laggy. It was not a great experience. Now I understand they have moved to this kind of experience where you are sitting in front of your laptop, or at least a keyboard, with maybe more power than you can have with an Android device, because you are using a laptop. That could be a good move. Now I would really like to test how big the rendering is inside the glasses, to see if it's really useful. But it could be great. I think ARCore can allow you to come closer to the screen, look at text more closely, move around, even look away and still have the screen locked in your space. And even if it drifts a bit, that doesn't destroy the experience the way it would for a 3D model. So I guess that could work, and it's a nice move from them.

Yeah, it would be very interesting to see the field of view indeed. One thing I'm wondering as well is whether they are also thinking of entertainment. The ad we are currently seeing is very work-focused; it's not really focused on entertainment, like movies and games. I know Nreal is very big on games; they support a lot of gaming platforms. So a very big screen, maybe to watch movies as well; I don't know if that would be a good use case. So yeah, what do you think, Guillaume?

Yeah, I'm just doing some research on the glasses' specifications.
Because one of my main concerns is that, indeed, these kinds of glasses are having some success right now because people are buying them to replace their TV. But about this work use case, I'm a bit concerned because of the resolution. When you're watching a movie, you don't need that many pixels. But when you're reading text and doing work, basically, you need more than just a basic resolution. Just so you know, for example, the Rokid AR glasses, which I guess are the main competitor to Nreal right now, have a 43-degree field of view, which is very narrow, and a 1080p resolution, so not that great. On the contrary, the future Xiaomi AR glasses seem to have 58 PPD, which is pixels per degree. For reference, the resolution of the human eye is about 60 PPD. I don't know what kind of conversion you would get in terms of resolution, because I can't seem to find the information; they always talk in this PPD unit. So it seems they have about the same resolution as our eyes. And they have a 96-degree field of view, which is double the field of view of the Rokid glasses. As for the Nreal Light, it's also a 1080p resolution, with a 52-degree field of view. So I guess the idea of this project is kind of interesting; however, they should switch their partnership to the Xiaomi AR glasses if they want more success, I guess. But those are not released yet, so maybe it's just temporary, and they are focusing on the AR laptop instead of the glasses, of course. So that's what I wanted to say about this. Apparently, people are welcoming these kinds of devices quite well, so I think they are not that uncomfortable to wear or use. Well, let's see. But yeah, it's a big step towards AR adoption, I guess. And it is linked to one of the first discussions we had on this podcast.
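As an editor's aside: the conversion the hosts are looking for is straightforward if you assume the quoted pixels-per-degree figure is roughly uniform across the field of view (real optics are not perfectly uniform, so treat this as a back-of-the-envelope sketch using the manufacturer claims quoted in the episode):

```python
def horizontal_pixels(ppd_value: float, fov_deg: float) -> int:
    # Pixels-per-degree x field of view (degrees) ~= horizontal pixel
    # count, assuming uniform angular resolution across the FoV.
    return round(ppd_value * fov_deg)

def ppd(pixels: int, fov_deg: float) -> float:
    # Inverse conversion: horizontal pixels / FoV = pixels per degree.
    return pixels / fov_deg

# Figures quoted in the episode (manufacturer claims, not measurements):
print(horizontal_pixels(58, 96))   # Xiaomi claim: 5568 px horizontally
print(round(ppd(1920, 52), 1))     # Nreal Light (1080p, 52 deg): 36.9 PPD
print(round(ppd(1920, 43), 1))     # Rokid (1080p, 43 deg): 44.7 PPD
```

So, under that assumption, 58 PPD over 96 degrees would need roughly a 5.5K-wide panel per eye, while the 1080p glasses sit well below the ~60 PPD limit of the eye.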
The prediction was that this would be the year of the assisted reality breakthrough, and it seems to be heading that way indeed. One thing worth mentioning also: with the Nreal Light, if you wear glasses, you need to buy lenses specific to your prescription.

Okay. You can't wear your prescription glasses?

No. You need to buy some, yeah.

Well, it's a new business, I guess. And it should be the same with the Apple AR headset: they announced that you can't wear your medical glasses, your prescription glasses, either.

Oh, okay. So maybe they'll be selling prescription lenses at a fair price. Yeah.

I think it would be interesting to see if Apple moves this way as well. Because for this Spacetop concept, they have to sell the laptop and the glasses. But with Apple, a lot of people already have an iPhone or an iPad. So if they find a way to connect the two, so you can connect your iPad and work like this in the same way, it could be kind of a success. We'll see with the price as well, which seems to be quite high.

Yeah, about the price, did you see that the very light, small VR headset, I can't remember the name of the company, is now being offered with a kind of leasing solution where you can get the VR headset for $32 per month? It would be a financing solution to get the headset, because it's around $1,500 to buy, so in the range of the Apple AR headset's price. You could imagine a financing solution like when you buy your iPhone, which is a similar price. It could be a way for people to get these devices without spending $3,000 cash upfront.

Yeah. Okay. So, next topic, or do you have anything more to add?

Yeah, I think we're good.

Okay. So Seb, it's your turn.

Yes. On my side, I wanted to talk about something we already discussed in a previous podcast, which is having more tools to create VR environments. This one is about that.
It's a lab that released a video showing a tool that lets you generate 360-degree VR pictures with some control: you can quickly draw where you want things to be placed, then add some text to describe what you want. And then you can see that it uses your shapes to generate a 360 environment following the sketch you sent to the tool. So it's quite a nice improvement on what was already available. There were already tools to generate 360 environments, but without any control except text. Now this tool lets you control the design of your 360 picture. I don't know if you want to comment on that.

Yeah, maybe. I heard that this solution is still free to test. Did you try it?

Not yet, no. But I saw a lot of people using it for different use cases, and it seems to work very well. So it's encouraging.

I guess you should hurry, because there are rumors it will become a paying service in a few days or weeks. Just so you know. But I didn't try it either. As always, I was skeptical about whether this kind of video is real, because it seems too fast to be true. But everybody is testifying that it's a real technology, and I guess this company will get more and more attention. They have been showcasing their solution for a few weeks already, especially on LinkedIn.

Yeah, I guess I'll try to test it as well. Okay. Very interesting. One thing I'm wondering about is the use case, because this is a 360 picture, right?

Yes.

So it's not really usable for VR. I mean, it's usable for static-viewpoint VR, but not for something you can move around in. It's super nice for concept generation and ideation, for building an environment, for creatives. But I wonder if there are direct applications for VR games.

Not yet, no. I think that's missing. There are steps that still need to be done.
Maybe edit a VR environment like that, and then generate 3D models from it afterwards. I guess that's what we are all thinking when we see the video: we think 3D generative AI, and it's not. It's 2D. So yeah, it's impressive, as you said, for showcasing, or maybe for some light rendering if you want to create special reflections and so on. But when we get to the point where this kind of result is in 3D, I guess that will be a game changer for VR. Definitely. I had the same kind of doubt, and I looked twice at the video to confirm it was only 360. But the results are still impressive, and adding some controls is very nice. Because before, you were not really able to control anything; you either randomly got something nice, or most of the time you had to tweak only the text to get there. Do you know if they are using OpenAI, or is this their own AI?

I'm not sure. I can look it up and let you know. We can add that to the YouTube video as well. That's a good question.

Okay. The second subject is about NeRF. We talked about that before, and when we discussed NeRF, we were wondering how to look at it, and we were thinking about using AR, augmented reality, to look at the NeRF you have created. And that's now done; it's available. So here is a sample. In the first example, they show that you can control the camera position inside your shot: instead of having to set each point of the camera path on the UI, you can now use augmented reality to directly control the path you want in your final video. So it's smoother, more natural than something generated with only points on a screen. That's the first example. And here is another example where they show a dinosaur scanned in a museum, and they are able to move inside this NeRF. They are not on location, but it's still impressive to be able to explore the environment this way.
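An editor's aside on interacting with captures like this: a radiance-field scan exposes no polygon mesh, so engines typically approximate physics and interaction with rough proxy volumes around each scanned element. A minimal sketch of such a "rough box" (an axis-aligned bounding box; the function names and sample points here are hypothetical, not from any NeRF toolkit):

```python
from typing import Iterable, Tuple

Point = Tuple[float, float, float]

def rough_aabb(points: Iterable[Point]) -> Tuple[Point, Point]:
    # Axis-aligned bounding box around sampled points: the kind of
    # rough collision proxy you can place around a scanned object,
    # since the radiance field itself has no surface to collide with.
    xs, ys, zs = zip(*points)
    return (min(xs), min(ys), min(zs)), (max(xs), max(ys), max(zs))

def contains(box: Tuple[Point, Point], p: Point) -> bool:
    # True if point p lies inside the box (inclusive bounds).
    lo, hi = box
    return all(lo[i] <= p[i] <= hi[i] for i in range(3))

# Hypothetical sample: a few points from a scanned object
pts = [(0.0, 0.0, 0.0), (1.0, 2.0, 0.5), (0.5, 1.0, 1.5)]
box = rough_aabb(pts)
print(contains(box, (0.5, 0.5, 0.5)))  # True: inside the proxy box
```

The visual fidelity comes from the field; the box only exists so gameplay or UI code has something cheap to test against.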
So I don't know what is important now, but I think it's really interesting that you can generate a 3D model this way and explore it from another place. You can go, Fabien, if you want.

Okay, thanks. Yeah, I really think it's super interesting. There are applications for so many things, actually. Industry: we can imagine recording a video and having someone else take the time to look into it remotely. Museums, as you said. Even for personal use: you could look at your holiday pictures using this kind of technology. So yeah, it's a really nice innovation, I think. I really like it. What about you?

Yeah, I'm really questioning the rendering, because when I tried the latest 3D scan applications with the iPhone LiDAR technology, the point cloud itself was very good, and of course with the NeRF technology it's a very good result as well. But on the rendering side, I'm not sure they are using a classical triangle mesh with polygons. I don't know what kind of rendering they are using, because it seems too good to be true; the meshing part, even for NeRF generation, is not that good. I guess this is an intermediate kind of rendering, called tile rendering: they create a lot of small planes and move them in space so that we get this perspective view. And it looks like this. The issue with that is that you can't interact with this kind of rendering as you would like. It's just for visualization: if you want collisions or interactions with a 3D scan, you can't have them, unless you put some rough boxes around your 3D elements. But despite that, I think this is very, very nice. Do you know if they have an online app for that?

From what I saw, Luma Labs allows that now. And from what I understand, they take the camera position from the SLAM and display the NeRF directly.
So that's why the table looks like this, very natural. There is no 3D meshing right now, so, like you said, there is no interaction. I guess that's a good lesson too: use a mesh only for the collision, and display the NeRF this way. It would be interesting to see what kind of exports they offer, because if you can't do anything except use the app itself, it's quite limited. Seeing your scan in AR is cool, but usually you want to bring it into another application or another use case. So let's see what they do. But do you know if it's a free app, or do you need a subscription?

There is a GitHub available.

So, just a quick thing: it's very funny how the NeRF interpreted the content of the TV as a negative space. It's a funny mistake of the AI.

It's a tough one to resolve, I guess. It's computer vision.

And the last one is a tool using the Photoshop AI technology. It's an Instagram feature that was developed to use different pictures generated with Photoshop AI: you look at a painting on your wall, and by placing your hand in front of some area, you can directly, randomly see different animals and change the scene and the environment. So I think that's a great use of the AI, and I just wanted to share the idea.

Okay, that's great. But once again, we don't have the device for that. It would be great if you had your glasses on in your house and had a whole painting on the wall, for augmented walls. But I can't see how you would move around your house with your smartphone or tablet, just looking at your white walls. It's still the same problem.

Yes, the idea is interesting. With a headset that does mixed reality correctly, it would be much more impressive. But still, I like it.

Yeah, the other news with that, and I think we have mentioned it many times already, is that a lot of AI services are now moving behind paywalls.
So Firefly, the name of Adobe's AI, is now available to Creative Cloud subscribers as a beta version in Photoshop. That's very big news for designers who use Photoshop every day. I'm very curious about the regulation and copyrights behind it. I don't know what images they trained on. Maybe they have been secretly collecting all the Photoshop images since the... No, it's a joke. But yeah, it's interesting to see that AI is now available in the most popular graphics software ever. So I'm very curious to see what's coming up. Right. So that's it for me.

Okay. Anything visual to share?

It's more about what's in the air right now. I don't know if you have seen this, but the latest news is that Meta is getting closer and closer to the Magic Leap company, because they would like a licensing partnership with them. And I think there is too much news related to AR for it to be completely innocent. As we talked about Snapchat opening their engine for retail, Meta did the same. We can see that AR and AI are merging to provide more and more experiences. And now this, with a bigger player being interested in maybe buying AR companies. Granted, we don't have all the information yet to really know what is going on with the Apple headset, for example. We saw that Palmer Luckey tried it, and some experts tried it as well, and it seems like this last iteration of the device is really offering something interesting, I would say. I can't imagine that people at Meta don't have any information about the Apple headset; they would be the first to get the leaks, with some kind of industrial spies, quote unquote. So maybe they are sensing that this device will get attention, and they are preparing for a new emotional wave, like we had with AI and ChatGPT. Maybe we'll have an AR wave this summer, or in September when people get back from their holidays. But yeah, I guess there's something in the air.
There are so many hints right now. And we'd like to have your views on this topic as well, as AR experts.

Yeah, it's very interesting. What I've read is that it's a kind of IP discussion they are working on, plus a manufacturing discussion: Magic Leap would provide some lens IP to Meta, and Meta would use Magic Leap to produce the headsets. So I don't think this partnership will produce a new combined headset, like a Meta Magic Leap. But I totally agree with you that the water is starting to boil. And, well, it seems obvious, but the secrecy around the Apple headset is really heating everybody up. Even us: almost every week we are talking about what it could be. So I hope it's actually something, and that it will be good, at the level of what we expect from Apple. There is also one thing about Meta: it seems like their financial situation is maybe not very good, and doing partnerships is one way of getting back on track. So maybe this is what they are looking for.

Yeah, it could just be a technical partnership for them to improve the next iteration of the Quest, because we saw that the Quest 3 is now official on their marketplace: you can see whether an app is compatible with the Quest 3 or not. But a quick rebound on what you said: what I am afraid of is the Magic Leap effect, when they had all this mystery around their AR headset, and bringing back the same kind of secrecy. One of their mistakes was presenting the whale video, which pushed expectations to the top, and when they released their headset there was complete disillusionment with what they were doing. Apple didn't do that. I guess they are smart about this, because, well, it's our fault if our expectations are so high; they can just say, well, we didn't show anything, so if you are not happy, it's your fault. Yeah, okay, it's a good strategy.
But yeah, we saw all the buzz around Magic Leap back in the day, when people from Google were putting billions of investment into it and everyone thought they finally had the solution for AR headsets. I hope it won't end in the same disillusionment as Magic Leap.

Yeah, I agree. Like you said, I think Apple didn't go to Weta to create fake videos, so that's a better way to go. And I saw this week that Samsung bought eMagin, a company that makes micro OLED displays, and I know they are close to Microsoft for creating the next iteration of the HoloLens. So I guess it's rumors, but things seem to be going in that direction. Like you said, a lot of companies are moving forward and buying other companies to catch up and get new technology sooner than the others. It's interesting to see that everyone is maybe fearing the competition and trying to catch up. And we can see the difficulty of being a tech company right now, because they have to play on several fields at the same time. The big GAFA players, like Meta or Google, seem to have lost track of some technologies, because, as you said, they are buying or trying to acquire technology they didn't invest enough in. I guess it is so difficult to play hard on AI, VR and the metaverse at the same time, especially with the big layoffs lately; they don't have the innovation teams they used to have. Microsoft as well: if they are getting closer to Samsung, I guess they are maybe doing what they did with the mixed reality headsets back in the day, when they licensed those mixed reality headsets and Samsung, Asus and Acer just picked up the blueprints and made their own headsets for Microsoft.
And I guess that's not a bad idea, especially for AR, because the HoloLens, as we saw when it was made only by Microsoft, was just for professionals, and maybe getting this kind of help from other big manufacturers will bring us new AR uses. The HoloLens remains the best at tracking its location in space, so Microsoft can provide all the specifications for the 3D tracking: how to lay out the lasers and the infrared cameras on the device to get good tracking. It could be great.

Yeah, so a lot to say. I guess it's just two or three weeks until the Apple conference, which they announced starts on June 5th. So we'll see if, one, they present the AR headset, and two, whether we get this iPhone effect we were talking about, and how we could afford it, because it's super expensive. I did the calculation yesterday, and for me it would be more than 4,000 Canadian dollars, which is the price of a small car, so I don't know. We'll see. I guess we'll talk about it again in the next podcasts. So, anything more to add for today?

No.

One, two, three. Okay, so it's a wrap-up. Good morning, good afternoon and good evening.

Lost In Immersion

Podcast hosted by Guillaume Brincin, Fabien Le Guillarm, and Sébastien Spas.