Welcome to episode 49 of Lost in Immersion, your weekly 45-minute stream about innovation. As VR and AR veterans, we discuss the latest news of the immersive industry. So, sorry about last week: I was sick and couldn't speak, so no episode. But we are back, and with great news, I guess. Fabien, what do you have to present today?

Yeah, thanks. So, we finally got our hands on the famous Apple Vision Pro. I've had it for a couple of days now, so I've been able to test it in different conditions and with the different apps that are available. So yeah, we can dive right in.

First of all, a lot of the comments we heard in the first reviews said that it was very heavy and that the weight distribution was all in the front, because of the battery, which I have in my back pocket right now. You have this strap that you can adjust with the little dial here. From my perspective, it's actually pretty good; it sits quite well on the head. Of course it's heavy, and if you wear it for five or ten minutes you get this red mark on your forehead. It's a different shape than the Quest, and much smaller than I actually expected; the Quest feels larger in the hands than the Vision Pro. And the build is really impressive: the materials and the way it's put together are typical Apple design and quality. The only thing I would say is that the face cushion attaches magnetically to the headset and comes off very easily. At the beginning I was holding the headset by the cushion, and a few times I got scared the headset would fall because the cushion detaches so easily. So yeah, I'm a bit less critical than I thought I'd be on the weight distribution. I haven't tried it with the other headband yet, so I still need to try that.
Oh, and I didn't mention it, but the battery cable attaches here. Before I put it on, do you have any questions or notes on this?

Yeah. Did you try putting it on and taking it off quickly? Because when we are developing, we are constantly putting the headset on and off, and it's very bothersome when you have to strap it in and make sure it sits right every time. Is it easy? Can the strap be taken off so you just have the glasses available? I guess the magnetic part could be a problem there as well. Don't destroy your Vision Pro just to find out, but you know what we mean.

OK, great. Well, the thing is, I have to hold it, and because everything uses hand control, I think it's better to keep the head strap on.

Yeah, and I guess holding it that way you are blocking the cameras as well, since they are on the lower part. So not the best for developing.

It's not that bad, but yeah. I think it's the HoloLens 2 that has the visor you can flip up and down, and that is obviously much better for this. So what I do is I keep the cover on my desk, and when I put the headset down I just slide it in there, to protect it a bit in case it falls. So let me open it. What I'm doing right now is entering the passcode, because you have to enter the passcode, and let me share my screen.

It's not recognizing your iris to unlock it?

I didn't set that up yet. Maybe it's something I need to do.

OK. So first of all, I think the most important, recognizable, and characteristic thing about this device is the quality of the pass-through video. It's the best I have ever tested, much better than the Quest 3.
As you can see, when I put my hands in front, there is no visible distortion. There are very, very small distortions on the edges of the screens, but overall it's a very good pass-through, very impressive. And if we go into low light, like here where I only have this one light, I don't know if it's visible on screen, but a bit of grain starts to appear, and when I move my head fast the video gets blurry. So it's better quality than the Quest 3, but it's not the perfect one-to-one with reality that we were sold. Especially in low-light conditions, which I would say are pretty common for this kind of usage, you get this bit of graininess, and the blurriness is what bothers me most. I may sound a bit critical, but it's still very, very high quality. So yeah, do you have any notes on the pass-through?

I tested it at Mobile World Congress too, in the same kind of lighting conditions, directly on site at the event, and I had the same experience you are describing. Maybe a note about the field of view: it's narrow when you are used to the Quest 3, you can see that you are losing a lot on the sides. But like you said, what you see in front of you is very nice. In low-light conditions there is some grain, like you said, which is to be expected with this kind of camera. But I feel like the Quest 3 does some readjustment of the colors that apparently isn't done on the Vision Pro, so it gets grainy quite soon. I guess with the Quest 3 you can go lower in lighting before you get that grain effect. But the fact that you don't have the distortion on your hands is amazing.
Yeah, I only see it here, I don't know if you can see, but there are very small distortions on the door, almost unnoticeable. Actually, this is the first time with a headset that, after I removed it and was doing something else, I looked back at the place where I thought a window was, even though I wasn't wearing the headset anymore. That never happened to me before. I don't know if it's because of the quality of the pass-through, or because the UI is so familiar, since I use a Mac as well, that my brain was confused by the same UI. But it was a pretty disturbing experience.

And we should clarify that here we are only talking about the quality of the pass-through, not the quality of the icons and everything virtual inside the headset. That part is perfect; you can't see any pixels.

That was actually a very nice transition, Seb, thanks. My next topic is the quality of the virtual elements, which, as you say, is very, very good. There's a bit of blur as well when you move fast, but let's see. I can show one of the most famous demos, this one with the Formula 1 model. One interesting thing is that by default, and I didn't dig into this yet, I don't see any occlusion. You see it's over my table, but if I...

Only your hand, apparently.

Yeah, so you can see the hands are occluded, and if I go through the 3D model my hand is still visible but faded. And if I move, you can see the quality is super good. I think they are using technology similar to ARKit's light estimation, estimating the lighting of the room, so the integration into the space is actually very good. If I place it where there is no occlusion issue, it really looks like it's here. Even the shadow on the floor.

It's nice. And same here, it really sticks to the place you put it without moving.
Even if you walk around, you don't see any jitter or movement?

I'll try. I'm tied to my computer, so I'll try not to go far. So there is a bit of, how do you say... wobbling? Wobble, yeah, thanks. I don't know if it's visible on the video.

No, it's hard to see.

Actually, I think the tracking of the Quest 3 seems better to me. I'm not sure of it, but it seems to stick better than on the Apple Vision Pro. When you move like this, there is a bit of wobble, and maybe it's due to the quality of the pass-through being so good that you notice it more.

The 12-millisecond latency to process and display the video seems to be so short that even the hand tracking... maybe it's one of your topics, I don't want to jump ahead, but people are saying that hand tracking feels better on the Quest 3 because it has the same latency as the video pass-through, whereas here it's a bit delayed. So all your interactions feel delayed, at least when you are manipulating an object with your hands. Can you grab it and try to move it in space, in real time?

Yeah, so I don't know if you can see the latency here. I think there is a bit of a smoothing filter on this; it feels pretty natural to me, but there is certainly a delay.

How does it feel when you need to do precise placement? Is it easy to handle?

No, it's not easy. I don't know, maybe they have some kind of... yeah, it's gone. Oh, it's here. Nice. So I don't know if it's due to the app itself or the Apple Vision Pro, but there is definitely something a bit difficult about that. Anyway...

We can see that 3D interaction is not that easy. Maybe they're lacking our infamous laser pointer, something you can point at parts with.
You don't have much feedback either when you are getting close to the different parts. It's application related, but you can see the interaction could be better.

Yeah. Actually, maybe it's a good time to mention this; I was going to talk about it a bit later. Again, this is not because of the Apple Vision Pro, it's because of the developers: since everything is manipulated with the hands and the eyes, I was very often in a situation where I had to press the main button to go back to the main menu, because I wasn't able to actually quit the app. For example, if I put it like this, you see the menu to quit the app is hidden behind the 3D model, and I just cannot continue. So again, it's a design problem, not the Apple Vision Pro, but it's pretty disturbing sometimes, and I hit this issue quite often in the applications I tested.

Let me go to the settings. So, a word about the Persona. Hopefully you can see it. When the Vision Pro was first released, some people were saying it's a bit creepy, a bit uncanny. Of course it's not perfect, but I don't know what you think; I find the movement of the eyes and the mouth very impressive. When I scanned it, it picked up details like the details of my neck. So what do you think, how do I look?

You scanned yourself with glasses, right?

No, you choose your glasses afterwards; those are 3D glasses that you select. And the hands are totally fake, of course, but I don't know, it's pretty good.

Well, I guess you don't have long hair, so it's working perfectly. You have the perfect face for the Persona. But yeah, we can definitely recognize you.
And as I said, I understand, but I guess people are a bit picky, because it's a really good result for facial tracking given the time it takes to scan your face. It's really fast, I guess. Can you confirm that? A few minutes to get the Persona in place?

Oh yeah, I would say one minute, maybe two minutes max. It's very quick.

For this kind of scan, getting these results is awesome. I can't say anything against it; I guess it's great.

So, a note about the UI and the user experience of having to look at something and pinch to interact. It definitely takes a bit of training to get used to it. I think we mentioned this in one of our previous episodes: with computers, we are used to looking at where we want to click, and even before actually clicking, we are already looking somewhere else on the screen. Here, you cannot do that. So quite often I clicked on something I didn't want to click, or I had to do it multiple times to get it right. I'm not sure it's really a problem; maybe it's just getting used to the UI.

Yeah, it definitely takes some time to get used to. Oh, and when scrolling like this, I often get stuck in the scroll. You really need to pinch and make large movements to do it correctly; otherwise, it gets stuck in the scrolling state. The HoloLens 2 and the Quest 3 have a similar issue with hand-tracked interaction like that.

And at Mobile World Congress you had another issue: when you let someone try the headset, either you run the option to automatically adjust the IPD and do the calibration, or you don't. And they weren't doing it, because it takes about two minutes per person. So at Mobile World Congress they were just showing their demo like that, without recalibrating for each user.
And when it's not calibrated to you, to select buttons and menus you need to look slightly to the side of the button to be able to interact with it. That's really not enjoyable. Plus, like you said, Fabien, it feels like it constrains you to always think about how to interact, in a way that is not natural, I would say. So yeah, same thing: the constraint of looking exactly at what I want to trigger becomes kind of a pain after a while.

Yeah, totally agree with that. Also, I found that the highlight state of buttons is not that obvious. Oftentimes I had to really concentrate to make sure I was actually looking at the correct button. Maybe it's something they will improve in the future, but I would have liked the highlight state of a button to be a bit more contrasted.

I agree completely. I had the same feeling: when I was trying to look at the side of a button to make sure I was on it, I couldn't easily see which one was selected.

Oh, I forgot to show something. Hmm, I cannot do it, that's funny. During that experience I cannot switch to the immersive setup; I don't know what kind of interaction they expect. Cool. So this is the demo I think we all saw during the presentation of the Vision Pro. Earlier we were a bit critical of the pass-through, but the graphic quality here... I don't know if it really shows on the stream, but it's amazing. Really the best I've experienced so far.

Does it react to you? Are you able to walk around it, or at least to the side?

So yeah, there is definitely depth, but I cannot go in. And something I didn't show: there is a warning when you get too close to an object. Move back, you're too close.

And does it also warn you when you are in low-light conditions?
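The look-then-pinch mistiming described above has a classic mitigation: instead of resolving the target at the exact instant the pinch lands, resolve it against gaze samples from a short window just before the pinch, so a user whose eyes have already darted away still hits the intended button. The sketch below is a toy model of that idea in plain Python; none of these names are visionOS APIs, and the lookback constant is an assumption for illustration.

```python
from collections import deque

class GazePinchSelector:
    """Toy gaze-and-pinch selection model. The pinch resolves against a
    gaze sample taken slightly BEFORE the pinch, tolerating the common
    pattern of glancing away just as the fingers close."""

    def __init__(self, lookback_s=0.15):
        self.lookback_s = lookback_s
        self.samples = deque()  # (timestamp, target_id), oldest first

    def on_gaze(self, timestamp, target_id):
        self.samples.append((timestamp, target_id))
        # Keep only the last second of history.
        while self.samples and self.samples[0][0] < timestamp - 1.0:
            self.samples.popleft()

    def on_pinch(self, timestamp):
        # Most recent gaze sample at or before (pinch time - lookback).
        cutoff = timestamp - self.lookback_s
        chosen = None
        for t, target in self.samples:
            if t <= cutoff:
                chosen = target
        return chosen

sel = GazePinchSelector()
sel.on_gaze(0.00, "play_button")
sel.on_gaze(0.90, "play_button")
sel.on_gaze(0.98, "close_button")  # eyes dart away just before the pinch
print(sel.on_pinch(1.00))  # → play_button
```

With a zero lookback the same pinch would select `close_button`, which is exactly the misfire complained about in the discussion.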
Is there a warning? So, I had a tracking failure when I was in low light, but I didn't see any low-light warning. Yeah, what I wanted to show is... yeah, this. Oh, it's super dark. Let's see.

Is it because it takes time to adjust to the light while you're moving?

Oh, yeah, maybe. Okay. So, by using the... I don't know how you say this. Knob? The Digital Crown, yeah, thanks. You can change the immersion level you are at.

Oh, and something else I wanted to talk about. I won't be able to do the full demo, but I can show you the UI: there is a guest mode. The Apple Vision Pro is mostly a personal device. If you hand it to someone, as you mentioned, Seb, the eye tracking will not work and it will be a very painful experience. So they built this guest mode: you choose the apps the guest is allowed to use, and when you hand the headset over, it starts with the calibration experience. It calibrates the hands, then you do the eye calibration, and then you can use the headset. As soon as you take it off, it goes back to the owner's settings. So yeah, it's a personal device, or a private-demo kind of device.

And how long does it take to set up a guest, do you know?

Let's see if we can do it, because it's screen mirroring; hopefully the mirroring will stay. Let's see if it comes back. No, it didn't. Well, anyway, here I'm doing the calibration. It asks me to show my hands, and then you have to look at dots and pinch to calibrate the eyes. It goes first through a dark mode, then a brighter mode, with five dots to calibrate. Six, sorry. Okay, no. Once more. And it's very funny: once the eye setup is done, you can see the view change as the calibration gets applied to the pass-through. Okay, so here it is. I am in guest mode, and I can use the apps as a guest.

So not that long, when you know the process.

Yeah. Okay, and as you can see, I have the usual red mark, the post-VR-headset look.
Anyway, so yeah, it's a very impressive device. I know we are usually very critical here, but still, it's one of the best pass-throughs, with amazing virtual-content quality. We didn't talk about the number and type of apps that are available for now. It's still mostly spatial computing experiences: you have something in your space and you can interact with it, and a lot of the experiences I tried are that type. There are apps where you can place sounds in your space, but they didn't actually stick to the places I put them; when I took the headset off and put it on again, the sound was in another place. I don't know, I didn't explore more.

And one last thing I forgot to mention: so here I was in my space. If I leave a window here, for example the Messages window, and then I go into another room, from the other room I can see the back of the window. I really haven't made up my mind on that yet. I don't know if it's good, because you know where your windows are, or if it's a bit weird not to have occlusion. I'm not sure yet.

I'm going for the weird side. And yeah, I don't know if you saw, earlier on the Quest 3 there was a release of a Unity tutorial, a sample application of an escape game that you can place and play together with someone else. There is an automatic room calibration, where objects are positioned in your room against the walls you have scanned, and they do the occlusion: they use a different rendering when something is behind a wall, so you only see an outline, like in standard video games. And when you go inside the room, then you see the furniture or whatever is inside.
And seeing the way the Vision Pro calibrates and scans the room, it should be feasible, so maybe they will update the OS to handle that. But like you said, I feel it's weird when you see the back of a window floating in another room and moving strangely because it's behind a wall. In 3D, it doesn't even make sense to the brain.

Yeah, that's a very good point: the absence of occlusion makes things look a bit weird sometimes. The brain knows that something is a bit wrong. Yeah, sorry.

Did you try Xcode with this? Is the developer mode easy to access, or is it like some older iPhones or iPads where it's a pain to get developer mode activated?

That's a very good point. I didn't try that yet; I need to try it.

I guess it should go through Unity as well. But yeah, I'm curious to know whether you can access it easily or not, because at this stage it's really more of a showcase device than a developer one. So if you can keep us posted on this, it would be great. It would be great to know if it's as easy as plugging a cable into the battery from your laptop and streaming the Unity window directly to the headset, so you can iterate like that before building.

Yeah, this is clearly the kind of information we don't have yet. Most influencers and YouTubers showcase the device as we did, but it would be great to go further and look at the developer side, which is the darker one.

Yeah. Like the HoloLens at the beginning, it was a nightmare, so there were not a lot of apps developed. That's really blocking when a headset is hard to develop for. I think the real success of Oculus is that they do a lot of work on their SDK, so you can easily take what they built in R&D and apply it directly to your game.

One last thing that I forgot to mention is that there is an app.
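The outline-when-behind-a-wall rendering described for the Quest 3 escape game boils down to a per-object visibility test: is any scanned wall between the camera and the object? A real engine would test against the room mesh and switch materials (or use a stencil pass), but the core decision can be sketched with a simple segment-versus-plane check. Everything here is an illustrative assumption, not the actual sample's code.

```python
def segment_hits_wall(cam, obj, wall_point, wall_normal):
    """True if the straight segment from cam to obj crosses the wall plane.
    Walls are modeled as infinite planes (point + normal) for simplicity."""
    # Signed distances of both endpoints to the plane.
    d_cam = sum((c - w) * n for c, w, n in zip(cam, wall_point, wall_normal))
    d_obj = sum((o - w) * n for o, w, n in zip(obj, wall_point, wall_normal))
    return d_cam * d_obj < 0  # opposite sides => the segment crosses the wall

def render_style(cam, obj, walls):
    # Outline-only when any wall sits between the camera and the object,
    # full rendering otherwise (the behavior described for the escape game).
    occluded = any(segment_hits_wall(cam, obj, p, n) for p, n in walls)
    return "outline" if occluded else "full"

# Camera in one room; a window left behind in the next room (wall at x = 5).
walls = [((5.0, 0.0, 0.0), (1.0, 0.0, 0.0))]
print(render_style((0, 1.6, 0), (8, 1.5, 2), walls))  # → outline
print(render_style((0, 1.6, 0), (3, 1.5, 2), walls))  # → full
```

This is also exactly the check that would let a system hide, or outline, the "back of a window floating in another room" that both hosts found disturbing.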
It's a kitchen-building app, and I think the model is pretty heavy, maybe a bit too heavy for the Apple Vision Pro, because there was some lag. Again, it's not the Vision Pro's fault, it's the app developers'. And the fans started to spin; they are audible, you can hear them, but I didn't find it too bothersome.

Oh, and I didn't mention the sound either. I didn't show it because I'm sure YouTube would take down our video, but there are VR concerts that were recorded specifically for this kind of experience. You get very close to the singer, you are part of the choreography. It's a 180-degree experience, not 360. On the one that is free to watch, the chroma key is a bit... I don't know, maybe it's our eyes, but I could see it was done with a chroma key. A bit strange, but still a very, very impressive video. Very good quality, nice sound as well.

Okay, great. So, one thing you could test, since you mentioned a lag issue, and I know we did this for the HoloLens 1 and 2, is to measure the number of triangles you can display in one frame, to find the triangle limit of the device. It would be very interesting to know this number. It was very low for the HoloLens 1, a bit better for the 2, but we know these kinds of devices, especially when doing video see-through, are very demanding. So I'm curious to know if we have to optimize as aggressively as we are used to with this kind of device. I have to mention that all the other headsets that do pass-through right now don't have a specific processor for it, so each time I do a mixed-reality development, I really have to target an even lower polygon count than what we can now render on mobile phones. That's still really limiting, and the next generation of Qualcomm processors will surely improve it.
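The triangle-budget test described above can be framed as simple frame-time arithmetic: a target refresh rate fixes the per-frame budget, fixed costs (tracking, pass-through composition) eat part of it, and whatever remains divides by the per-triangle cost. The constants below are made-up illustrations; a real test, as the hosts did on HoloLens, renders progressively larger meshes on-device and watches for dropped frames.

```python
def max_triangles(frame_budget_ms, per_frame_overhead_ms, ns_per_triangle):
    """Rough triangle budget: triangles that fit in one frame after fixed
    per-frame costs are paid. All timing constants are hypothetical."""
    spare_ms = frame_budget_ms - per_frame_overhead_ms
    return int(spare_ms * 1_000_000 / ns_per_triangle)

# A 90 Hz display leaves ~11.1 ms per frame; assume 6 ms of fixed cost
# (pass-through composition, tracking) and 50 ns of GPU time per triangle.
budget_ms = 1000 / 90
print(max_triangles(budget_ms, 6.0, 50))  # → 102222
```

The point of the exercise is the shape of the result, not the exact number: on video see-through devices the fixed costs are large, so the usable triangle count falls well below what the raw GPU throughput would suggest, which matches the aggressive optimization the hosts describe.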
Or we'll move to Wi-Fi 7, to have better bandwidth for streaming content directly from a PC. But right now it's a huge pain in terms of optimization on the 3D side, and it takes a lot of time. So, that's it from Fabien, I guess. A quick news item for us, Seb?

Yes. So, I went to Mobile World Congress last week, and I will try to gather all the information, but I quickly wanted to talk about this company, Expansio, which is working on a contact lens for mixed reality. What is interesting is that it's not there yet; clearly there are still a lot of things to work through before this works. But they already have some working prototypes that measure health information about the user, the kind of information you usually go to a specialist to get. There are three or four different working prototypes already able to provide information like that. What I found very interesting is that they are not focusing only on displaying information, but also on gathering it. They were even mentioning that through the eyes they should normally be able to get brainwave information, and so be able to trigger things directly from your eyes in whatever you want to render. But it's a long way before we have anything we can wear. Even the battery and the way to recharge the lens are very complex; they are thinking about using the eye movement itself, things like that. And they had a kind of prototype, but what they were showing is mostly the same system as in the Magic Leap or the HoloLens, a laser beaming directly to the contact lens, and you put your eye in front of it. So it's the same principle in a much smaller form factor compared to the glasses we have with the HoloLens 2. I don't know if you want to react to that.

It's interesting. I think, if I'm correct, the eyes are directly connected to the brain.
So I guess that would explain why they can also get some brain information. It's interesting to see what the future will look like, but as you said, I think as well that we are pretty far from it.

Yeah, that's it for me on this one. I don't have any more questions about it.

Right. I also wanted to talk about this company, SenseGlove, which works with HTC and provides nice force-feedback gloves that you can wear. Here, it was connected to the HTC Vive XR Elite. The form factor is quite small, quite easy to wear. Of course you need different sizes of gloves, but it's already made for different hand sizes; it's mostly the glove itself that changes, and you put the small sensor unit on top. It locks to your hand and gives feedback in the palm, so when you are holding something it really feels like it's inside your hand, because it slowly squeezes your hand like this. And they say that with the small cords on top of your fingers, they are able to simulate holding something like a shopping bag in your hand: you can still close your hand, but with the pressure here and the feeling on your fingers, it provides a force feedback that really tricks your brain. And it's wearable. You just have a big battery on top of your hand, but you can really move around; you are not attached to any cable, unlike the previous gloves of this kind that I tried. So yeah, it's nice to see that it's progressing and starting to be usable without a huge platform and calibration. The only thing they need to work on is supporting the Vive Ultimate Tracker, I think. Right now they're using the wrist tracker, which uses the Lighthouse system, so that's still a bit of a pain to install, and I guess with a standalone VR headset it's even harder to configure.
But as soon as they get the Ultimate Tracker working with it, I think it will be really easy to deploy. So, do you have any feedback on that?

Well, you're mentioning that the battery is on your hand, and then you'd have the Ultimate Tracker added to it, so it's becoming quite big and heavy, as usual. I mean, with haptic gloves, when you add everything needed to be completely immersive, even if the design looks light and small, in the end you have something quite bulky. I don't know if we'll see a real solution at some point, but my feeling is that we are still going in circles. We're adjusting the cables and the force feedback, which is better than what we had in the past, but we still have the same problem. It's very interesting to see that we can't really solve the issue with haptic gloves; we still end up with something big and heavy. I don't know if someone will find the solution at some point, but anyway, it's interesting to see there's still improvement in this area.

I think the Vive Ultimate Tracker has pogo pins on the back, so you can draw some power from there. Maybe they could reduce the size of the battery they have right now by using the Ultimate Tracker.

Okay. But yeah, I saw that Meta is still working on the wristband, saying they will have a prototype, well, a working device for consumers in 2026. So, in two years. They've been talking about that for a while, so I guess finding a form factor that works for every arm size and so on is tough.

Yeah, like the Myo wristband that was announced some ten years ago. The idea is great, but it seems the technical difficulty of making it work is beyond expectations. I was surprised by the announcement that it will take two more years, as they have been working on it for quite some time now.
On the other hand, they are announcing an Apple Vision Pro competitor in a few months. I guess it's in June, or they will announce it at Connect. So, I'm very curious to see what they've been doing in this very short period of time; it's only a year and a half since the announcement of the Apple Vision Pro. Very interesting to see what they can come up with. And the name would be Quest 2 Pro. Quest Pro 2, I think. So, more of a business device, I guess. Okay, and that's it for me.

Okay. So, just to finish, some quick news here. We all saw, I guess, the controversy around Google these last weeks: they announced Gemini and their ChatGPT-like models. But the first thing is that now, in short, you can prompt your way to video games. They trained their model on 200,000 2D platformer games. So, first of all, there is the question of copyright and all the rights. I'm not sure the 200,000 creators were aware their games were used for AI training, but that's another question. It's very interesting to see that we can now generate something interactive, with physics-like behavior and the controls you'd have in a 2D game. For now it's still very basic, but it's really interesting. We saw Sora by OpenAI for generating video from a prompt; it's not released yet, so we don't have much information about it. But this one seems quite interesting too. Just to quote the Immersive Wire, though, we are far from being able to generate VR games, for example, because we would need a database as huge as this one, meaning 200,000 different VR apps, and I'm not sure we have that kind of database right now. But it's very interesting to see that AI is going everywhere and that we can do interactive things with it now. So, I don't know what's your take and what's your feedback?
I'm guessing that, of course, it's interesting, but what do you think the future of this technology could be?

It's interesting to see that there are two approaches, maybe two ways of thinking, in the AI world. One is to create more and more specific models that are very, very skilled at one task, and the other is the AGI dream of creating an all-powerful AI. It's interesting to see these two competing strategies. Which one is better? I don't know, but it's interesting to follow. Seb, any last words?

Yes. I was looking at the news, and I see that Genie is also able to generate video games from hand-drawn sketches, applying standard interactions to a drawing, which I think opens up even more possibilities. So yeah, it's really interesting to see how many new users coming into development will be able to create their own worlds and their own applications this way.

Yeah, I saw an interview with NVIDIA's CEO, and he said the miracle of AI is that everybody is becoming a programmer.

A junior programmer, for now.

Okay, so on these very wise words, I guess this is it for today. We'll be back next week for another episode of Lost in Immersion, and maybe we'll have some more feedback and news about the Apple Vision Pro as well. So, see you guys next week, and have a good night. Thanks. Bye. Bye. Bye.

Lost In Immersion


Credits

Podcast hosted by Guillaume Brincin, Fabien Le Guillarm, and Sébastien Spas.
Lost In Immersion © {{ year }}