And we are on, yes! Welcome to episode 11 of Lost in Immersion, your weekly 45-minute stream about innovation. As VR and AR veterans, we will discuss the latest news of the immersive industry. First of all, sorry for the delay, we had some technical difficulties, but everything is set up now. So, Fabien, if you want to start, please.

Yeah, thanks. So, today I want to talk about, and test in real time, the new game from Niantic, the studio behind Pokemon Go. It's called Peridot. So, I will share my screen. Okay, let me know if you can see it. It's an AR game, and the concept is really like the next generation of Tamagotchi. You have this very cute avatar, and each avatar is unique. You give it a name, and it has some characteristics. Mine is very timid, which is funny. And so you can feed it, and it can do some gestures, like that. So, Po, you can give it its food when it does the gesture that you trained it to do. I won't go into all the details, but it's very, very similar to raising a Tamagotchi.

Something that I found very nice in this game is that, I don't know how they call it, but you can draw a circle here, and the pet will fetch some food or some objects and stuff like that. And what is very nice is that they have some AI in the experience that can recognize what you are looking at. For example, if I dig on a plant, it recognizes that it's a plant, and I get special items that match what I dug into. You also have some quests: you need to show it a human, you need to show it a pet, you need to go on sand, on grass, and stuff like that. So I think it's a very nice usage of AI, of image recognition, in that game.

So yeah, I'm going very quickly on this, but that's basically it. You have a map, you can see some friends. Yeah, I don't know why, but when I tested before, I already had some people around me. Sorry, here, for example. So you see, the game was released today, but some people are already very far along in their pet raising. So that's really nice. And the other feature that I cannot test, because it seems it's not available in Japan where I live, is that there is merch. It's a partnership with Amazon, and you can buy merch with your Amazon account directly in the game. You mean digital or real merch? I think it's real merch, yeah. Like t-shirts with your pet on them? I don't know. I didn't look into it, because it's not available in Japan, so I don't really know, and I couldn't find any information about it, except that there is some merch. So I'd be curious to know exactly what they have.

Okay, so yeah, I think I did a very quick tour of all the features, but I'm curious what you think as a first impression. Isn't it cute? Yes, it is. I like the first evolution; I'm not so fond of the next one, when you see the oldest pets. It's like a big creature. It reminds me of a PC game where you started by creating your first cells and then it grew into a much more complicated alien. The first versions are always better, or cuter, than the latest ones. So Seb, if you have any questions? No, no questions. I saw a couple of other videos where a user near a pier makes the pet appear on a boat next to him, and there is occlusion. So yeah, in terms of augmented reality experience, I think that's really nice.
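To make the surface-recognition mechanic described above a bit more concrete, here is a minimal sketch of the kind of logic involved: an image classifier labels what the player is digging into (plant, sand, grass, and so on) and the game picks loot from a matching table. The classifier stub, label names, and loot tables below are hypothetical placeholders for illustration, not Niantic's actual implementation.

```python
import random

# Hypothetical loot tables keyed by the surface label returned by an image classifier.
# The labels and item names are illustrative only; Peridot's real categories are unknown here.
LOOT_TABLES = {
    "plant": ["leaf snack", "flower seed", "berry"],
    "sand":  ["seashell", "driftwood", "sand toy"],
    "grass": ["clover", "dandelion", "grasshopper treat"],
}

def classify_surface(camera_frame) -> str:
    """Placeholder for an on-device image classifier running on the camera feed.
    In this sketch it always answers 'plant'."""
    return "plant"

def dig(camera_frame) -> str:
    """Return a loot item that matches what the pet is digging into."""
    label = classify_surface(camera_frame)
    table = LOOT_TABLES.get(label, ["pebble"])  # generic fallback for unknown surfaces
    return random.choice(table)

if __name__ == "__main__":
    print(dig(camera_frame=None))  # e.g. "berry"
```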
It really takes the environment into account, and I guess they used VPS, the visual positioning system, to position the pets correctly and accurately even in a specific space, a specific environment that has been scanned, with, like I said, occlusion and stuff like that. So I think that is pretty impressive. I also saw some ideas about having the pets interact with each other if someone else comes near you. So that sounds like a better experience than just a Tamagotchi that sends you a notification when it needs to be fed or stuff like that. That's not the kind of game I'm usually fond of; I don't see the added value or something that would help people interact with each other. But the idea that you can make your pet meet another pet and maybe do stuff together with the pet you have trained could be interesting.

Okay, so I have a question. Can it die if you are not feeding it, like the Tamagotchi ones? Yeah, that's a very good question. I mean, the game has only been out for one day, so I don't know if they can die in one day, but that's a very good question. Because that's the whole point of this kind of game. I mentioned Creatures, the old game where you had eggs. I don't know if you remember it. You bought the game, it was a PC game in the 90s or so, and it came with a specific floppy disk. On this floppy disk you only had three eggs, and once your creatures died, your game was just garbage because you didn't have any eggs left. So it was kind of funny to see that your game had an expiration date if you were not taking care of your creatures. And Tamagotchis die as well, so it would be interesting to know if this is integrated here too. That was one of my questions.

And the other one: you talked about occlusion, but does it have collision? Can it hop on furniture? You have a counter here, can it go on this? Yes, in the video the user was showing that too. It was jumping from the pier next to him to reach the boat, and you can see that on the boat the animal was smaller and accurately positioned. Even the distance from the user was displayed, so you can see that the distance was correct. I guess it's a famous place in Los Angeles where there's an army boat on a pier that is always there, so I think it has been scanned, and therefore they know the 3D environment at this location. They have a 3D scan of the environment so they can place everything correctly. Now, at your own location, where it's using only SLAM, I don't know how accurate it would be.

Great. This is a great evolution since Pokemon Go; we can call it an augmented reality game, which Pokemon Go was not that much. Don't you think that by creating those quests, like the ones you talked about, going to the beach and finding objects, they have a smart way to make their algorithm learn more and more situations, objects, and textures? They are auto-feeding their AI with that. I will be very honest and say that I did like everybody else: I said yes, yes, yes to everything that was asked when I launched the game. I don't know what the privacy terms and conditions are here, but it's very probable that they are using the information we are giving them to learn more. That's a strong probability. Okay, great. Maybe we will have some updates on your pet growing up in the following weeks.
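On the occlusion point discussed above, here is a minimal sketch of the generic depth-test idea behind such effects: if the real environment (known from a prescanned mesh or a depth sensor) is closer to the camera than the virtual pet at a given pixel, that pixel of the pet is hidden. This is a general AR technique shown under simple assumptions, not a description of Niantic's actual pipeline.

```python
import numpy as np

def occlusion_mask(env_depth: np.ndarray, pet_depth: np.ndarray,
                   bias: float = 0.02) -> np.ndarray:
    """Return a boolean mask of pet pixels that should be drawn.

    env_depth: per-pixel distance (metres) to the real environment,
               e.g. rendered from a prescanned mesh aligned by VPS.
    pet_depth: per-pixel distance to the virtual pet (inf where the pet is absent).
    bias:      small tolerance to avoid flickering at near-equal depths.
    """
    # Draw the pet only where it sits in front of the real geometry.
    return pet_depth <= env_depth + bias

if __name__ == "__main__":
    env = np.array([[1.0, 1.0], [0.5, 3.0]])   # a railing at 0.5 m occludes one corner
    pet = np.array([[0.8, np.inf], [0.8, 0.8]])
    print(occlusion_mask(env, pet))
    # [[ True False]
    #  [False  True]]
```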
It could be great if they linked that to garbage collection, where a user needs to grab garbage to feed their animal and throw it in the correct trash bin. There is a red one, we have different colors here: the blue one for all the recyclables, and we have the green one and the purple one for the organics. I don't know if you… It could be a good idea, yeah.

Okay, so Seb, if you want to share your subject as well. So, I had two subjects today. One is… Yeah, it's on. Okay. The first one is about… We were talking a couple of weeks ago about haptic gloves that use an air compressor to create the actual force feedback on the user's hand, and we all agreed that the setup is huge and too heavy and there are not a lot of use cases where it can be used. Now there is a new kind of glove coming up, the Bifrost Pulse VR gloves. It's very small, without cables, everything inside the glove. With actuators, it constrains your hand when you're holding an object and also lets you feel the kind of material the object is made of: if it's something flexible or hard, the glove will change its behavior. There are also vibration motors inside the fingertips, so you can feel some vibration in the last part of your fingers. So I think the device is quite nice. It's a Kickstarter, so right now there is not a lot of feedback on how well it performs, but the device is way smaller than what I've seen before. It's very nice and makes me want to try it. So I don't know if you have any feedback on that, if you saw the news, or what your thoughts are about it.

That's new, but… Do you have the video? Because it's on pause. Stop. Okay, great. Because I couldn't see the link between the actuators and the fingers, so I didn't understand. It's a kind of small cable that is constrained by the motor. Okay, so it's a smaller version of the cables that go from the wrist, with some mechanical architecture on top. Yeah, okay, great. And it's using a battery too, so there is no cable, and you're freer in your movement, which is something that really constrains you when you are tethered in VR; it breaks the visuals and the immersion. So yeah, I think it's great that this is coming out, because haptic feedback is really the missing part. It's something that I really miss when I'm doing VR experiences. And it can just break your finger that way. No, I don't think so.

So yeah, that was the first one. And the other one… I don't know if Fabien has something to say about this. One question: is it only the fingers? Is it moving the fingers, or is there a difference when you touch, for example, a surface or different materials, like wood or stone? Can it give that sensation as well, or is it just pressure, like catching something? Yeah, it's tension, pushing or something like that. For something that is soft, I think you will feel that, and it depends on how soft it is. So you will be able to feel the pressure getting harder and harder if you press on a volume, an air volume. You can feel that it's soft at the beginning, but it firms up when you squeeze it. I don't think it will push your finger backwards if something is moving or hits your finger, for example, or if you slap a wall, to make it feel hard.
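To illustrate the stiffness behaviour discussed above, here is a minimal sketch of how a glove controller might map virtual material softness to actuator resistance: a spring-like force that grows as the finger compresses the object, clamped to whatever the hardware can deliver. The stiffness values and the 5 N cap are made-up numbers for illustration, not Bifrost Pulse specifications.

```python
# Hypothetical per-material stiffness in newtons per metre of finger compression.
MATERIAL_STIFFNESS = {"sponge": 50.0, "rubber": 400.0, "wood": 5000.0}

MAX_ACTUATOR_FORCE = 5.0  # N, made-up hardware limit for the finger actuator

def finger_resistance(material: str, compression_m: float) -> float:
    """Spring-like resistance felt by one finger.

    compression_m: how far the finger has squeezed into the virtual object (metres).
    Soft materials let the finger sink in with little force; hard ones
    saturate the actuator almost immediately, which reads as 'solid'.
    """
    k = MATERIAL_STIFFNESS.get(material, 1000.0)
    force = k * max(compression_m, 0.0)      # Hooke's law, F = k * x
    return min(force, MAX_ACTUATOR_FORCE)    # never exceed what the hardware can do

if __name__ == "__main__":
    for m in ("sponge", "rubber", "wood"):
        print(m, round(finger_resistance(m, 0.01), 2), "N at 1 cm of squeeze")
```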
It's just when you get your code wrong and you have a bug. Yeah, but I don't think the amount of force will be that high for something embedded inside like that. Well, with actuators, I don't know, sometimes we can be surprised by the force they can deliver. But yeah, this is always a question with these devices: if someone very strong uses it, can the force that he or she applies break the motors? Is it fragile or not? That's the biggest question about this device, I guess. Yeah, that's right. And the amount of power it consumes: if you have to recharge it every five seconds, that doesn't make sense either. We'll see when it comes out and users test it, and we'll see the feedback. Yeah, and if it comes out, because it's a Kickstarter campaign. Do you have the price? They announced $299. Oh, so it's not that cheap, not that expensive. No, and that's for the two gloves. Okay, great. Great.

All right, so that's it. And the other one is the preview of a game that is in development in France, from a company called… I forgot the name, sorry. Drama, sorry. They got a lot of feedback this week. They are using Unreal Engine and they did something a bit different.

Seb, you have an issue with the sound, it's echoing. I don't know if Xavier has the same issue. No, I don't hear any echo. Okay, let me check something. Okay. Okay, we're seeing your video, just switching to this. Okay, great. You can speak now if you want.

Yes, so that game is in production, but they released a video last week and there was a lot of buzz around it. The first thing people said was that it was fake, a video actually recorded by a real person, so they had to make another video showing the footage inside the Unreal Engine editor to demonstrate that it was a real video game and not something they shot in real life. What I think is interesting, and what makes everything so realistic, is first of all that it's Unreal Engine, but also the fact that they changed the usual approach: in video games you usually have the camera on the head of the virtual player, but here they did a body cam, something more realistic, more like what you are used to seeing in movies, and they removed all the UI. So you really feel like you are watching a real camera mounted on someone. The other complaint they got is that it's too realistic and scary, and people are afraid that, as always with video games, it will give people ideas to reproduce it.

It's a bit surprising that people are not okay with the fact that it's too realistic, because this has been the goal for many, many years. And once you reach that goal, people say, oh well, no, actually that's not what we want, it's too realistic. Yeah, I agree. It's funny to be at that point and then realize that might not be what users want, at least not everyone. Maybe some people would like it to be realistic. To be honest, when I saw the video for the first time, I was also a bit suspicious, because there was already some kind of demo two or three years ago that was based on 3D scans as well, and it was really very realistic, and the developer just disappeared right after posting the video. I don't know if it's the same person or another team. But yeah, it got mainly the same comments: that it was fake, that it was too real, that it was scary.
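On the body-cam point: one common way to get that "camera strapped to the chest" feel is to let the camera lag behind and smooth the player's motion instead of locking it rigidly to the head. The sketch below is a generic exponential-smoothing filter on the camera yaw, offered purely as an illustration of the idea; how the studio actually implements its effect is not known from the episode.

```python
import math

def smooth_yaw(camera_yaw_deg: float, target_yaw_deg: float,
               dt: float, lag: float = 0.25) -> float:
    """Exponentially move the camera yaw toward the player's yaw.

    lag is a time constant in seconds: larger values make the body camera
    trail further behind quick head turns, which looks like worn footage.
    """
    # Work on the shortest signed angular difference to avoid 359 -> 0 degree jumps.
    diff = (target_yaw_deg - camera_yaw_deg + 180.0) % 360.0 - 180.0
    alpha = 1.0 - math.exp(-dt / lag)
    return (camera_yaw_deg + alpha * diff) % 360.0

if __name__ == "__main__":
    cam = 0.0
    for frame in range(5):          # the player snaps to 90 degrees, the camera catches up
        cam = smooth_yaw(cam, 90.0, dt=1 / 60)
        print(round(cam, 1))
```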
Of course, they didn't choose a very enjoyable environment to do it; it's a very creepy one. But maybe if they did something greener, with more light, something more lively or funny, maybe the comments wouldn't be the same. What do you think about this, Fabien? Yeah, I think it's a really, really impressive result. And if we dive in a bit technically, what technique are they using? Is it a scan of a real warehouse that they use in Unreal Engine, or is it 3D modeling? What's behind it? I'm a bit curious about that. But yeah, it's really impressive.

I think that's the idea: they scan an environment, then they remodel it and adjust the result in Unreal Engine to make it nicer, because even when you have a point cloud or a scan of an environment, you still have to redo some stuff in Unreal. But that covers the floor, the different shapes and things like that; the dust and particles and everything that is inside, like the barrels and stuff like that, I think those are assets you can buy directly from Unreal. That's already available.

Yeah, Unreal is really pushing their reality-scan company that they bought several years back, and they are creating this huge library of 3D-scanned assets and environments. They are really, really pushing in this direction, and their algorithms work great, so this is the kind of result we can expect from their technology right now. And on top of the assets there are all the LODs and things like that, which are usually made by hand in other tools. But here, because you use their library, everything is embedded, so it becomes a lot easier to build a large environment with a lot of polygons, because all the assets you put inside your environment already take the different LODs into account. So it simplifies the process for an artist.

Okay, so it's my turn. Okay. And here we are. Yes, perfect. So my subject for today is a video that appeared two days ago, and the idea behind it is: what can ChatGPT do if we ask it to recreate Beat Saber in Unity? Sorry for the sound. So here is the result at the end, just to show you what it did. It's very much a Beat Saber clone. Well, the name of the video is a bit clickbait, to be honest, because it's not ChatGPT that made the whole game all by itself. The process is that first he asked ChatGPT how to create this game, and it gave all the steps to achieve that. Then the creator, Valem, came back to ChatGPT every once in a while when he had technical difficulties. So he used ChatGPT more like a smart companion to solve his development issues. And it's funny, because he asked about scripts and shaders and all kinds of different technical difficulties that you can run into while developing this kind of application. What is very funny is that at first ChatGPT sometimes simply said that it couldn't answer the question, but by asking again and again, it finally gave the right answer. So it's funny to see that if you push hard enough on ChatGPT, you can get the answer you want. But I think the main idea here is that at this point ChatGPT could be a very good companion for juniors or people who are just starting to develop, because you still need to have the whole project in mind and know what you are looking for. It's not a magical black box.
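Since LODs come up here, a minimal sketch of what "the assets already take the different LODs into account" means in practice: the engine swaps to lower-polygon versions of a mesh as it gets farther away (or smaller on screen). The distance thresholds below are arbitrary example values, not Unreal's actual defaults.

```python
# Example LOD distance thresholds in metres: index 0 is the full-detail mesh.
# Real engines usually switch on projected screen size rather than raw distance,
# but the idea is the same.
LOD_THRESHOLDS = [10.0, 30.0, 80.0]  # beyond the last value, use the coarsest LOD

def select_lod(distance_m: float) -> int:
    """Return the LOD index to render for an object at the given camera distance."""
    for lod, limit in enumerate(LOD_THRESHOLDS):
        if distance_m <= limit:
            return lod
    return len(LOD_THRESHOLDS)  # coarsest mesh for anything very far away

if __name__ == "__main__":
    for d in (5, 25, 60, 200):
        print(f"{d:>3} m -> LOD {select_lod(d)}")
```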
It's not like you can say "I want a game doing this and that" and the AI creates it in Unity. You still have to manage the whole project and decide when you need the AI or not. So this is very interesting, because for newcomers and people who are willing to try Unity, I think this is a very good and fun way to learn and create great applications. I think the next step now is to see what AutoGPT can do, because if ChatGPT can handle the step-by-step technical things, we can hope that AutoGPT could create the whole project all by itself. It has to be tested; I think creators are already on the subject, and maybe we will see another video about this in the following weeks. So here it is. What do you think about this?

I have a lot of things to say. First, I agree with you: ChatGPT and other generative AIs are very good companions, very good assistants. When you have an issue, when you need a few more ideas, when there is a problem, they are very, very good assistants. Then I want to nuance things a bit; well, it's my opinion, but it's really about the title of the video and especially how it's shared and seen on social networks. The creator of the video is a VR developer, so he knows what questions to ask, and he actually shared all the prompts that he gave to ChatGPT. I had a look at a few of the questions, and there are questions where he already knows… The answer, yeah. He already knows the answer. He already knows what questions to ask, and he just needs some help with the details. So I want to nuance it a bit: it's not a revolution that will make all VR developers obsolete. It's a very good assistant, as you said.

And something that I've seen, it's a bit of another topic, but Unity is teasing some kind of AI generation tool inside Unity. I don't know if they are using OpenAI, I mean GPT models, behind it, or their own models. I'm very curious to see what they will bring to the table with that, and when. And something else is that GPT-4 was trained… I mean, what I want to say is that Beat Saber was already out when ChatGPT was trained. So if you ask ChatGPT "how can I remake Beat Saber?", it knows Beat Saber; it's inside the training data. It's not like asking "how can I make a brand-new game?". But still, it's a very impressive way to work more efficiently.

I agree it's meant to be a companion and to help you, but right now you need to have some knowledge of what you're asking and how you prompt ChatGPT to get the best answer. We are on the waiting list for Unity's tool as well, so we are waiting to see how they will release it, how it will be implemented and how we can use it. So that's definitely something. And we had an issue lately on one project where we used ChatGPT to generate a shader. After delivery, we realized there was a problem because the app crashed from time to time, and it came directly from the shader, from one part of the code that ChatGPT had put in. So even when you know what you're asking, I think there is still a lot of work on your side to analyze what ChatGPT delivers and make sure everything is correct and does what you want. It sounds like magic, but there are still a couple of places where it can go completely wrong, like what happened to us.

Okay, great. So I think that's it for today. Do you have anything more to add? One, two, three. Okay, great. So see you next week. Good morning, good afternoon and good evening.
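To make the "smart companion" workflow concrete, here is a minimal sketch of asking a GPT model for help with a broken Unity shader through the OpenAI chat completions endpoint. The model name, prompt, and error description are placeholders; as discussed above, whatever comes back still has to be reviewed before it ships.

```python
import os
import requests

API_URL = "https://api.openai.com/v1/chat/completions"

def ask_assistant(question: str, model: str = "gpt-4") -> str:
    """Send a single development question to the chat completions API."""
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        json={
            "model": model,
            "messages": [
                {"role": "system", "content": "You are a Unity development assistant."},
                {"role": "user", "content": question},
            ],
        },
        timeout=60,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    # Hypothetical debugging question, in the spirit of the shader crash discussed above.
    print(ask_assistant(
        "My Unity HLSL shader intermittently crashes the app on device. "
        "What common mistakes should I check for?"
    ))
```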

Credits

Podcast hosted by Guillaume Brincin, Fabien Le Guillarm, and Sébastien Spas.
Lost In Immersion © {{ year }}