Welcome to episode 29 of Lost in Immersion, your weekly 45-minute stream about innovation. As VR and AR veterans, we discuss the latest news from the immersive industry. Hi everyone. Fabien, if you want to start, please.

Hello, thanks. So today my topic is Apple. There was an Apple event last week with the release of new products, new iPhones, a new watch, new AirPods, and a lot of news. And there were some quite interesting announcements that also relate to our main interest, the Vision Pro. We can start with the Apple Watch, whose major new feature is the double tap. Without touching the watch, just by tapping the index finger and thumb together, as you can see in the video here. Sorry, one more time. Yeah, like this. It can trigger an action on the watch itself. So it's very similar to the gesture we saw in the press release for the Apple Vision Pro. And it starts to make us wonder: is the Apple Watch becoming an accessory for the Vision Pro? We've said here before that the Vision Pro is missing accessories, missing controllers. So is the Apple Watch becoming a controller for the Vision Pro? It's very interesting that they are opening up this new way of capturing input. They already capture touch and voice, of course, and now maybe gestures. The exact scientific name for this is slipping my mind right now. So that was the first announcement.

The next one is a bit more technical, but still: the AirPods Pro will feature lossless audio and they can connect to the Apple Vision Pro. If you've already used a VR headset with built-in speakers, it can be very loud and can bother people around you if you are using it on the train or wherever. So it's better to have devices like these with good noise cancellation. That was interesting as well. And on top of that, if in the future, as we saw a few episodes back, they implement the patents they have on capturing brain activity with the AirPods Pro, that might also serve as an interface for the Vision Pro. So that was the second piece of news.

And the last one is that the iPhone Pro can capture what they call spatial videos. These are 3D videos that can then be replayed on the Vision Pro. So yeah, those were the three most relevant announcements from last week, and we can discuss them. What do you think, Seb?

I think it's nice to see the ecosystem they are building around their device. That's always the case with Apple: they always try to create interactions between all their devices, and they keep moving forward. And as you said, they are implementing interactions and components that could be interesting to use together with the Vision Pro. So yeah, we'll see how easy they make it for developers to implement. Cool. What about you, Guillaume?

Well, several things to say about this. First of all, for the Apple Watch, I can't help but think about the Myo bracelet. I don't know if you remember it. It was a startup product, a kind of forearm bracelet: you put it on your arm and it was supposed to detect muscle contractions to give you very high-definition hand tracking. Unfortunately, the bracelet didn't really work, because when you received it they basically told you it was your job to build the recognition database for it to work. They were just providing the device, not all the software behind it. So it's a very nice piece of art sitting on my shelf right now. So I'm very glad to see that this kind of technology...
I don't know how it works, this Apple Watch, but I guess it must detect some change caused by your finger movement or whatever. I've always thought that this bracelet approach, detecting finger movements, would be a great way of tracking your hands. So I'm very glad they are working in this direction. Very nice.

For the AirPods, I saw that they activated a low-latency Bluetooth mode that is not available with the current AirPods; people had tried to use them with VR applications and reported too much latency between the source and what you hear in the AirPods. However, there is some controversy as well, because it seems it would only take a software upgrade for this to work, yet Apple wants you to buy the new version. So some users are not that happy with the announcement, because there are people who only just bought their AirPods and this feature is not available to them. Maybe there will be software updates for them in the future, but right now, if you want the new features, you have to buy the new AirPods Pro. Not so nice, but we are used to this kind of business model with Apple, so I'm not very surprised.

And finally, about 3D scanning, I saw that there is also an update in iOS 17 with an object scanning application. It looks a bit like RealityCapture from Epic Games, meaning you select a bounding box around the object you want to scan, you walk around it as much as needed, and then you have the 3D model available to place in AR. But I don't know whether you can export it, or in which format. It's very interesting to see that Apple has really been embracing 3D scanning for several years now. And I don't know if you remember, but Google was the one working on this back in the day with Project Tango, and we can see that Google completely dropped it at some point, while Apple picked up the ball on 3D scanning, volumetric 3D and so on. So it's very interesting to see, as Seb just said, that they are building much more than just the Apple Vision Pro; they are enhancing their older devices as well. And we can easily see that there will be a very strong interconnection between all these devices at some point. So it's very, very interesting.

Yeah, I totally agree. It's very nice to see Apple moving their pieces on the spatial computing chessboard one after the other. It's a very smart move, and we now know why they waited so long: they wanted all the technologies to be mature enough to form a coherent whole. Okay, cool. Anything more to add, Seb? Otherwise, you can talk about your topic.

All right. So my topic is Roblox, which is adding an AI assistant to its platform. It will allow users, for the first time in a metaverse environment, to create the environment they want to be in directly from text and share it with others. We've talked about this before; it was the missing piece, because until now you had to create everything yourself or buy assets and integrate them into apps before being able to share your environment. Now, with the help of AI, you can generate some really cool environments in a matter of minutes within the app. You can't export what you made, but you can share it with others and have people join the game you created.
It's not only for the 3D, it's also for the interactions and the scripts you can generate, which control how your objects behave. So it seems very powerful, and as soon as it's available, I think I will check how far we can go with AI for this kind of environment. Any thoughts on that, guys?

Yeah, it's a very impressive demo, a very impressive video. And as usual with AI and generative AI, my concern is: is it actually as good as they advertise, or would it actually take someone many iterations? Are we seeing the best case in this video? So yeah, that's my concern. But otherwise, if it works as advertised, it's a pretty impressive way for Roblox to expand and to open up. They have their own interest in opening up a new creator economy on their platform. I think one of the predictions of the Roblox CEO a few weeks back during his keynote was that a Roblox developer would be valued at up to 1 billion, I think. Or 1 million or 1 billion, I'm forgetting the scale, but impressive numbers. So yeah, I think Roblox is really pushing to create a real creator economy, similar to the one we can see on YouTube or Instagram. But again, I hope it works as advertised.

Yeah, I have pretty much the same remark as Fabien. I'm very curious to see whether it's a hit-or-miss kind of implementation, and whether it creates the same effect as Midjourney, where you spend a lot of time iterating. They have better tools now, but for a while you had to go through very long iteration loops to find something satisfying. Also, can you move objects around or customize the scene afterwards, or is it just the prompt that gives you the output you want? And as Fabien said, we all know what happened with the Minecraft community: if you give the community the right tools, they can create very impressive results. I guess Roblox could give us the same kind of result, especially with their community, which we know used to be very young. But they are getting older, and I hope they are moving to the developer slash creator side as they grow up. Those who were 10 or 12 are now 14 or 16, and if they kept their passion for the platform, we could see very interesting results in the coming months or years if we give them the tools to do so. I guess we are kind of old and didn't really understand the Roblox phenomenon, but at this point we can't close our eyes to what is going on with this platform. We should pay a lot of attention to it, because it may be the future of the metaverse at some point. One, they have a huge community, and two, they now have strong partnerships with bigger companies like Meta. They are supporting the VR side as well, and they have the financial part covered with microtransactions. So yeah, they are ticking a lot of boxes toward a metaverse implementation. Very interesting to see what comes next.

Okay, so it sounds like we will all dig in as soon as it's available, try it on our side, and come back to you about it. One thing that is really interesting for me, having built a lot of VR experiences, is that there is a lot of work on the development side to optimize everything, all the meshes.
And I wonder, and that's one of the tests I will do as soon as it's available, how it handles LODs and all the techniques you normally use to optimize a game so it works well on all the different platforms, optimizing shaders, and so on. That's something I will test a lot, because if it optimizes all of that, then it's a huge improvement on the development side for this kind of environment.

Following on from that, I wanted to talk about something we have been discussing for several weeks now: 3D Gaussian splatting, which is now coming to Unity. As we already described, you take pictures of a complete environment from multiple positions, generate a point cloud from them, and then this new technique, 3D Gaussian splatting, uses that point cloud to build a 3D representation that is rendered not as triangles, as we are used to, but from the points themselves, with each point reworked into a splat, a small oval shape, so the cloud forms 3D surfaces without any triangles. And here we can see how it's rendered inside Unity. We were saying, I think last week or the week before, that we were still missing a tool to display 3D Gaussian splatting in real time, and here it is, it's coming. This one is a bit blurry, but several implementations have come out, and they are quite impressive compared to what we had before with a phone scan. This technique really improves everything and could make it available on small devices, because you render the point cloud directly and apparently still get this kind of quality. There have also been tests displaying environments in VR: the 3D Gaussians here are rendered in VR, and you can move around and really see the environment as it is. So, same as with the AI tools, this new way of generating 3D seems to be moving forward very fast, and it greatly improves our ability to generate a 3D model of anything, really. With this technique you don't have the issues we had before with transparent glass or reflective materials, which are always tricky to capture correctly; here it seems to work perfectly. And just for information, the research behind this technique comes from a paper from Inria in France, so it's nice to see the French research community pushing this new technique forward. What are your thoughts on that, guys? Let's go with Fabien first, sorry.

Okay, sorry, thanks. It's amazing, and it's really nice to see that this technique can be used in a real-time engine so quickly. It also seems to run at a high frame rate, which is very nice. I don't have much to add. Maybe one question: this is static for now, I guess, a static environment. I wonder if at some point a moving environment could be captured, like a video, and then how this could compete with what we just talked about, the Apple Vision Pro spatial videos. It's interesting to see both approaches: on Apple's side it's hardware, using two cameras, and on this side it's more of a software solution. Which one will win? We'll see.

I'm not sure the two are incompatible. You could generate the same kind of environment this way with their phone, but even better, because you have two cameras.
So each time you take a picture, you have two point clouds that can be more precise, and you could use the same Gaussian splatting technique to view that in the Vision Pro. That might even be, more or less, the way they already do things. What about you, Guillaume?

Yeah, just as a joke, this is the infamous round pixel we've been talking about for years, so that's a bit funny. But the question I have is about the input data. Do you have to walk around with your camera for several hours, or is it quick? And what kind of footage, because I think you need video for this? What kind of input data are we talking about to get this kind of result? I see that in the other videos they mention the Insta360, which could be cool, because it's a 360 camera, so I guess less footage would be needed to reconstruct the 3D results we are seeing. As 3D scan users, we know it can be very time-consuming to get the right input data, especially the video or photographs needed for a quality 3D scan. So I don't know if you have any information about the amount of data needed for this kind of result, but the results are very impressive.

Yeah, I can share a video with you later that shows exactly how this guy did a 3D character capture. It seems like it doesn't take a lot of time, but you need, of course, all the angles to make sure you have all the textures and all the 3D points, covering the walls and the surroundings of the character you want to capture.

Yeah, for an object I understand, but for the street we can see that even the trees are well defined, and the upper floors of the buildings are very well defined too. So how did they do that? Is it a drone capture? It looks like a drone capture, because I can't see how else you could...

The viewpoint is able to fly, yeah. But you can also see that they did not go too high, because up there they don't have the information, so there's nothing rendered. If you used a 360 camera on a stick, you could probably get this kind of result. I don't know, we should try it. Yeah, that's the next step.

Well, the advantage of NeRF or 3D Gaussian splatting is that you can render a different point of view than the ones you captured. So I'm not really surprised that they can fly, because that's exactly what this kind of technology allows. But I'm really curious to understand how they did the capture. The best would be, as you say, Guillaume, an Insta360: just walk around the area you want to capture and you're done. That would be very nice. What seems amazing to me is that they managed to get that place in Paris completely empty. I don't know at what time they went there, or whether it was closed to the public. I also wonder how it works with people moving around during your capture, whether it completely breaks the capture because you have points moving through space.

Yeah, your remark is very interesting, because it would mean you can't spend hours walking around with your camera. As you mentioned, Paris is a bit crowded, there are people everywhere, unless it's at 3 a.m., and even at 3 a.m. there are still people around. Or maybe they erased them at some point, I don't know. Same thing in a natural environment, if there are animals or leaves moving on a tree: I wonder how the point cloud is generated and how well the Gaussian splatting works on this kind of scene. And I think that's it for me. Okay, great. Any last words on this?
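To make the idea of rendering "small oval shapes" instead of triangles a bit more concrete, here is a minimal, purely illustrative numpy sketch of what a Gaussian splat renderer does: each point carries a covariance built from a scale and a rotation, is projected onto the image plane as a 2D Gaussian, and the splats are alpha-composited from near to far. This is not the Inria paper's optimized CUDA rasterizer, and every name and number below is our own; in the real method the positions, covariances, colors and opacities are optimized by gradient descent against SfM-posed photos.

```python
# Toy illustration of the core idea of 3D Gaussian splatting (not the
# official implementation): project 3D Gaussians to 2D and composite them.
import numpy as np

def covariance_3d(scale, quat):
    """Sigma = R S S^T R^T, built from a per-splat scale and rotation."""
    w, x, y, z = quat / np.linalg.norm(quat)
    R = np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])
    S = np.diag(scale)
    return R @ S @ S.T @ R.T

def splat_image(points, scales, quats, colors, opacities, f, width, height):
    """Very naive front-to-back compositing of Gaussian splats seen by a
    pinhole camera at the origin looking down +Z (focal length f, in pixels)."""
    img = np.zeros((height, width, 3))
    transmittance = np.ones((height, width))
    order = np.argsort(points[:, 2])            # sort splats near to far
    ys, xs = np.mgrid[0:height, 0:width]
    for i in order:
        X, Y, Z = points[i]
        if Z <= 0:
            continue
        u, v = f * X / Z + width / 2, f * Y / Z + height / 2
        # First-order (Jacobian) projection of the 3D covariance to 2D,
        # plus a small dilation term so tiny splats stay visible.
        J = np.array([[f / Z, 0.0, -f * X / Z**2],
                      [0.0, f / Z, -f * Y / Z**2]])
        cov2d = J @ covariance_3d(scales[i], quats[i]) @ J.T + 0.3 * np.eye(2)
        inv = np.linalg.inv(cov2d)
        dx, dy = xs - u, ys - v
        g = np.exp(-0.5 * (inv[0, 0]*dx*dx + 2*inv[0, 1]*dx*dy + inv[1, 1]*dy*dy))
        alpha = np.clip(opacities[i] * g, 0.0, 0.99)
        img += (transmittance * alpha)[..., None] * colors[i]
        transmittance *= 1.0 - alpha
    return img

if __name__ == "__main__":
    # Render a handful of random splats just to show the pipeline runs.
    rng = np.random.default_rng(0)
    n = 50
    img = splat_image(
        points=rng.uniform([-1, -1, 2], [1, 1, 4], size=(n, 3)),
        scales=rng.uniform(0.02, 0.1, size=(n, 3)),
        quats=rng.normal(size=(n, 4)),
        colors=rng.uniform(0, 1, size=(n, 3)),
        opacities=rng.uniform(0.3, 0.9, size=n),
        f=200.0, width=160, height=120,
    )
    print("rendered image:", img.shape, "max value:", round(float(img.max()), 3))
```

Real renderers do the same thing per screen tile on the GPU, which is why the Unity plugins discussed here can reach interactive frame rates on scenes with millions of splats.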
Okay, so for my part, I would like to talk about Unity, or maybe we can call it the Unity drama. For a bit of context: we already know Unity had some controversy with their AI product, called Muse if I remember correctly, because they announced that everything used as input for, or created with, their artificial intelligence could be used as training data for their learning dataset, meaning that part of what you create would no longer really be your property. Some content creators were not very happy with that, as you can understand.

And in the latest news, Unity changed their licensing structure and the way their fees work, and they announced it without preparing the community at all. The impact is very significant for some studios: some licensing configurations are no longer available, and smaller companies now have to purchase bigger licenses than they were used to. For some startups, I heard their licensing fees simply doubled in a matter of weeks. They are not happy about this, especially with so little notice, because we all know that for a startup this kind of fee can have a big impact on day-to-day operations. On top of that, they added a runtime fee: once your app passes a certain installation threshold, you have to pay a per-install fee, 20 cents per install beyond 1 million. Once again, this hits medium-sized studios that are growing, because when your app becomes successful you end up with a lot of installs, and that can really affect your development as a company.

After this, forums, and LinkedIn as well, just exploded with messages and posts from all kinds of professionals, developers and company owners alike. Everyone was saying: okay, this is the end, the trust in Unity is completely broken, they can change their policy and the way they work with us at any point, and there is no dialogue between the community using Unity and their board. A lot of people are simply fed up with Unity and are trying to find other platforms. Just to show you, Unity's stock has been plunging since the announcement, despite the fact that they had very good results after announcing their partnership with Apple; now they have more or less lost all the benefit of that announcement. Seeing the reaction, they are now trying to apologize and clarify the situation with developers, but the harm is done, and a lot of studios that were starting projects are now switching to other platforms. Just for fun, we can see some executives at Epic Games posting things just to troll, or to poke Unity a bit: they remind us that Unreal is free to download, use and ship, and that their fees and revenue share are very clear and haven't changed for many years. So this engine war between Epic Games and Unity is kind of funny right now. And when I was searching for alternatives to Unity, a lot of people were talking about Godot, which I personally didn't know about until then, and it seems to be a fair competitor to Unity.
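To give a sense of scale for the per-install fee described above, here is a rough, purely illustrative calculation assuming a flat 0.20 USD per install beyond 1 million installs, which is how the hosts summarize it; the pricing Unity actually announced included revenue thresholds and tiered rates, so real numbers would differ.

```python
# Illustrative only: a flat $0.20 per install past 1,000,000, as described
# in the discussion above. Unity's announced policy had revenue thresholds
# and tiered rates, so treat this as a back-of-the-envelope sketch.
def runtime_fee(total_installs, fee_per_install=0.20, free_threshold=1_000_000):
    billable = max(0, total_installs - free_threshold)
    return billable * fee_per_install

for installs in (900_000, 1_500_000, 5_000_000):
    print(f"{installs:>9,} installs -> ${runtime_fee(installs):>12,.2f}")
#   900,000 installs -> $        0.00
# 1,500,000 installs -> $  100,000.00
# 5,000,000 installs -> $  800,000.00
```

Even under this simplified model, a free-to-play title crossing a few million installs would owe hundreds of thousands of dollars regardless of how much revenue those installs actually generated, which is exactly the scenario mid-sized studios were worried about.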
Godot reminds me of the early days of Unity. It's still a very young platform with a lot left to do, but since they can draw inspiration from Unity and Epic Games, I guess this kind of platform could improve very fast with the community behind it, especially since, if I understand correctly, most of their assets are free under the MIT license, and it's starting to have most of the assets and building blocks that Unity currently has. So it's very interesting, and we should check out Godot at some point to see whether this platform is a viable alternative to Unity, besides the option of switching to Unreal if you want. It's very interesting to see how a giant like Unity can fall very quickly, simply because they didn't listen to their customer base and their community, as simple as that. I don't know what you guys think about it.

Yeah, it will be very interesting to see what happens, because as you said, they will do another press release in the coming days, so we don't know what will happen. But what just happened, not only last week but over the course of the past years, is a blueprint for how to lose your clients' trust. And that's a door that may never open again, except maybe if there is a huge change in leadership at Unity, if they really change the people who are currently leading the company. You mentioned the AI issue; there was also a partnership a few months or years back with a company that was a bit shady.

They are also forcing some companies to buy plugins, like Pixyz. I don't know if you've heard about this, but Unity bought that company a few years back, and now if you work in the industrial field they more or less force you to buy this plugin, which handles 3D point cloud management in Unity and 3D CAD model simplification. So there are plugins you effectively have to buy if you want to use Unity for certain things. Yeah.

So, again, very curious to see what will happen in the coming days, and even more in the coming weeks and months. Just to nuance things a bit: for a studio like us, switching our complete pipeline, plugins and everything we have developed from Unity to another engine is a huge, huge amount of work. It's not something that can be done in days. But we'll see, in the months and years to come, how this affects the game engine world.

Yeah. And if you add on top of that the fact that for four years now they have been releasing version after version that is not optimized, that doesn't render things in good quality or consistently over time. Every time they change version, a lot of things change, not everything, but a lot, and you have to rework your project. It's even a nightmare to find the right URP setup for rendering, and on mobile devices it's a nightmare to figure out which version of Unity to use with which version of URP. There are a lot of forum threads about that, and nobody from Unity seems to move things forward. They keep releasing new versions, but no optimization gets done. So I don't trust the platform right now the way I did before. We still end up switching back to the Built-in Render Pipeline every time, which they are no longer improving, because it's stable and at least it works correctly.
So it's sad to see that they are pushing a lot of things forward but not fixing the many bugs inside the platform. And compared to what you can do now with Unreal, and the results we are starting to see with Unreal even on mobile platforms, switching to Unreal seems to be something we need to consider soon. Definitely.

Yeah, I guess this stability issue that has been tied to Unity for years is becoming a real nightmare now. I've seen lots of posts and comments saying they are not managing to get anything stable at this point, and that's why there are so many different iterations of their engine right now. And something I discovered a few days ago is that you no longer really have the possibility to convert your project from one version to another; you have to download the exact version the project was made with. Let me explain: if, for example, you receive a project from a colleague who made it in a given version and you don't have that version, we used to be able to upgrade it to a newer version of Unity, and now that path isn't really workable anymore, you have to download the specific version the project uses. On one hand, it guarantees more stability, because we know the conversion was never perfect in the past. On the other hand, it means you won't have five versions of Unity on your laptop, you'll have ten, which has always been a symptom of being a Unity user: if you are a Unity dev, you know you have dozens of Unity versions on your PC. We got used to that, but maybe it's not that professional anymore. Old-timers like us will remember that when they changed the way they number their engine versions, at some point it was just a number and then it became a year plus a version, they promised us there would be no more than three updates of the engine per year. And we can see there are updates practically every week. Once again, the trust isn't great, and we are used to that, but now that they are a major player, like Epic Games, they should be shipping fewer, more stable versions of Unity. That's my point of view. So, how many versions of Unity do you have on your machine? 26. 26, yeah. I need to do some cleanup.

So we'll see what damage this does, because we know they have exclusive partnerships with Apple and, I guess, Meta now. Normally those shouldn't be affected, because if people want to develop for the Apple Vision Pro they have to use Unity. So we'll see whether that saves them from losing too many developers and seeing their community shrink completely at some point. Okay, anything more to add? No? Okay, so that's a wrap for today. Thank you guys for all these interesting topics, and we'll see you next time.
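A small practical aside on the multiple-editor-versions problem discussed above: a Unity project records the editor it was last opened with in ProjectSettings/ProjectVersion.txt, so reading that file is the quickest way to know which editor to install before opening a colleague's project. The file and its m_EditorVersion key are standard Unity; the helper script below is just an illustrative sketch of ours.

```python
# Print the Unity editor version a project expects, by reading the
# m_EditorVersion entry in ProjectSettings/ProjectVersion.txt.
import sys
from pathlib import Path

def required_unity_version(project_dir: str) -> str:
    version_file = Path(project_dir) / "ProjectSettings" / "ProjectVersion.txt"
    for line in version_file.read_text().splitlines():
        if line.startswith("m_EditorVersion:"):
            return line.split(":", 1)[1].strip()
    raise ValueError(f"No m_EditorVersion entry found in {version_file}")

if __name__ == "__main__":
    # Usage: python unity_version.py /path/to/UnityProject
    print(required_unity_version(sys.argv[1] if len(sys.argv) > 1 else "."))
```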

Lost In Immersion

Episode #{{podcast.number}} — {{podcast.title}}


Credits

Podcast hosted by Guillaume Brincin, Fabien Le Guillarm, and Sébastien Spas.
Lost In Immersion © {{ year }}