Speaker 1: Hey, we’re here together in Horizon, and this is the first time that we’ve done this at Connect, and this is a preview of our next generation of avatars. They’re so much more expressive and detailed than anything else today, and they have this unique Meta style to them. Now, it’s a lot of work to build AI to auto-generate these for billions of people, and then give everyone the tools to make sure that your avatar feels like your own. But I’m excited to start rolling these out later next year on phones, [00:00:30] VR headsets, and more. So while I’m excited for this next generation of avatars, we’re also seeing a lot of developers and partners building great experiences with our current generation of avatars as well. So now let’s meet up with Iri, who leads our avatars work, to discuss the latest on our avatars and what you can do with them. Hey, Iri.
Speaker 2: Hey, Mark.
Speaker 1: Hey, wanna update everyone on the latest with the avatars ecosystem?
Speaker 2: Yes. We have so many exciting things going on. [00:01:00] Our first milestone is making it so you can use your avatar across all of our apps and in VR. The new Meta Accounts Center already allows you to have one or multiple avatars, so you can show up however you want. And if you have one avatar you really like, you can now easily sync it across all the apps and in Horizon.
Speaker 1: Nice. All right, let’s talk a bit more about the integrations that we’re building for the apps.
Speaker 2: You can already see avatars showing up in stickers, stories, comments, and more across Instagram, Facebook, Messenger, [00:01:30] and in the future, WhatsApp. And we’re working on bringing them to life with animation, and to more places like, for example, Reels. We want you to be able to use avatars anywhere you want to express yourself.
Speaker 1: And we’re also bringing avatars to video chat.
Speaker 2: That’s right. Starting with Messenger and WhatsApp. If you want, you’ll be able to show up as an avatar. It’s going to add a whole new dimension to video chat.
Speaker 1: Yeah, I think avatars and video chat are gonna be like this third mode between video on and video off. You can still express yourself [00:02:00] and react, but you’re not on camera. So it’s kinda like a better camera off mode.
Speaker 2: Yeah. We’re working on this now and expect to launch it next year.
Speaker 1: And in virtual reality, we already track your head and hand positions, so your avatar moves and sounds like you. And now we wanna bring that to phones and computers too, using cameras and microphones. So if you wink at your phone, your avatar on your phone should wink too.
Speaker 2: Making your avatar work across all kinds of devices and apps truly creates continuity across experiences.
Speaker 1: Yeah. You should be able to take your avatar [00:02:30] and virtual goods everywhere that you go.
Speaker 2: Absolutely. There are already dozens of apps on Quest that support Meta Avatars, and we’re building partnerships so you can use your avatar across lots of different experiences. And in fact, today we’re announcing a partnership with Zoom that will let you show up as your avatar on Zoom calls.
Speaker 1: So to make this easier, we’re extending the Meta Avatars SDK to iOS and Android on Unity. And I’m also excited to share that the Meta Avatars SDK will soon support [00:03:00] Unreal Engine and virtual reality too. These expansions will make it a lot easier for more developers to start building Meta Avatars into your apps.
Speaker 2: Yes. We’re testing this with a small group of partners now and plan to release this more broadly next year.
Speaker 1: Yes. So right now, a lot of developers are building their own avatar systems, since the tools that you can import are pretty basic. But over the next year or two, as all these new tools and styles become available, and as the hundreds of people that we have working on avatars keep improving the system, I [00:03:30] just think that more and more developers are going to find that building with our interoperable avatar stack offers a much simpler and better experience. And speaking of making it easy to add new styles, we have some news about the Avatar Store too.
Speaker 2: The Avatar Store is launching in VR later this year, so you’ll be able to shop for virtual clothing in VR. We’re joining with a bunch of partners across sports, entertainment, and more, so you’re gonna start seeing a lot more of your favorite [00:04:00] familiar brands. These integrations are gonna be pretty awesome. And later this year, keep a lookout in the Avatar Store for new outfit releases from Netflix.
Speaker 1: Giving people more ways to express themselves through digital clothing will help kickstart the marketplace for interoperable digital goods. So if you buy a sweater, you’re gonna wanna be able to wear that on your avatar no matter what app you’re using. So we’re gonna see more creators, developers, and brands experimenting with this. And beyond clothes, we’re also improving the core avatars themselves across [00:04:30] the whole family of apps.
Speaker 2: Mark, I’m sure you’ve seen the new face shapes we introduced earlier this year, as well as cochlear implants, over-the-ear hearing aids, and wheelchairs.
Speaker 1: Yeah. And we have some big improvements to representation coming too with more options for body types as well as shaders for more realistic skin. There’s one more feature coming soon that’s probably the most requested feature on our roadmap. Legs.
Speaker 3: Legs.
Speaker 2: I know you [00:05:00] have been waiting for this.
Speaker 1: I think everyone has been waiting for this. But seriously, the legs are hard, which is why other virtual reality systems don’t have them either. And the perceptual science behind this is actually quite interesting. We discovered early on that your brain is a lot more willing to accept a rendering of a part of you as long as it’s accurately positioned. But if it’s rendered in the wrong place, then it just feels terrible, and it breaks the whole feeling of presence and immersion. So that’s why we started off showing just controllers, but [00:05:30] not your whole arms.
Speaker 2: Yeah. If the system showed my elbow in the wrong place, it would feel like my arm was broken. Yeah.
Speaker 1: But as we got better at tracking and predicting your arm position, we were able to add your whole arms to the avatar stack in addition to your hands. And now we’re doing the same with legs.
Speaker 2: And now we’re getting ready to launch the first full body avatars.
Speaker 1: Yeah. So with standalone virtual reality headsets, understanding your leg position is surprisingly difficult because of occlusion. So if [00:06:00] your legs are under a desk, or if your arms block your view of them, then your headset can’t see them directly, and you need to build an AI model to predict your whole body position.
Speaker 2: And this really needs to work reliably, so it just feels natural.
Speaker 1: Yeah. So that’s why we’re gonna bring legs to Horizon first, and we’re gonna keep bringing them to more and more experiences over time as we improve our technology stack.
Speaker 2: And we will bring new tech that lets developers implement custom avatar actions and behaviors for the experiences they want to create. [00:06:30] That’s coming next year too, and I’m looking forward to seeing what all of you will build.