I was listening to Lex Fridman's podcast with Mark Zuckerberg on the metaverse. Here are some insights and ideas I took away from it. In a metaverse environment, the goal is to closely mimic your physical presence by paying close attention to capturing and representing your real self. The key vision use cases that would play a key role in a real-time experience for users:
- Emotion tracking
- Facial expressions
- Tiny gestures and mannerisms unique to your personality
- Face tracking
- A more realistic sense of presence for touch and feel
- Your AR/VR device is going to be enhanced to track these details (a minimal face-tracking sketch follows below)
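To make this concrete, here is a minimal sketch of the raw face-tracking signal such a device would need. It assumes MediaPipe Face Mesh and OpenCV with an ordinary webcam; a headset would use its own cameras and SDK, but the idea is the same.

```python
# Minimal sketch: real-time face landmark tracking from a webcam.
# Assumptions: mediapipe and opencv-python are installed; a headset
# would use its own sensors, but the output signal is similar.
import cv2
import mediapipe as mp

face_mesh = mp.solutions.face_mesh.FaceMesh(
    static_image_mode=False,      # video stream, not single images
    max_num_faces=1,
    refine_landmarks=True,        # adds iris and lip detail
    min_detection_confidence=0.5,
)

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB; OpenCV captures BGR.
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        landmarks = results.multi_face_landmarks[0].landmark
        # Hundreds of normalized (x, y, z) points describing the face
        # surface -- the raw signal behind expression and lip tracking.
        print(f"tracked {len(landmarks)} landmarks")
    cv2.imshow("face tracking", frame)
    if cv2.waitKey(1) & 0xFF == 27:   # Esc to quit
        break
cap.release()
```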
From a computer vision point of view:
- Creating a deepfake-style, photorealistic, or cartoon representation of yourself
- Representing your emotions, lip movements, and expressions
- Carrying your voice modulation into discussions
- Altering your face to sync with real-time expressions and movements (see the sketch after this list)
- Changing backgrounds, clothing, and presentation
- Creating different avatars for business, entertainment, and family groups
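Once landmarks are tracked, they can be mapped to simple expression parameters that drive an avatar, for example a mouth-openness value for lip sync. The sketch below builds on the tracking loop above; the landmark indices (13/14 for the inner lips, 10 forehead, 152 chin) follow the MediaPipe Face Mesh convention, and the `avatar.set_blendshape` call is a purely hypothetical avatar API, not an established one.

```python
# Minimal sketch: map tracked landmarks to one expression parameter
# (mouth openness) that could drive an avatar's lip sync.
# Landmark indices below are illustrative assumptions.
import math

def mouth_openness(landmarks) -> float:
    """Return mouth opening as a fraction of face height (0.0 = closed)."""
    def dist(a, b):
        return math.dist((a.x, a.y), (b.x, b.y))
    lip_gap = dist(landmarks[13], landmarks[14])       # inner upper vs. lower lip
    face_height = dist(landmarks[10], landmarks[152])  # forehead vs. chin
    return lip_gap / face_height if face_height else 0.0

# Inside the tracking loop above, one could then animate the avatar:
#   openness = mouth_openness(landmarks)
#   avatar.set_blendshape("jawOpen", min(openness * 4.0, 1.0))  # hypothetical avatar API
```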
Revenue approach?
You might end up using it for different use cases like:
- Gaming / entertainment / business / collaboration: multiple avenues of opportunity
- Buying and selling clothing, styles, and accessories in the retail/fashion domain
- Influencer-based sales/advertisements.
This could potentially lead to a long-term subscription-based model.
Today, with affordable smartphones, high-speed connectivity, Spotify, and YouTube, we spend far more time online than we did 10 years ago. In another 5 years, these could be replaced by affordable $50 AR/VR devices, and we might split our time across social media, YouTube, the metaverse, etc.
Keep Exploring!!!