Prioritizing which senses to hack in the metaverse

Nitish Reddy
Nov 23, 2021

It is said that the Metaverse will be a layer on top of the internet, one that fundamentally changes how we spend our digital time. In 2020, that time is spent mostly through computers and mobile phones: we watch videos, send messages, listen to music, join Zoom calls, and so on.

The metaverse is going to be more immersive than the existing (mobile) internet. But how? The answer is devices like VR headsets and AR glasses. Secondly, communication will be more synchronous. Today we get in and out of Zoom calls, and if we stay in them for extended periods we get Zoom exhaustion. To win that battle, both the technology and user behavior need to improve; the technology should be as fatigue-free as possible. That will happen!

Experiencing Present as it is

In a previous blog post, I touched on what it means for the human system to be in the present, as opposed to the past or the future. We need to be consciously sensing one of these five streams of sensory information:

  • Hear — Audio/Earphones
  • See — Visual/VR/AR
  • Touch — Feel/Haptic
  • Smell
  • Taste

Audio Visual Metaverse

In the internet world of the 2020s, we see on laptop and mobile screens and hear through earphones (in-ear Apple AirPods are the best at feeling like part of the human body). In the futuristic metaverse, the claim is that the visual sense will become even more immersive with VR/AR gear.

Haptic Metaverse

If we think further into the future, like in Ready Player One, we could get haptic suits and hijack the sense of touch as well, along with the audio and visual senses.

Matrix Metaverse

Even more futuristic: Elon Musk’s startup Neuralink is working on chips embedded in the brain that can hijack its signals, eventually extending the metaverse to smell and taste as well. Very much like The Matrix: we lie there with chips in our heads, living in a digital world where all five senses are hijacked for a truly immersive feel.

Start with Hearing/Audio

This is the easiest of all; we already have in-ear AirPods. Say there is a person who is blind, anosmic (can’t smell), and whose taste buds are unreactive. Assume, too, that this person is kept in a sensory deprivation tank, so the only sense that works is hearing. Can we build an audio system that, by modulating the audio, makes them believe they are truly living in a different place? That is the problem statement.

Now, in an audio-only world, these people connect to the metaverse and move around talking to each other. The audio signals need to be hyper-realistic: as they move, the amplitude of each voice should change in real time, and so on.
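As a rough illustration of what “amplitude changes with movement” means, here is a minimal sketch of distance-based attenuation. It assumes a simple inverse-distance gain model and made-up listener/speaker positions; it is not taken from any particular audio engine.

```python
import math

def attenuated_gain(listener_pos, speaker_pos, reference_distance=1.0, max_gain=1.0):
    """Scale a voice's gain by how far the speaker is from the listener.

    Uses a simple inverse-distance rolloff: gain = max_gain * ref / distance,
    clamped at max_gain. Real spatial-audio engines layer direction, occlusion,
    and reverb on top of something like this.
    """
    dx = speaker_pos[0] - listener_pos[0]
    dy = speaker_pos[1] - listener_pos[1]
    distance = math.hypot(dx, dy)
    if distance <= reference_distance:
        return max_gain
    return max_gain * reference_distance / distance

# A listener at the origin hears a nearby speaker at full volume
# and a distant one much quieter.
print(attenuated_gain((0, 0), (0.5, 0)))   # 1.0 (within reference distance)
print(attenuated_gain((0, 0), (10, 0)))    # 0.1
```

Recomputing this gain every frame as players move is the bare minimum; the harder part is doing it with low enough latency over the network that the change feels instantaneous.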

If we can achieve this level of fidelity in audio, then we know we have truly hacked that sense in the metaverse. What technology is needed for this?

  • Stable and High-Speed Internet for Transferring 8D Audio
  • Improved Protocols for Synchronous-First Communication
  • Small, Non-Invasive Hearing and Listening Devices

Simple Game in Audio Only Metaverse

When I was in 6th grade, our sports teacher made us play this game: 100 people were given 100 cards, each with one of 25 animal names on it, so four players held the same card. The cards were randomly distributed, and we had to run around, shout, and find the other three members of our group. The first team to report back wins.

Now, in PUBG, before the game starts all players are in the same space and in the same audio chat room, and depending on how close you are to someone, you can talk with them. If the game above can be implemented in that kind of audio-immersive manner, we’ve reached our audio-only metaverse stage. Let’s see when this happens!
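Here is a toy sketch of that game’s core mechanic, assuming a hypothetical audio-only lobby where players can only hear each other within a fixed radius. The radius, player names, and matching rule are all invented for illustration.

```python
import math
from dataclasses import dataclass

HEARING_RADIUS = 5.0  # hypothetical: how far a shout carries in the lobby

@dataclass
class Player:
    name: str
    animal: str   # the animal card this player was dealt
    x: float
    y: float

def can_hear(a: Player, b: Player) -> bool:
    """Two players can talk only when they are within the hearing radius."""
    return math.hypot(a.x - b.x, a.y - b.y) <= HEARING_RADIUS

def found_teammates(me: Player, others: list[Player]) -> list[Player]:
    """Teammates are players I can hear who shouted the same animal name."""
    return [p for p in others if can_hear(me, p) and p.animal == me.animal]

players = [
    Player("A", "tiger", 0, 0),
    Player("B", "tiger", 3, 4),    # within 5.0 of A -> audible
    Player("C", "tiger", 30, 40),  # too far away -> A must keep moving
    Player("D", "zebra", 1, 1),    # audible but holding a different card
]

me = players[0]
print([p.name for p in found_teammates(me, players[1:])])  # ['B']
```

Proximity chat like this already exists in some games; the metaverse version just needs it to feel as natural as shouting across a schoolyard.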

Tinder’s update in Audio Visual Metaverse

In the internet world of the 2020s, Tinder is an app that matches two people digitally; they then go on real-world dates. In the metaverse, Tinder could offer options to choose a virtual date. For example: on Saturn, with breathtaking views of its moons, enclosed in a glass dome where you hear the buzz of hailstorms along with the view. The crazier the designer’s imagination, the more the virtual date costs. But again, all of this will gain traction only if we can hack the audio and visual senses in a hyper-realistic manner.

Well, Leo DiCaprio could be serving food for you. I mean his digital avatar; this is where NFTs and AI need to catch up to become part of the metaverse tech stack. Another blog on that.

Next Senses to be Hacked

After hearing, I think the next sense to fall is sight, then touch, then smell, and finally taste. If we can hack the brain with Neuralink chips, then being in the present is taken care of. Not just the present, but the past and the future as well!

I was explaining all of this to a friend who is heavily into gaming. He says all of this NFT-metaverse stuff already exists in gaming.

The metaverse starts from gaming and sci-fi movies, and everything we have discussed will culminate in the metaverse.
