Scent-based VR technology continues to evolve at a rapid pace.
A team of psychologists at Stockholm University has developed technology that allows players to smell the virtual world. According to a paper published in the International Journal of Human-Computer Studies, the Nosewise Handheld Olfactometer (NHO) can discreetly house a variety of liquid scents that can be triggered at specific moments during a VR experience.
“Our olfactometer allows for concealed (i.e., unknown to the user) combinations of odors with virtual objects and contexts, making it well suited to applications involving active sniffing and interrogation of objects in virtual space for recreational, scientific, or therapeutic functions,” says the team in the paper.
Unlike other scent-based VR devices that attach to the headset itself, the NHO is designed to attach to the HTC Vive’s motion controller to “output scent at the hand, linking physical smells to a synthetic VR environment.”
To demonstrate the technology, the team developed a VR wine-tasting experience that tasked test subjects with identifying different wines based on eight unique aromas. Participants released the smells by pressing the trigger on their HTC Vive controllers, after which they could submit their answers via four virtual circles floating above the table.
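The paper does not publish the experience's control code, but the interaction loop described above is easy to illustrate. Below is a minimal Python sketch of a single trial, assuming a hypothetical Olfactometer class as a stand-in for the NHO's valve control; the class, its methods, and the simulated trigger pull are illustrative assumptions, not the team's implementation.

```python
import random

SCENTS = ["clove", "blackcurrant", "raspberry", "chocolate",
          "pineapple", "almond", "grapefruit", "pear"]


class Olfactometer:
    """Hypothetical stand-in for the NHO's scent-release hardware."""

    def __init__(self):
        self.loaded_scent = None

    def load(self, scent: str) -> None:
        # Bind a scent channel to the upcoming trial.
        self.loaded_scent = scent

    def release(self, duration_s: float = 1.0) -> None:
        # On real hardware this would open a valve; here we just log it.
        print(f"Releasing '{self.loaded_scent}' for {duration_s}s")


def run_trial(device: Olfactometer, answer_options: list[str]) -> bool:
    """One trial: release a hidden target scent, then check the guess."""
    target = random.choice(answer_options)
    device.load(target)

    # In the real experience this fires on the Vive trigger event;
    # here we simulate a single trigger pull.
    device.release()

    guess = random.choice(answer_options)  # stand-in for the user's pick
    return guess == target


if __name__ == "__main__":
    device = Olfactometer()
    options = random.sample(SCENTS, 4)  # the four floating answer circles
    print("Correct!" if run_trial(device, options) else "Wrong guess.")
```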
Before each session, the developers vented clean air through the device for 10 minutes to avoid any potential odor cross-contamination. The wine-tasting experience featured four difficulty levels; the higher the difficulty, the more complex the odor mixture. Participants were exposed to eight unique scents in total: clove, blackcurrant, raspberry, chocolate, pineapple, almond, grapefruit, and pear.
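Since the article specifies only that mixture complexity scales with difficulty, the exact component counts below are assumptions made for illustration. This short Python sketch shows one plausible mapping from the four difficulty levels to odor mixtures drawn from the eight scents; COMPONENTS_PER_LEVEL and make_mixture are hypothetical names, not the study's actual design.

```python
import random

SCENTS = ["clove", "blackcurrant", "raspberry", "chocolate",
          "pineapple", "almond", "grapefruit", "pear"]

# Assumed mapping: difficulty level -> number of odor components.
# The paper states only that higher difficulty means a more complex
# mixture; these counts are illustrative.
COMPONENTS_PER_LEVEL = {1: 1, 2: 2, 3: 3, 4: 4}


def make_mixture(level: int) -> list[str]:
    """Sample a mixture of distinct scents for the given difficulty."""
    return random.sample(SCENTS, COMPONENTS_PER_LEVEL[level])


for level in sorted(COMPONENTS_PER_LEVEL):
    print(f"Level {level}: {make_mixture(level)}")
```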
“By developing new technologies that enable enactive smelling, and simultaneously articulating the potentials of smell training for recreational, scientific, or therapeutic uses, we hope to sketch new, more natural interactions that can enhance human olfactory experience,” added the team.
For more information, check out the full paper in the International Journal of Human-Computer Studies.
Image Credit: Jens Lasthein / Simon Niedenthal, William Fredborg, Peter Lundén, Marie Ehrndal, Jonas K. Olofsson