Imagine running through a dark forest in a virtual reality video game and being able to smell the crisp scent of pine needles all around you. Or what if you were analyzing a complex data set, and could associate specific scents with data points in order to better track and recall the information?
University of Maryland researchers in the Human-Computer Interaction Lab (HCIL) are linking virtual reality with sight and smell to help people better process information. HCIL is jointly supported by the UMD iSchool and the UMD Institute for Advanced Computer Studies.
HCIM student Biswaksen Patnaik and PhD student Andrea Batch are exploring ways to convey information with scent as a complement to the visual representation of data sets.
Patnaik and Batch recently presented their research paper, “Information Olfactation,” which explores combining the sense of smell with information visualization, at IEEE VIS in Berlin, the largest and most important conference on scientific visualization, information visualization, and visual analytics.
“This was easily our craziest idea to date,” said Niklas Elmqvist, HCIL director. Elmqvist is also the students’ adviser and a co-author of the research paper.