Smell, Taste, and Temperature Interfaces

Call for Participation

The “Smell, Taste, and Temperature Interfaces” workshop addresses the burgeoning subfield of chemo- and thermo-sensory interfaces (smell, taste, and temperature), as well as their cultural contexts, usage, and resulting experiences. This three-day workshop will offer an interdisciplinary forum for discussion among academics and practitioners interested in leveraging these sensations.

The Smell, Taste, and Temperature workshop covers chemo- and thermo-sensory interfaces. Above are three examples of such devices: (a) the Olfactory Assist Mask is a system that makes trace chemical gases interpretable to humans by substituting them with smellable odors. (b) Ranasinghe et al. explored augmenting flavor by adding digital gustation to utensils such as chopsticks. (c) The ThermalBracelet is a device that provides fast-switching thermal feedback to the wrist using groups of four thermoelectric elements.

Researchers and practitioners from academia, arts, and industry are invited to apply to the workshop by submitting a 1-4 page position or research paper in the ACM Master Article Submission Template single-column format via the submission portal. The submission deadline is Friday, February 19, 2021 at 12:00pm (noon) PT. All applications will be reviewed by the workshop organizers, and selection will be based on each paper’s quality, novelty, and fit with the workshop topics. Applicants will be notified of decisions on Friday, March 5, 2021 at 12:00pm (noon) PT. Accepted papers will have the option of being made available on the workshop webpage. Upon acceptance, participants are asked to produce a 5-minute introduction/paper presentation video, which will also be shared publicly on the workshop website. At least one author of each accepted paper must attend the workshop.

Suggested Topics/Areas

The topics of interest for the workshop include, but are not limited to, the following:

  • Design and use of multi-sensory technologies.
  • Technologies pushing forward smell, taste, and thermal experiences.
  • Chemo- and thermo-sensory augmentation.
  • Cultural aspects and contexts of multi-sensory interactions in everyday life and history, which influence and shape both the technologies being developed and their societal adoption.
  • Techniques & recommendations for recording and stimulating chemo- and thermo-sensations.

We aim to additionally highlight and discuss open challenges in the field, which include, but are not limited to:

  • Power consumption: affects all chemo- and thermo-sensory devices, but thermo-sensory devices especially.
  • Chemicals: affect all chemo-sensory devices (smell, taste, chemesthetic); challenges include often non-reducible chemical stimuli, recording and replaying experiences, precision, and inter- and intra-modality effects (e.g. suppression effects).
  • Non-technical: the complexity of sensory dimensions, precision, cultural adoption, sharing data across HCI researchers, open sourcing of devices, and simultaneously ongoing basic scientific research.

Bar graphs showing publication trends for smell-, taste-, temperature-, and mulsemedia-related publications published by the ACM or IEEE since the 1980s. All four subjects have seen steady rises since the 2000s.

Schedule & Format

As CHI 2021 will be virtual, STT21 will be virtual and conducted over Zoom. We will additionally offer attendees access to an STT Discord Server with an #STT21 channel for asynchronous text-based conversation and some social fun. The server will be available beyond STT21 and hopefully used for future events.

Please note that the dates and times are standardized to JST! Each date links to a time zone converter, and participants in Europe, the Americas, and Africa should note that the workshop will be from Thursday, May 6, to Saturday, May 8.

Friday, May 7, 2021 - Future of Smell

  Time               Schedule Item
  06:00 - 06:10 JST  Introduction & Welcome
  06:10 - 06:40 JST  10-minute provocations by Haruka Matsukura, Emanuela Maggioni, and Judith Amores
  06:45 - 07:15 JST  Breakout discussions in groups of 4-5 participants with 1-2 organizers
  07:15 - 07:45 JST  Reconvene to discuss the challenges and possibilities of smell interfaces

Saturday, May 8, 2021 - Future of Taste

  Time               Schedule Item
  06:00 - 06:20 JST  10-minute provocations by Nimesha Ranasinghe and Marianna Obrist
  06:25 - 06:55 JST  Breakout discussions in groups of 4-5 participants with 1-2 organizers
  07:00 - 07:30 JST  Reconvene to discuss the challenges and possibilities of taste interfaces

Sunday, May 9, 2021 - Future of Thermal & Chemesthetic

  Time               Schedule Item
  06:00 - 06:20 JST  10-minute provocations by Roshan Peiris and Jas Brooks
  06:25 - 06:55 JST  Breakout discussions in groups of 4-5 participants with 1-2 organizers
  07:00 - 07:30 JST  Reconvene to discuss the challenges and possibilities of thermal and chemesthetic interfaces

Important Dates

Submission deadline: February 19, 2021

Notification of acceptance: March 10, 2021 (updated from March 5, 2021)

Camera-ready submissions due: April 30, 2021 (updated from March 12, 2021)

Video recording due: April 30, 2021

Workshop dates: May 7-9, 2021

Dive Deeper

Interested in the workshop’s topics and want to dive deeper with some related videos? We’ll be curating an ongoing list of them here!

Workshop Organizers

Jas Brooks is a PhD student in the Department of Computer Science at the University of Chicago. Jas focuses on computer interfaces that directly modulate human chemosensation, such as smell, taste, and chemesthesis. Their most recent device leveraged chemical stimulation of the nose’s trigeminal nerve endings to induce temperature illusions. Jas’s research is supported by a National Science Foundation Graduate Research Fellowship and has been covered by media publications such as IEEE Spectrum.


Pedro Lopes is an Assistant Professor in Computer Science at the University of Chicago. Pedro focuses on integrating computer interfaces with the human body, exploring the interface paradigm that supersedes wearable computing. These integrated devices include a muscle-stimulation device that allows users to manipulate tools they have never seen before or that accelerates their reaction time, and a device that leverages the nose to create an illusion of temperature. Pedro’s work has also captured the interest of media such as the New York Times and New Scientist, and was exhibited at Ars Electronica and the World Economic Forum.


Judith Amores is a Research Fellow at MGH/Harvard Medical School and a Research Affiliate at the MIT Media Lab, where she completed her PhD and master’s and helped run VR/AR at MIT as a co-president. She holds a multimedia engineering degree and has worked at Microsoft Research, URL Barcelona, and the Google Creative Lab. She has published over 27 peer-reviewed research papers and holds two patents; her honors include a Facebook Graduate Fellowship, LEGO Foundation-sponsored research, being a finalist for the Innovation by Design Awards, and the Scent Innovator Award from CEW and IFF.


Emanuela Maggioni is a Research Fellow in multisensory experiences at University College London (UCLIC), as well as director and co-founder of OWidgets, a university spinout establishing novel software and hardware solutions for smell experience design. Emanuela has a PhD in Experimental Psychology and is passionate about odors, emotions, and engineering solutions applied to HCI and AI perfumery. She works on both academic and industry projects (e.g. with the Benetton Group and Facebook/Oculus VR), which have led to over 40 publications and collaborative immersive-experience projects (such as Tree VR and Fly). Emanuela was awarded an Enterprise Fellowship from the Royal Academy of Engineering (RAEng, 2019).


Haruka Matsukura is an Assistant Professor at Osaka University’s Graduate School of Engineering Science. She completed her PhD at the Tokyo University of Agriculture and Technology in 2013 and joined Osaka University as an Assistant Professor in 2017. She was a JSPS Fellow (DC1) from 2010 to 2013. Haruka’s work includes the Smelling Screen and the Olfactory Assist Mask. Her research interests include olfactory displays, gas source localization robots, and body extension.


Marianna Obrist is Professor of Multisensory Interfaces at UCL, Department of Computer Science. She has established the SCHI Lab, an interdisciplinary research group investigating touch, taste, and smell as interaction modalities. She is a co-founder of OWidgets Ltd, a university start-up developing novel software and hardware solutions for smell experience design. She is an inaugural member of the ACM Future of Computing Academy and was selected as a Young Scientist in 2017 and 2018 to attend the World Economic Forum. She is a Visiting Professor at the Royal College of Art and has recently published the book Multisensory Experiences: Where the Senses Meet Technology.


Roshan Peiris is an Assistant Professor at the School of Information at the Rochester Institute of Technology. He currently leads the Altered Reality research group, which focuses on understanding human perception to create new technologies that can alter and enhance our experiences. His research areas include haptics, multisensory mixed reality technologies, wearable computing, and accessibility. Roshan received his PhD from the National University of Singapore in Integrative Sciences and Engineering and his BSc in Electrical Engineering from the University of Moratuwa. His work includes ThermoVR, ThermalBracelet, LiquidReality, and LiquidVR, and has received several awards at international events.


Nimesha Ranasinghe is an Assistant Professor at the School of Computing and Information Science and directs the Multisensory Interactive Media lab (MIM lab) at the University of Maine. He completed his PhD at the Department of Electrical and Computer Engineering, National University of Singapore (NUS) in 2013. Dr. Ranasinghe’s research interests include multisensory interactive media, human-computer interaction, and augmented and virtual reality. He is well known for his Digital Lollipop (a.k.a. Virtual Flavors) and Virtual Cocktail (Vocktail) inventions and has been featured in numerous media publications worldwide, including New Scientist, the New York Times, Time Magazine, BBC Radio, the Discovery Channel, and Reuters.