Investigating Unobtrusive Edible Tags using Digital Food Fabrication
Yamato Miyatake, Parinya Punpongsanon
Human-Food Interaction (HFI) examines how digital integration can enhance dining experiences, with food tagging playing a crucial role in connecting physical dining environments with digital information. Attaching optical tags to the surface or sides of food often detracts from its aesthetics, negatively impacting the perceived taste and overall dining experience. To address this issue, we propose an unobtrusive food tagging approach that embeds tags inside the food, maintaining both its visual appeal and sensory qualities. We first developed a tagging method using a 3D printer and proposed an end-to-end pipeline for embedding and retrieving the tags. We evaluated this method in terms of tag detectability, concealability, and the eating experience. Additionally, we developed tagging methods using molding and stamping to extend accessibility to traditional cooking environments. Through a workshop with three home chefs, we found that these methods are accessible and easy for novice users to adopt. Our findings demonstrate the potential of embedded food tagging to integrate digital information into the dining experience without compromising culinary integrity. This approach offers new directions for HFI research and practical applications.
Front. Nutr. 2025
Ko Kaki, Parinya Punpongsanon
Hand Redirection (HR) techniques have been widely studied in virtual reality (VR) to adjust the virtual hand's position relative to the user's real hand. HR works well in VR because the real hand is hidden from the user; in augmented reality (AR), however, the real hand remains visible alongside the virtual hand, which may weaken the illusion produced by VR-based HR techniques. To address this challenge, we conducted two exploratory experiments to evaluate whether VR-based HR techniques can be applied in AR without disrupting user performance, or whether additional elements are needed before applying HR in AR. Results indicate minimal impact on user behavior, supporting HR's potential as lightweight pseudo-haptic feedback in AR environments.
IEEE GCCE 2025
Investigating Olfactory Display in VR using a Small Drone
Unoki Ryousei, Parinya Punpongsanon
Providing olfactory stimuli in virtual reality (VR) environments is a promising approach to enhancing users' sense of presence. In this study, we propose a system that delivers olfactory stimuli to users with a drone. We implemented a workflow in which the drone autonomously retreats as the user approaches the olfactory spot, tested across different scenarios. Experimental results showed that closer retreat distances increased users' awareness of airflow, suggesting that physical sensations influence the overall experience.
IEEE GCCE 2025
A Preliminary Study on Minimal Interaction for Navigation Using Subtle Motion
Peng Bo, Parinya Punpongsanon
Accessible Virtual Reality (VR) navigation remains challenging, particularly using subtle hand gestures. We propose and evaluate SubtleNav, a system that enhances conventional hand tracking using real-time motion magnification. SubtleNav aims to enable precise and low-effort steering control in a VR environment. A preliminary study compared control success rates between SubtleNav and conventional hand tracking, showing that SubtleNav improves success rates for less dexterous fingers. However, it also introduces a trade-off, negatively impacting stability during fine control and state maintenance. This work demonstrates the feasibility of video magnification for enhancing subtle VR interactions while highlighting the key sensitivity-stability balance that requires further investigation.
IEEE GCCE 2025
Manipulating Virtual Liquid Weight Perception using Vibrating Illusion
Natthaphon Hatsakornkhanachok, Parinya Punpongsanon
Virtual laboratories (VLs) provide a safe environment for laboratory learning. However, they impose restrictions on haptic feedback, such as perceived weight or force feedback, that limit users' experiences. Our goal is to enhance the VL experience solely with a virtual reality (VR) controller, without requiring additional hardware. This paper explores the potential of combining vibration feedback from a VR controller with visual feedback to influence the perceived weight of virtual liquid in VR. Unlike prior work that focuses on simulating the weight of rigid objects, our work simulates virtual liquid weight perception using vibration frequency modulation. Through an exploratory study, our results showed that as vibration frequency increased, participants perceived the virtual liquid as heavier, and vice versa. As an initial step, we believe that our results demonstrate the potential to enhance perception in VLs through commercially available haptic devices and offer new opportunities to strengthen perception, realism, and interaction in VLs.
IEEE GCCE 2025

MIT Ethics of Computing Research Symposium, May 2025
Yamato Miyatake, Parinya Punpongsanon
Previous work in Human-Food Interaction has investigated how to embed tags into food. One promising approach involves controlling the internal structure using food 3D printers. However, this method is not widely accessible due to the current limitations of food 3D printers. This paper explores alternative fabrication techniques, molding and stamping, for embedding unobtrusive tags inside foods. Our preliminary evaluations showed that the proposed methods can embed tags and have the potential to employ a wide range of materials at lower cost than food 3D printing. These findings suggest that low-cost edible tagging technologies could become more accessible and versatile.
ACM SIGGRAPH Asia 2024

ACM UIST, Oct 2024

MIT CSAIL, Aug 2024
Atsushi Maki, Parinya Punpongsanon, Daisuke Iwai, Kosuke Sato
Perceived taste and flavor of food are greatly affected by visual information, and manipulating them through projection mapping (PM) has been explored. This paper focuses on another aspect of the gustatory sense: food texture. We investigate whether perceived food textures can be modified by a perceptually deforming effect in PM. Through a user study, we found that the perceived softness of a pudding could be significantly increased by amplifying the apparent movement of the pudding. Interestingly, other gustatory perceptions unrelated to food textures were not affected. We believe that this result expands the potential of PM-based modulation of gustatory perceptions.
The Virtual Reality Society of Japan (VRSJ) 2024
Parinya Punpongsanon
The Virtual Reality Society of Japan (VRSJ) 2024

MIT Expanding Horizons in Computing, Feb 2024

MIT CSAIL, Sep 2023


Stanford Human-Computer Interaction Seminar, Apr 2023

MIT Morningside Academy of Design, Dec 2022
Color Changeable 3D Printed Objects using Bi-Stable Thermochromic Materials
Yuto Umetsu, Parinya Punpongsanon, Takefumi Hiraki
In this study, we propose a method to control the color and pattern on the surface of 3D printed objects using bi-stable thermochromic materials. We mixed UV-curing resin with bi-stable thermochromic materials and selectively heated the surface of the 3D object with a laser (Fig. 1a). Unlike previous attempts that use a similar bi-stable mechanism, our method allows both fabricating 3D printed objects and later selectively changing their color. In addition, we fabricated various items using our proposed method and explored its application scenarios.
ACM SIGGRAPH Asia 2022

ACM KDD Visualization in Data Science (VDS) Workshop, Aug 2021

ACM SIGMOD HILDA Workshop, Jun 2020
Creating Food Perception Illusions using Food 3D Printing
Ying-Ju Lin, Parinya Punpongsanon, Xin Wen, Daisuke Iwai, Kosuke Sato, Marianna Obrist, Stefanie Mueller
Personalization of eating such that everyone consumes only what they need allows improving our management of food waste. In this paper, we explore the use of food 3D printing to create perceptual illusions for controlling the level of perceived satiety given a defined amount of calories. We present FoodFab, a system that allows users to control their food intake through modifying a food’s internal structure via two 3D printing parameters: infill pattern and infill density. In two experiments with a total of 30 participants, we studied the effect of these parameters on users’ chewing time that is known to affect people’s feeling of satiety. Our results show that we can indeed modify the chewing time by varying infill pattern and density, and thus control perceived satiety. Based on the results, we propose two computational models and integrate them into a user interface that simplifies the creation of personalized food structures.
ACM CHI 2020

Harvard Radcliffe Institute, Apr 2019
Extending the Range of Haptic Feedback on Virtual Hand using Drone-based Object Recognition
Tinglin Duan, Parinya Punpongsanon, Daisuke Iwai, Kosuke Sato
This paper presents a Head-Mounted Display (HMD) integrated system that uses a drone and a virtual hand to help users explore a remote environment. The system allows users to control the drone with hand gestures and identify Objects of Interest (OOI) through tactile feedback. The system uses a Convolutional Neural Network to perform object classification on drone-captured images and provides a virtual hand to realize interaction with the objects. Accordingly, tactile feedback is also provided to the users' hands to enhance virtual hand body ownership. The system aims to help users assess spaces and objects regardless of body limitations, which could not only benefit elderly or handicapped people but also make potential contributions to environment measurement and daily life.
ACM SIGGRAPH Asia 2018
Visually Manipulating Haptic Softness Perception in Spatial Augmented Reality
Parinya Punpongsanon, Daisuke Iwai, Kosuke Sato
We present SoftAR, a novel spatial augmented reality (AR) technique based on a pseudo-haptics mechanism that visually manipulates the sense of softness perceived by a user pushing a soft physical object. Considering the limitations of projection-based approaches that change only the surface appearance of a physical object, we propose two projection visual effects, i.e., surface deformation effect (SDE) and body appearance effect (BAE), on the basis of the observations of humans pushing physical objects (Figure 1). The SDE visualizes a two-dimensional deformation of the object surface with a controlled softness parameter, and BAE changes the color of the pushing hand. Through psychophysical experiments, we confirm that the SDE can manipulate softness perception such that the participant perceives significantly greater softness than the actual softness. Furthermore, fBAE, in which BAE is applied only for the finger area, significantly enhances manipulation of the perception of softness. We create a computational model that estimates perceived softness when SDE+fBAE is applied. We construct a prototype SoftAR system in which two application frameworks are implemented. The softness adjustment allows a user to adjust the softness parameter of a physical object, and the softness transfer allows the user to replace the softness with that of another object.
The sense of softness is an essential haptic cue that significantly affects the impressions of soft objects as well as the assessment of their material qualities. Softness properties should be designed carefully for various soft products, particularly those that must be comfortable for users, such as furniture (e.g., cushions and sofas), clothes (e.g., hats and shoes), and plush toys (e.g., dolls). For soft products that use imitation materials (e.g., artificial leather and fake fur), manufacturers have pursued not only textures but also softness of products that are similar to those of real materials to achieve high-quality products at low prices. The food industry also attempts to optimize the softness of food products, which can significantly affect the taste of the products. In social human-robot interaction, the softness of a robot's skin plays an important role because it determines the close physical interaction between robots and humans, and the impressions of the robot.
IEEE TVCG 2015