E-Elasticity: An Electrotactile Method to Simulate Elasticity
Overview
Accurately perceiving the elasticity of virtual objects is crucial in virtual reality; however, existing approaches are often cumbersome and impractical.
We introduce E-Elasticity, an electrotactile method for simulating elasticity in VR. The system determines whether the contact surface is in a static or slipping state based on the magnitude of stretching and the pinch force. It conveys this information through synchronized electrotactile and visual feedback, helping users understand and perceive the elasticity of the target object.
User studies have demonstrated that E-Elasticity enables participants to accurately distinguish predefined stiffness levels, offering a lightweight, efficient haptic solution for rendering elasticity in virtual environments.
Previous Work
This project builds on my earlier work, Slip-Grip (CHI ’25).
View this research →
Duration: Nov. 2024 – May 2025 (7 months)
Instructor: Teng Han (Director of HCI lab, Institute of Software, Chinese Academy of Sciences)
Key Words: Electrotactile Technology, VR, Elasticity Perception
Status: Submitted to CHI 2026
Haptic feedback is essential for enhancing realism and performance in virtual and augmented reality, teleoperation, and VR-based training. Elasticity, a fundamental property in object interaction, remains particularly challenging to render without mechanical actuation. Previous research has simulated elasticity with grounded devices, exoskeletons, or pseudo-haptics, but these approaches either lack realism or rely too heavily on vision. Slip and contact cues, which are critical to elasticity perception, remain underexplored.
We introduce E-Elasticity, a method that simulates elasticity via electrotactile stimulation. The system continuously monitors lateral motion and normal force, determines contact states (static friction/slip/release), and generates corresponding visual and electrotactile feedback to modulate the perceived elasticity.
- Hand motion is tracked using a Qualisys motion-capture system with three Arqus A5 cameras and QTM 2021.2.
- The state-determination and VR-rendering program was developed in Unity 3D with C#.
- Visual scenes are streamed to a Meta Quest 2 headset via Oculus Link over a wired connection.
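For illustration, the sketch below shows how the per-frame inputs from the tracking and sensing layers might be gathered on the Unity side. The component and interface names (FingertipInput, IMotionSource, IForceSource) and the contact threshold are assumptions for this sketch, not the actual implementation.

```csharp
using UnityEngine;

// Hypothetical wrapper interfaces standing in for the Qualisys marker stream
// and the SingleTact pressure readout (names assumed for this sketch).
public interface IMotionSource { Vector3 GetFingertipPosition(); }
public interface IForceSource { float GetForceNewtons(); }

public class FingertipInput : MonoBehaviour
{
    public IMotionSource Mocap;     // assigned at runtime by the tracking layer
    public IForceSource Pressure;   // assigned at runtime by the sensor driver

    public Vector3 LateralDisplacement { get; private set; }
    public float NormalForce { get; private set; }

    private Vector3 _contactOrigin;
    private bool _inContact;

    void Update()
    {
        Vector3 fingertip = Mocap.GetFingertipPosition();
        NormalForce = Pressure.GetForceNewtons();

        // Treat very small forces as "no contact" to reject sensor noise.
        bool touching = NormalForce > 0.05f;
        if (touching && !_inContact)
            _contactOrigin = fingertip;   // position where contact began
        _inContact = touching;

        // Tangential motion since first contact, i.e. the stretch magnitude
        // consumed by the state-determination step each frame.
        LateralDisplacement = _inContact ? fingertip - _contactOrigin : Vector3.zero;
    }
}
```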
The electrotactile system includes a driver unit, power module, fingertip interface, and pressure-sensor driver. The fingertip interface, worn on the dominant index finger, integrates two 7×7 FPC electrode arrays (index and thumb sides) and a thin-film SingleTact pressure sensor beneath the thumb-side array. All components are mounted on a rigid PLA frame with retroreflective motion-capture markers, while the pressure-sensor driver sits on the back of the hand.

Each electrode array is driven by a high-voltage module built with a Raspberry Pi Pico RP2040, Microchip HV513 chips, and a CH9120 Ethernet controller. The HV513 provides high-voltage outputs for independent electrode control (three states), and the 64-channel array updates within 1 μs. Communication runs via UDP over 10Base-T Ethernet. The system is powered by a 12 V DC supply, with converters generating low-voltage logic rails and a high-voltage HV264 amplifier (0–200 V). Safety features include current monitoring through a voltage divider and self-resetting fuses/relays that cut off high voltage during overload.
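To illustrate the UDP control path, here is a minimal C# sketch of pushing one electrode frame to the driver. The one-byte-per-channel packet layout, the port number, and the ElectrodeArrayClient name are assumptions made for illustration; the firmware's actual protocol is not specified here.

```csharp
using System.Net.Sockets;

// Minimal sketch of sending one electrode frame to the driver over UDP.
public static class ElectrodeArrayClient
{
    private static readonly UdpClient Udp = new UdpClient();

    // states: one byte per driver channel (64 channels as described above);
    // in this sketch 0 = off, 1 = anodic, 2 = cathodic (three-state control).
    public static void SendFrame(string driverIp, byte[] states, int port = 5000)
    {
        if (states.Length != 64) return;   // one byte per channel assumed
        Udp.Send(states, states.Length, driverIp, port);
    }
}
```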
Key Algorithm
The system continuously monitors the user’s lateral motion and normal force and determines the contact state (static, slip, or release) on every frame of the Unity program, providing the corresponding visual and electrotactile feedback. The distinction between static contact and slip depends on whether the actual grip force F_user exceeds the expected force F_exp; the calculation of the expected grip force required to prevent slipping (F_exp) is shown in Fig. E. The release state occurs when there is no contact between the fingertip and the object, during which the cylinder returns to its initial position at a constant velocity.
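The per-frame decision can be sketched as follows. Here a simple Coulomb-friction approximation, F_exp = (k · stretch) / μ, stands in for the expected-force calculation referenced in Fig. E, and the stiffness, friction coefficient, and contact threshold are assumed values rather than those used in the system.

```csharp
// Contact state used to drive visual and electrotactile rendering.
public enum ContactState { Release, Static, Slip }

public static class ContactStateMachine
{
    // Evaluated every Unity frame. The Coulomb-friction approximation below
    // is an illustrative stand-in for the Fig. E calculation; stiffness,
    // frictionCoeff, and contactThreshold are assumptions.
    public static ContactState Evaluate(
        float userForce,      // F_user: measured pinch force (N)
        float stretch,        // lateral stretch magnitude (m)
        float stiffness,      // k: rendered stiffness of the virtual object (N/m)
        float frictionCoeff,  // mu: assumed friction coefficient
        float contactThreshold = 0.05f)
    {
        if (userForce < contactThreshold)
            return ContactState.Release;   // no fingertip contact: object springs back

        float expectedForce = (stiffness * stretch) / frictionCoeff;  // F_exp
        return userForce >= expectedForce ? ContactState.Static : ContactState.Slip;
    }
}
```

The returned state then selects which visual and electrotactile feedback pattern is rendered on that frame.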
We conducted a series of experiments to systematically evaluate the perceptual effectiveness of the E-Elasticity system.
A. Twisting a wet cloth
B. Stretching tissue
C. Bending a USB cable
Force and kinesthetic feedback enhance learners’ reasoning and conceptual understanding. With its rich force and tactile cues, the E-Elasticity system can support tasks such as:
A. Perceiving pulleys
B. Perceiving elastic materials
C, D. Perceiving gears
E-Elasticity enables realistic haptic perception in VR gaming and everyday VR interactions, enhancing the accuracy of material perception:
A. Helps users control slingshots precisely
B. Simulates fabric elasticity in virtual fashion
E-Elasticity could also support haptic simulation of medical procedures:
A. Thyrocricocentesis surgery
B. Cardiovascular interventions