<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Projects | Haokun Wang</title><link>https://wanghaokun.site/project/</link><atom:link href="https://wanghaokun.site/project/index.xml" rel="self" type="application/rss+xml"/><description>Projects</description><generator>Hugo Blox Builder (https://hugoblox.com)</generator><language>en-us</language><lastBuildDate>Fri, 11 Oct 2024 00:00:00 +0000</lastBuildDate><image><url>https://wanghaokun.site/media/icon_hu_645fa481986063ef.png</url><title>Projects</title><link>https://wanghaokun.site/project/</link></image><item><title>Fiery Hands: Thermal-Tactile Glove for VR Object Manipulation</title><link>https://wanghaokun.site/project/fiery-hands/</link><pubDate>Fri, 11 Oct 2024 00:00:00 +0000</pubDate><guid>https://wanghaokun.site/project/fiery-hands/</guid><description>&lt;h2 id="overview">Overview&lt;/h2>
&lt;p>How do you make a virtual fire feel real on your hands? &lt;strong>Fiery Hands&lt;/strong> answers that question with a custom wearable thermal glove that delivers localized thermal &lt;em>and&lt;/em> tactile sensations to the palm and all five fingertips — without blocking the hand or preventing natural object manipulation in VR.&lt;/p>
&lt;p>Published at &lt;strong>ACM UIST 2024&lt;/strong> (the premier venue for novel interactive systems), this project represents a step change in how XR systems can deliver believable thermal touch.&lt;/p>
&lt;hr>
&lt;h2 id="the-problem">The Problem&lt;/h2>
&lt;p>Existing haptic gloves either:&lt;/p>
&lt;ul>
&lt;li>Cover the inner palm and fingertip surfaces, blocking touch and dexterity, or&lt;/li>
&lt;li>Place thermal actuators only on the back of the hand, limiting localized feedback&lt;/li>
&lt;/ul>
&lt;p>The challenge: thermal actuators are physically large (Peltier modules), slow (seconds to change temperature), and need direct skin contact. Placing enough of them to cover a hand while preserving freedom of movement seemed contradictory.&lt;/p>
&lt;p>&lt;strong>Research Question:&lt;/strong> Can we achieve the &lt;em>perception&lt;/em> of localized thermal feedback across the full hand using fewer actuators cleverly placed on non-obstructive body sites?&lt;/p>
&lt;hr>
&lt;h2 id="research-approach">Research Approach&lt;/h2>
&lt;p>We leveraged two perceptual phenomena from psychophysics:&lt;/p>
&lt;ol>
&lt;li>&lt;strong>Thermal Referral&lt;/strong> — the brain attributes a thermal sensation to a &lt;em>nearby&lt;/em> tactile stimulus site, not the actual thermal source. Heat felt elsewhere &amp;ldquo;moves&amp;rdquo; to where you&amp;rsquo;re touching.&lt;/li>
&lt;li>&lt;strong>Tactile Masking&lt;/strong> — a vibrotactile cue can suppress or redirect the perceived location of a thermal stimulus.&lt;/li>
&lt;/ol>
&lt;p>By combining strategically placed Peltier actuators on the &lt;em>outer&lt;/em> palm and back of fingers with vibrotactile motors at fingertip contact points, we could generate perceived thermal sensations &lt;em>at the fingertips&lt;/em> without physically touching them.&lt;/p>
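&lt;p>A minimal sketch of that routing idea (the finger-to-Peltier layout, clamp range, and function names are illustrative assumptions, not the paper&amp;rsquo;s calibration):&lt;/p>

```python
# Illustrative only: route a fingertip contact to a nearby dorsal Peltier plus
# a fingertip LRA pulse, so thermal referral relocates the perceived warmth
# to the touched fingertip. Layout and limits are hypothetical.

# Hypothetical assignment of dorsal Peltier modules to fingers.
PELTIER_FOR_FINGER = {"thumb": 0, "index": 1, "middle": 1, "ring": 2, "pinky": 2}

def route_contact(finger, surface_temp_c, skin_temp_c=33.0):
    """Return (peltier_id, thermal_delta_c, lra_on) for one contact event."""
    delta = surface_temp_c - skin_temp_c        # signed thermal command
    delta = max(-10.0, min(10.0, delta))        # clamp to a safe drive range
    lra_on = abs(delta) > 0.5                   # vibrate only when thermally active
    return PELTIER_FOR_FINGER[finger], delta, lra_on
```

&lt;p>Touching a hot virtual object with the index finger would then drive one dorsal Peltier to its warm limit while pulsing the index LRA.&lt;/p>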
&lt;hr>
&lt;h2 id="system-design">System Design&lt;/h2>
&lt;h3 id="hardware">Hardware&lt;/h3>
&lt;ul>
&lt;li>&lt;strong>Thermal actuators&lt;/strong>: 4 custom-fabricated Peltier modules (30 × 30 mm) mounted on the outer palm and finger dorsal surfaces&lt;/li>
&lt;li>&lt;strong>Tactile actuators&lt;/strong>: 5 coin-type LRA vibration motors, one at each inner fingertip&lt;/li>
&lt;li>&lt;strong>Controller&lt;/strong>: Arduino Mega with custom power amplifier board; Bluetooth LE to PC&lt;/li>
&lt;li>&lt;strong>Glove substrate&lt;/strong>: Thin spandex with 3D-printed actuator mounts — allows full grip&lt;/li>
&lt;/ul>
&lt;h3 id="unity-vr-integration">Unity VR Integration&lt;/h3>
&lt;ul>
&lt;li>Built in &lt;strong>Unity 2022 LTS&lt;/strong> with &lt;strong>OpenXR / XR Interaction Toolkit&lt;/strong>&lt;/li>
&lt;li>Custom C# &lt;code>HapticFeedbackManager&lt;/code> subscribes to XR physics collision events and maps contact surface temperature to actuator commands&lt;/li>
&lt;li>Real-time thermal rendering: fire = sustained warm + rhythmic vibration; ice = sustained cool + gentle pulse; metal = rapid ramp-up on contact&lt;/li>
&lt;li>Deployed on &lt;strong>Meta Quest 2&lt;/strong> via Quest Link (PC-tethered for full Peltier power budget)&lt;/li>
&lt;/ul>
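&lt;p>The material-to-command mapping above can be sketched as a small profile table (the profile values and ramp times here are made-up placeholders, not the shipped tuning):&lt;/p>

```python
# Illustrative sketch: each virtual material maps to a thermal setpoint plus
# a vibration pattern; contact ramps the command toward the setpoint.
# All numbers are hypothetical placeholders.

MATERIAL_PROFILES = {
    # material: thermal delta (deg C), vibration pattern, ramp time (s)
    "fire":  {"delta_c": 8.0,  "vibe": "rhythmic", "ramp_s": 2.0},
    "ice":   {"delta_c": -6.0, "vibe": "pulse",    "ramp_s": 2.0},
    "metal": {"delta_c": 5.0,  "vibe": None,       "ramp_s": 0.3},  # rapid ramp-up
}

def on_contact(material, t_since_contact_s):
    """Return (commanded thermal delta, vibration pattern) at time t after contact."""
    p = MATERIAL_PROFILES[material]
    frac = min(1.0, t_since_contact_s / p["ramp_s"])  # linear ramp toward setpoint
    return p["delta_c"] * frac, p["vibe"]
```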
&lt;hr>
&lt;h2 id="user-evaluation">User Evaluation&lt;/h2>
&lt;h3 id="study-1--thermal-localization">Study 1 — Thermal Localization&lt;/h3>
&lt;ul>
&lt;li>&lt;strong>N = 12 participants&lt;/strong>, within-subject design&lt;/li>
&lt;li>&lt;strong>Task&lt;/strong>: report the finger at which the thermal stimulus was perceived while only the dorsal actuators were active&lt;/li>
&lt;li>&lt;strong>Conditions&lt;/strong>: 4 Peltier positions × 3 temperature levels (neutral, warm, hot), plus a palm-only thermal baseline&lt;/li>
&lt;li>&lt;strong>Measure&lt;/strong>: accuracy of localization, JND (just-noticeable difference)&lt;/li>
&lt;/ul>
&lt;h3 id="study-2--vr-interaction-plausibility">Study 2 — VR Interaction Plausibility&lt;/h3>
&lt;ul>
&lt;li>&lt;strong>N = 16 participants&lt;/strong>&lt;/li>
&lt;li>&lt;strong>Task&lt;/strong>: interact with three virtual objects (glowing coal, ice block, metal rod) and rate realism&lt;/li>
&lt;li>&lt;strong>Conditions&lt;/strong>: thermal-only, tactile-only, thermal+tactile (Fiery Hands), and no-feedback baseline&lt;/li>
&lt;li>&lt;strong>Measures&lt;/strong>: NASA-TLX, immersion subscale, perceived temperature match (7-pt Likert)&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="results--key-findings">Results &amp;amp; Key Findings&lt;/h2>
&lt;ul>
&lt;li>&lt;strong>Localization accuracy: 84%&lt;/strong> — participants correctly identified the stimulated finger using only dorsal Peltier placement, validating the thermal referral strategy&lt;/li>
&lt;li>&lt;strong>Plausibility rating&lt;/strong> of thermal+tactile condition was &lt;strong>significantly higher&lt;/strong> than any single-modality condition (F(3,45)=18.4, p&amp;lt;.001, η²=0.55)&lt;/li>
&lt;li>Users reported the coal interaction as &amp;ldquo;surprisingly convincing&amp;rdquo; — qualitative themes: warmth buildup over time felt organic, not mechanical&lt;/li>
&lt;li>Power consumption reduced by &lt;strong>60%&lt;/strong> vs. placing individual Peltiers at each fingertip while achieving comparable perceptual quality&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="impact">Impact&lt;/h2>
&lt;ul>
&lt;li>📄 Published: &lt;strong>ACM UIST 2024&lt;/strong> — &lt;em>Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology&lt;/em>&lt;/li>
&lt;li>Inspired follow-on work on thermally-integrated wearables for extended wear XR sessions&lt;/li>
&lt;/ul></description></item><item><title>Let It Snow: Cross-Modal Cold &amp; Touch for VR Snowfall</title><link>https://wanghaokun.site/project/let-it-snow/</link><pubDate>Wed, 15 May 2024 00:00:00 +0000</pubDate><guid>https://wanghaokun.site/project/let-it-snow/</guid><description>&lt;h2 id="overview">Overview&lt;/h2>
&lt;p>&lt;strong>Let It Snow&lt;/strong> is a hands-free, wearable-free haptic experience: users hold their bare hands over a custom mid-air display that simultaneously fires focused ultrasound pressure points and directed cold airflow to simulate individual snowflakes landing — or rain drops splattering — on their palms.&lt;/p>
&lt;p>Published in &lt;strong>ACM IMWUT 2024&lt;/strong> (Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies), the project explores how cross-modal cold–tactile pairing creates emergent sensory illusions greater than either cue alone.&lt;/p>
&lt;hr>
&lt;h2 id="the-problem">The Problem&lt;/h2>
&lt;p>Simulating precipitation in VR is a classic immersion gap. Visually, snow and rain can look photorealistic. But without &lt;em>feeling&lt;/em> the cold, the wet, the gentle impact — users never quite believe it. Existing approaches require worn devices, which break the &amp;ldquo;bare hand in the weather&amp;rdquo; fantasy entirely.&lt;/p>
&lt;p>&lt;strong>Core Questions:&lt;/strong>&lt;/p>
&lt;ul>
&lt;li>Can cold airflow and ultrasound pressure co-localize in mid-air to synthesize a snowflake or raindrop percept?&lt;/li>
&lt;li>Do cold and tactile cues mask each other, or can they be independently perceived at the same skin location?&lt;/li>
&lt;li>How should aggregated stimuli be rendered for heavy snowfall / rainfall?&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="research-approach">Research Approach&lt;/h2>
&lt;p>We drew on &lt;strong>cross-modal sensory integration&lt;/strong> theory: cold and tactile channels are processed by separate neural pathways (thermoreceptors vs. mechanoreceptors), so two signals can coexist without mutual interference — unlike, say, two sounds at the same frequency.&lt;/p>
&lt;p>Key hypothesis: a brief cold puff + simultaneous pressure focus = snowflake percept; a sharp cold burst + faster pressure = raindrop percept.&lt;/p>
&lt;p>We also designed an &lt;strong>aggregated haptic scheme&lt;/strong> for particle-dense scenes: rather than rendering every particle individually (physically impossible), we modulate cold intensity and pressure density proportionally to particle count, exploiting temporal summation in both sensory channels.&lt;/p>
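&lt;p>The aggregated scheme can be sketched as a compressive transfer function (the constants and the logarithmic form are illustrative stand-ins for the tuned mapping):&lt;/p>

```python
# Illustrative sketch: compress a per-frame particle impact count into one
# cold-intensity level and one ultrasound amplitude, instead of rendering
# every particle. Gains k_cold and k_press are hypothetical.
import math

def aggregate(impacts_this_frame, k_cold=0.35, k_press=0.25):
    """Map a particle impact count to normalized (cold, pressure) drive levels.

    Logarithmic compression stands in for temporal summation: doubling the
    particle count should feel stronger, but not twice as strong.
    """
    if impacts_this_frame == 0:
        return 0.0, 0.0
    x = math.log1p(impacts_this_frame)
    return min(1.0, k_cold * x), min(1.0, k_press * x)
```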
&lt;hr>
&lt;h2 id="system-design">System Design&lt;/h2>
&lt;h3 id="hardware">Hardware&lt;/h3>
&lt;ul>
&lt;li>&lt;strong>Cold array&lt;/strong>: 6 Peltier modules (20 × 20 mm) mounted in a ring, each with a micro-fan to direct cold air toward the focus point; temperature range: 5°C–15°C below ambient&lt;/li>
&lt;li>&lt;strong>Ultrasound haptic display&lt;/strong>: Ultrahaptics STRATOS Inspire — 256 transducers at 40 kHz, creating mid-air pressure foci up to 200 mN at distances up to 22 cm&lt;/li>
&lt;li>&lt;strong>Depth tracking&lt;/strong>: Intel RealSense D435 hand tracking, integrated into Unity for palm position → focus point mapping&lt;/li>
&lt;li>&lt;strong>Control PC&lt;/strong>: Custom C++ driver for thermal timing; Unity handles audio, visuals, and hand tracking&lt;/li>
&lt;/ul>
&lt;h3 id="unity-vr-integration">Unity VR Integration&lt;/h3>
&lt;ul>
&lt;li>Built in &lt;strong>Unity 2021 LTS&lt;/strong>, standalone VR scene with Oculus Integration SDK&lt;/li>
&lt;li>Particle system drives two managers:
&lt;ul>
&lt;li>&lt;code>SnowRenderer&lt;/code>: handles visual particles with collision callbacks to trigger haptic events&lt;/li>
&lt;li>&lt;code>HapticAggregator&lt;/code>: accumulates per-frame particle counts, applies transfer function to Peltier intensity and ultrasound amplitude&lt;/li>
&lt;/ul>
&lt;/li>
&lt;li>Snowflake percept: 150 ms cold puff + 40 Hz pressure burst; Raindrop: 60 ms sharp cold + 200 Hz single-pulse&lt;/li>
&lt;li>Scene contains interactive environments: snowy mountain valley, rainstorm on a city rooftop&lt;/li>
&lt;/ul>
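&lt;p>The two percepts above reduce to a small dispatch table (the structure and names are illustrative; the timing and frequency values are the ones quoted above):&lt;/p>

```python
# Sketch only: dispatch the quoted snowflake/raindrop parameters for one
# particle landing. Function and field names are hypothetical.

PERCEPTS = {
    "snowflake": {"cold_ms": 150, "pressure_hz": 40,  "pulses": None},  # sustained burst
    "raindrop":  {"cold_ms": 60,  "pressure_hz": 200, "pulses": 1},     # single sharp pulse
}

def schedule(percept, onset_ms):
    """Return (cold puff interval, pressure spec) for one particle landing."""
    p = PERCEPTS[percept]
    cold = (onset_ms, onset_ms + p["cold_ms"])   # cold puff on/off times
    pressure = (p["pressure_hz"], p["pulses"])   # modulation frequency, pulse count
    return cold, pressure
```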
&lt;hr>
&lt;h2 id="user-evaluation">User Evaluation&lt;/h2>
&lt;h3 id="perceptual-study--cold--tactile-independence">Perceptual Study — Cold × Tactile Independence&lt;/h3>
&lt;ul>
&lt;li>&lt;strong>N = 14 participants&lt;/strong>&lt;/li>
&lt;li>&lt;strong>Design&lt;/strong>: 2 (cold present/absent) × 2 (tactile present/absent) × 5 repetitions&lt;/li>
&lt;li>&lt;strong>Measure&lt;/strong>: detection accuracy per modality, reported interference rating&lt;/li>
&lt;li>&lt;strong>Finding&lt;/strong>: No significant cross-modal masking — participants detected the cold and tactile cues independently (d&amp;rsquo; &amp;gt; 2.5 for each modality)&lt;/li>
&lt;/ul>
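&lt;p>For reference, a sensitivity figure like the d&amp;rsquo; above comes from the standard signal-detection formula; the rates in the example below are invented, not the study&amp;rsquo;s data:&lt;/p>

```python
# Standard signal-detection sensitivity: d' = z(hit rate) - z(false-alarm rate).
from statistics import NormalDist

def d_prime(hit_rate, false_alarm_rate):
    z = NormalDist().inv_cdf   # inverse standard-normal CDF (z-score)
    return z(hit_rate) - z(false_alarm_rate)
```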
&lt;h3 id="experience-study--aggregated-rendering-comparison">Experience Study — Aggregated Rendering Comparison&lt;/h3>
&lt;ul>
&lt;li>&lt;strong>N = 20 participants&lt;/strong>, within-subject&lt;/li>
&lt;li>&lt;strong>Conditions&lt;/strong>: (1) no haptics, (2) tactile-only, (3) cold-only, (4) Snow (cold+tactile sparse), (5) Snow (cold+tactile aggregated)&lt;/li>
&lt;li>&lt;strong>Measures&lt;/strong>: presence subscale (IPQ), realism rating, preference ranking&lt;/li>
&lt;li>&lt;strong>Task&lt;/strong>: 3-minute free exploration of snowy mountain scene, 3-minute rainstorm scene&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="results--key-findings">Results &amp;amp; Key Findings&lt;/h2>
&lt;ul>
&lt;li>&lt;strong>Aggregated scheme rated significantly more realistic&lt;/strong> than sparse individual-particle scheme (p&amp;lt;.01) for heavy snowfall&lt;/li>
&lt;li>Cold+tactile combination rated &lt;strong>+1.8 points&lt;/strong> on 7-pt presence scale vs. tactile-only (p&amp;lt;.001)&lt;/li>
&lt;li>18/20 participants preferred the full cross-modal condition; primary qualitative theme: &amp;ldquo;it actually felt cold and real, like being outside&amp;rdquo;&lt;/li>
&lt;li>System achieved stable cold delivery at ±0.3°C variance across a 10-minute continuous session&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="impact">Impact&lt;/h2>
&lt;ul>
&lt;li>📄 Published: &lt;strong>ACM IMWUT 2024&lt;/strong> — &lt;em>Proc. ACM Interact. Mob. Wearable Ubiquitous Technol.&lt;/em>&lt;/li>
&lt;li>Framework for aggregated haptic rendering has been adopted in follow-on multi-particle VR haptics research&lt;/li>
&lt;/ul></description></item><item><title>Thermal Masking: When the Illusion Takes Over the Real</title><link>https://wanghaokun.site/project/thermal-masking/</link><pubDate>Sat, 11 May 2024 00:00:00 +0000</pubDate><guid>https://wanghaokun.site/project/thermal-masking/</guid><description>&lt;h2 id="overview">Overview&lt;/h2>
&lt;p>&lt;strong>Thermal Masking&lt;/strong> is a newly characterized perceptual illusion: when a vibrotactile stimulus is applied near a thermal stimulus, the &lt;em>perceived location&lt;/em> of warmth completely jumps to the tactile site — the original thermal signal vanishes from conscious perception. This &amp;ldquo;masking&amp;rdquo; is distinct from previously known thermal referral and has profound implications for wearable haptic design.&lt;/p>
&lt;p>Published at &lt;strong>ACM CHI 2024&lt;/strong> (the flagship venue for human–computer interaction research), this work provides the first systematic characterization of thermal masking on the human arm.&lt;/p>
&lt;hr>
&lt;h2 id="the-problem">The Problem&lt;/h2>
&lt;p>Thermal feedback in wearables is expensive: Peltier modules are bulky, power-hungry, and slow. Covering a large body area (like the back or full arm) with thermal actuators is impractical. Prior work showed that thermal sensations can be &amp;ldquo;referred&amp;rdquo; — stretched toward a tactile stimulus. But we suspected a more dramatic effect existed.&lt;/p>
&lt;p>&lt;strong>Hypothesis:&lt;/strong> A single vibrotactile cue could &lt;em>completely suppress&lt;/em> the original thermal percept, not just shift it — enabling sparse thermal + dense tactile arrays to cover large body areas affordably.&lt;/p>
&lt;hr>
&lt;h2 id="research-approach">Research Approach&lt;/h2>
&lt;p>Three controlled psychophysical experiments on the forearm, each systematically varying one factor:&lt;/p>
&lt;table>
&lt;thead>
&lt;tr>
&lt;th>Experiment&lt;/th>
&lt;th>Manipulated Variable&lt;/th>
&lt;th>Key Question&lt;/th>
&lt;/tr>
&lt;/thead>
&lt;tbody>
&lt;tr>
&lt;td>1&lt;/td>
&lt;td>Temperature level&lt;/td>
&lt;td>Does masking occur more at warm vs. hot vs. cold?&lt;/td>
&lt;/tr>
&lt;tr>
&lt;td>2&lt;/td>
&lt;td>Thermal-to-tactile distance&lt;/td>
&lt;td>How far can masking propagate?&lt;/td>
&lt;/tr>
&lt;tr>
&lt;td>3&lt;/td>
&lt;td>Actuator placement (same side vs. opposite side of arm)&lt;/td>
&lt;td>Does masking cross body-part boundaries?&lt;/td>
&lt;/tr>
&lt;/tbody>
&lt;/table>
&lt;p>Participants reported where they felt the temperature (thermal site, tactile site, or both) on each trial. Masking was defined as reporting &lt;em>only&lt;/em> the tactile site despite the thermal actuator being active elsewhere.&lt;/p>
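&lt;p>That criterion can be stated as a one-line trial classifier (the report labels are hypothetical shorthand for the three response options):&lt;/p>

```python
# Sketch: a trial counts as masked only when the participant reports the
# tactile site alone while the thermal actuator was active elsewhere.

def masking_rate(reports):
    """reports: list of 'thermal', 'tactile', or 'both' localization answers."""
    masked = sum(1 for r in reports if r == "tactile")
    return masked / len(reports)
```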
&lt;hr>
&lt;h2 id="apparatus">Apparatus&lt;/h2>
&lt;ul>
&lt;li>&lt;strong>Thermal actuator&lt;/strong>: Single Peltier module (40 × 40 mm), range 20°C – 45°C (cold, neutral, warm, hot conditions)&lt;/li>
&lt;li>&lt;strong>Tactile actuator&lt;/strong>: ERM (eccentric rotating mass) motor, 180 Hz, 1.5 G amplitude&lt;/li>
&lt;li>&lt;strong>Placement rig&lt;/strong>: 3D-printed sliding rail on the forearm allowing 2–24 cm inter-actuator distance&lt;/li>
&lt;li>All stimuli were synchronized via Arduino with 1 ms timing precision&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="user-evaluation">User Evaluation&lt;/h2>
&lt;ul>
&lt;li>&lt;strong>Total N = 48 participants&lt;/strong> (16 per experiment), all naïve to the hypothesis&lt;/li>
&lt;li>&lt;strong>Design&lt;/strong>: fully within-subject with counterbalanced ordering&lt;/li>
&lt;li>&lt;strong>Trial structure&lt;/strong>: 3 s baseline → simultaneous thermal + tactile onset for 5 s → localization report → 30 s ISI for thermal recovery&lt;/li>
&lt;li>&lt;strong>Measures&lt;/strong>: localization accuracy (thermal site / tactile site / both), response time, confidence rating&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="results--key-findings">Results &amp;amp; Key Findings&lt;/h2>
&lt;p>&lt;strong>Experiment 1 — Temperature Level:&lt;/strong>&lt;/p>
&lt;ul>
&lt;li>Thermal masking rate: &lt;strong>warm: 73%&lt;/strong>, hot: 41%, cold: 38%&lt;/li>
&lt;li>Warm conditions produced significantly higher masking than hot or cold (χ²(2)=24.3, p&amp;lt;.001)&lt;/li>
&lt;li>Implication: warm stimulation (≈35–38°C) is the optimal operating zone for masking-based designs&lt;/li>
&lt;/ul>
&lt;p>&lt;strong>Experiment 2 — Distance:&lt;/strong>&lt;/p>
&lt;ul>
&lt;li>Masking persisted up to &lt;strong>24 cm&lt;/strong> from the thermal actuator — nearly the full forearm length&lt;/li>
&lt;li>Masking rate decayed logarithmically with distance (r²=0.91)&lt;/li>
&lt;li>Practical finding: one thermal module can plausibly cover the entire forearm with tactile array assist&lt;/li>
&lt;/ul>
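&lt;p>A logarithmic decay of this kind can be fit with ordinary least squares on rate vs. ln(distance); the sketch below runs on synthetic data, not the study&amp;rsquo;s measurements:&lt;/p>

```python
# Fit rate = a + b*ln(d) by closed-form simple linear regression.
import math

def fit_log_decay(distances_cm, rates):
    xs = [math.log(d) for d in distances_cm]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(rates) / n
    sxx = sum((x - mx) ** 2 for x in xs)                       # variance term
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, rates))  # covariance term
    b = sxy / sxx
    a = my - b * mx
    return a, b
```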
&lt;p>&lt;strong>Experiment 3 — Opposite Side:&lt;/strong>&lt;/p>
&lt;ul>
&lt;li>Masking also occurred when the tactile actuator was placed on the &lt;em>opposite&lt;/em> side of the arm (dorsal vs. volar)&lt;/li>
&lt;li>Rate: 58% — lower than same-side but still above chance (p&amp;lt;.001)&lt;/li>
&lt;li>Opens door to through-limb sensing designs (e.g., armband with actuators only on outer surface)&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="design-implications">Design Implications&lt;/h2>
&lt;p>These findings directly shaped the actuator placement strategy in subsequent projects in this research line, most directly &lt;em>Fiery Hands&lt;/em>: by exploiting thermal masking, thermal actuators could be placed exclusively on non-obstructive surfaces while still delivering perceived localized warmth at the inner contact points.&lt;/p>
&lt;hr>
&lt;h2 id="impact">Impact&lt;/h2>
&lt;ul>
&lt;li>📄 Published: &lt;strong>ACM CHI 2024&lt;/strong> — &lt;em>Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems&lt;/em>&lt;/li>
&lt;li>Citation: Haokun Wang, Yatharth Singhal, Hyunjae Gil, Jin Ryong Kim. &amp;ldquo;Thermal Masking: When the Illusion Takes Over the Real.&amp;rdquo; CHI &amp;rsquo;24.&lt;/li>
&lt;/ul></description></item><item><title>Fabric Thermal Display: Ultrasound-Heated Wearable for VR</title><link>https://wanghaokun.site/project/fabric-thermal-display/</link><pubDate>Fri, 01 Sep 2023 00:00:00 +0000</pubDate><guid>https://wanghaokun.site/project/fabric-thermal-display/</guid><description>&lt;h2 id="overview">Overview&lt;/h2>
&lt;p>Standard thermal wearables rely on Peltier thermoelectric modules — rigid, thick, and power-intensive. &lt;strong>Fabric Thermal Display&lt;/strong> takes a different approach: weave thermally-conductive materials (copper, aluminum mesh) into a fabric glove and excite them with focused ultrasonic waves. The friction heats the conductive fibers, delivering warmth through the fabric itself.&lt;/p>
&lt;p>Published at &lt;strong>IEEE ISMAR 2023&lt;/strong> (IEEE International Symposium on Mixed and Augmented Reality), this project delivers a proof-of-concept for ultrasound-driven textile thermal displays and demonstrates their use in VR object interaction scenarios.&lt;/p>
&lt;hr>
&lt;h2 id="the-problem">The Problem&lt;/h2>
&lt;p>Peltier-based thermal gloves work, but they have hard constraints:&lt;/p>
&lt;ul>
&lt;li>&lt;strong>Thickness&lt;/strong>: modules are 3–5 mm thick, stiff, and change the hand&amp;rsquo;s natural shape&lt;/li>
&lt;li>&lt;strong>Power&lt;/strong>: each Peltier draws 3–10 W continuously&lt;/li>
&lt;li>&lt;strong>Scalability&lt;/strong>: covering all fingers requires 5+ modules, complicated wiring, and custom PCBs&lt;/li>
&lt;/ul>
&lt;p>Could fabric itself become the thermal actuator — flexible, lightweight, and able to conform to any body shape?&lt;/p>
&lt;p>&lt;strong>Research Question:&lt;/strong> Which fabric materials respond best to 40 kHz ultrasonic excitation, and can combinations with conductive materials achieve perceptually meaningful warmth for VR?&lt;/p>
&lt;hr>
&lt;h2 id="research-approach">Research Approach&lt;/h2>
&lt;p>We started with a &lt;strong>material science study&lt;/strong> before touching user testing:&lt;/p>
&lt;ol>
&lt;li>&lt;strong>Characterization phase&lt;/strong>: apply ultrasonic energy to 5 fabric types (polyester, cotton, nylon, Lycra, carbon-fiber blend), measure temperature rise over 30 s at three amplitude levels&lt;/li>
&lt;li>&lt;strong>Composite phase&lt;/strong>: integrate the best fabric (polyester) with copper mesh and aluminum foil, compare thermal curves&lt;/li>
&lt;li>&lt;strong>Perceptual phase&lt;/strong>: user study on thermal detection and level identification with the best material combination&lt;/li>
&lt;li>&lt;strong>Application phase&lt;/strong>: integrate into a glove form factor, demonstrate VR use cases&lt;/li>
&lt;/ol>
&lt;hr>
&lt;h2 id="system-design">System Design&lt;/h2>
&lt;h3 id="ultrasound-setup">Ultrasound Setup&lt;/h3>
&lt;ul>
&lt;li>&lt;strong>Ultrasound driver&lt;/strong>: Ultrahaptics STRATOS board, 40 kHz carrier, amplitude-modulated 0–100%&lt;/li>
&lt;li>&lt;strong>Focus geometry&lt;/strong>: single focal point directed at 15 cm standoff, corresponding to palm contact zone of glove&lt;/li>
&lt;li>&lt;strong>Thermal measurement&lt;/strong>: FLIR A315 thermal camera captured surface temperature maps at 9 Hz&lt;/li>
&lt;/ul>
&lt;h3 id="fabric-samples">Fabric Samples&lt;/h3>
&lt;table>
&lt;thead>
&lt;tr>
&lt;th>Material&lt;/th>
&lt;th>Peak Temp Rise (100% amp, 30s)&lt;/th>
&lt;th>Flexibility&lt;/th>
&lt;th>Notes&lt;/th>
&lt;/tr>
&lt;/thead>
&lt;tbody>
&lt;tr>
&lt;td>Polyester&lt;/td>
&lt;td>+18.4°C&lt;/td>
&lt;td>High&lt;/td>
&lt;td>Best performance&lt;/td>
&lt;/tr>
&lt;tr>
&lt;td>Cotton&lt;/td>
&lt;td>+9.1°C&lt;/td>
&lt;td>High&lt;/td>
&lt;td>Poor — high thermal mass&lt;/td>
&lt;/tr>
&lt;tr>
&lt;td>Nylon&lt;/td>
&lt;td>+12.3°C&lt;/td>
&lt;td>Medium&lt;/td>
&lt;td>Acceptable&lt;/td>
&lt;/tr>
&lt;tr>
&lt;td>Lycra&lt;/td>
&lt;td>+7.8°C&lt;/td>
&lt;td>Very high&lt;/td>
&lt;td>Too low output&lt;/td>
&lt;/tr>
&lt;tr>
&lt;td>Carbon fiber&lt;/td>
&lt;td>+21.1°C&lt;/td>
&lt;td>Low&lt;/td>
&lt;td>Best thermal, too stiff&lt;/td>
&lt;/tr>
&lt;/tbody>
&lt;/table>
&lt;p>&lt;strong>Winner: Polyester + Aluminum&lt;/strong> — +22.6°C peak, flexible, washable&lt;/p>
&lt;h3 id="glove-design">Glove Design&lt;/h3>
&lt;ul>
&lt;li>Polyester base with 0.1 mm aluminum foil laminate on palm zone&lt;/li>
&lt;li>Total glove weight: 28 g (vs. 95 g for Peltier glove baseline)&lt;/li>
&lt;li>No wiring — ultrasound is contactless&lt;/li>
&lt;/ul>
&lt;h3 id="unity-vr-integration">Unity VR Integration&lt;/h3>
&lt;ul>
&lt;li>&lt;strong>Unity 2021 LTS&lt;/strong> + Oculus Integration SDK (Quest 2)&lt;/li>
&lt;li>Custom &lt;code>FabricHapticManager&lt;/code>: maps virtual object surface temperature to ultrasound amplitude via lookup table&lt;/li>
&lt;li>Demonstrated VR scenarios: picking up hot metal ingot, holding warm beverage, touching cold ice sculpture&lt;/li>
&lt;li>Haptic rendering loop runs at 90 Hz, matching display refresh&lt;/li>
&lt;/ul>
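&lt;p>The lookup-table mapping can be sketched as piecewise-linear interpolation over calibration pairs (the pairs below are hypothetical, not the measured calibration):&lt;/p>

```python
# Illustrative (temperature deg C, amplitude fraction) calibration pairs,
# sorted by temperature. Values are hypothetical.
LUT = [(33.0, 0.0), (40.0, 0.38), (48.0, 0.70), (60.0, 1.0)]

def amplitude_for(temp_c):
    """Piecewise-linear interpolation over the calibration table."""
    if LUT[0][0] >= temp_c:
        return 0.0
    if temp_c >= LUT[-1][0]:
        return 1.0
    for (t0, a0), (t1, a1) in zip(LUT, LUT[1:]):
        if t1 >= temp_c:   # temp_c falls in this segment
            return a0 + (a1 - a0) * (temp_c - t0) / (t1 - t0)
```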
&lt;hr>
&lt;h2 id="user-evaluation">User Evaluation&lt;/h2>
&lt;h3 id="study-1--detection-thresholds">Study 1 — Detection Thresholds&lt;/h3>
&lt;ul>
&lt;li>&lt;strong>N = 12 participants&lt;/strong>&lt;/li>
&lt;li>&lt;strong>Task&lt;/strong>: warmth detection across 5 amplitude levels in a two-alternative forced-choice (2-AFC) paradigm&lt;/li>
&lt;li>&lt;strong>Measure&lt;/strong>: warm detection threshold (WDT)&lt;/li>
&lt;li>&lt;strong>Result&lt;/strong>: mean WDT = 38% amplitude (≈ +7.2°C skin surface delta)&lt;/li>
&lt;/ul>
&lt;h3 id="study-2--level-identification-thermal-jnds">Study 2 — Level Identification (Thermal JNDs)&lt;/h3>
&lt;ul>
&lt;li>&lt;strong>N = 16 participants&lt;/strong>&lt;/li>
&lt;li>&lt;strong>Task&lt;/strong>: categorize warmth into 4 levels (none, low, medium, high) from ultrasound-heated glove&lt;/li>
&lt;li>&lt;strong>Condition&lt;/strong>: fabric-only vs. fabric+copper vs. fabric+aluminum&lt;/li>
&lt;li>&lt;strong>Result&lt;/strong>: fabric+aluminum achieved &lt;strong>78% accuracy&lt;/strong> for 4-level identification, significantly outperforming fabric-only (54%, p&amp;lt;.01) and fabric+copper (66%, p&amp;lt;.05)&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="results--key-findings">Results &amp;amp; Key Findings&lt;/h2>
&lt;ul>
&lt;li>Polyester is the optimal base fabric for ultrasonic thermal generation among tested materials&lt;/li>
&lt;li>Aluminum lamination provides +4.2°C improvement over copper at the same power setting&lt;/li>
&lt;li>Users could reliably distinguish 4 thermal levels through the glove, meeting the threshold needed for meaningful VR thermal rendering&lt;/li>
&lt;li>No participant reported discomfort over 20-minute continuous wear sessions&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="impact">Impact&lt;/h2>
&lt;ul>
&lt;li>📄 Published: &lt;strong>IEEE ISMAR 2023&lt;/strong> — &lt;em>IEEE International Symposium on Mixed and Augmented Reality&lt;/em>&lt;/li>
&lt;li>The material findings fed directly into the Fiery Hands glove substrate design and the broader thermal-wearables research program at MI Lab&lt;/li>
&lt;/ul></description></item><item><title>Mid-Air Thermo-Tactile Fire: Ultrasound Haptic Display for VR</title><link>https://wanghaokun.site/project/mid-air-fire-haptics/</link><pubDate>Wed, 01 Sep 2021 00:00:00 +0000</pubDate><guid>https://wanghaokun.site/project/mid-air-fire-haptics/</guid><description>&lt;h2 id="overview">Overview&lt;/h2>
&lt;p>Imagine reaching toward a virtual campfire and actually feeling the heat wash over your hands — no gloves, no controllers, nothing on your skin. &lt;strong>Mid-Air Thermo-Tactile Fire&lt;/strong> is a proof-of-concept system that delivers both thermal warmth and vibrotactile pressure to a free hand hovering above a custom device, using a combination of heated airflow channels and a 40 kHz ultrasound haptic array.&lt;/p>
&lt;p>Published at &lt;strong>ACM VRST 2021&lt;/strong> (ACM Symposium on Virtual Reality Software and Technology), this was the first system to simultaneously characterize thermo-tactile mid-air feedback thresholds and demonstrate them in a VR fire interaction scenario.&lt;/p>
&lt;hr>
&lt;h2 id="the-problem">The Problem&lt;/h2>
&lt;p>Mid-air haptics (ultrasound) had proven that focused pressure can be delivered without contact. Thermal mid-air feedback existed in industrial settings (heat lamps). But &lt;strong>simultaneously combining both&lt;/strong> — localized, controllable, synchronized — for real-time VR had not been demonstrated.&lt;/p>
&lt;p>Key unknowns at project start:&lt;/p>
&lt;ul>
&lt;li>What temperature range can be achieved mid-air at realistic interaction distances (15–25 cm)?&lt;/li>
&lt;li>Does the ultrasonic pressure signal interfere with thermal perception (or vice versa)?&lt;/li>
&lt;li>What warm detection threshold (WDT) and heat pain detection threshold (HPDT) apply to mid-air vs. contact thermal stimulation?&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="system-design">System Design&lt;/h2>
&lt;h3 id="hardware-architecture">Hardware Architecture&lt;/h3>
&lt;ul>
&lt;li>&lt;strong>Ultrasound display&lt;/strong>: 16×16 transducer array (256 elements), 40 kHz carrier, capable of focusing pressure at 10–25 cm above surface&lt;/li>
&lt;li>&lt;strong>Thermal channel&lt;/strong>: open-top acrylic chamber with 4 heating coils; a low-speed centrifugal fan directs warm air up through the focus zone&lt;/li>
&lt;li>&lt;strong>Temperature control&lt;/strong>: PID loop via Arduino — thermocouple at the focal plane feeds back to heater PWM, ±1°C stability&lt;/li>
&lt;li>&lt;strong>Integration&lt;/strong>: ultrasound focus point and warm airflow column co-aligned within ±5 mm&lt;/li>
&lt;/ul>
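&lt;p>A minimal discrete PID step of the kind used for the heater loop (the gains and clamping below are illustrative, not the system&amp;rsquo;s actual tuning):&lt;/p>

```python
# Textbook discrete PID: thermocouple reading in, normalized heater PWM duty out.
class PID:
    def __init__(self, kp=4.0, ki=0.5, kd=1.0, dt=0.1):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint_c, measured_c):
        err = setpoint_c - measured_c
        self.integral += err * self.dt                 # accumulate error
        deriv = (err - self.prev_err) / self.dt        # finite-difference derivative
        self.prev_err = err
        duty = self.kp * err + self.ki * self.integral + self.kd * deriv
        return max(0.0, min(1.0, duty / 100.0))        # clamp to a 0..1 PWM duty
```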
&lt;h3 id="measured-system-specs">Measured System Specs&lt;/h3>
&lt;table>
&lt;thead>
&lt;tr>
&lt;th>Parameter&lt;/th>
&lt;th>Value&lt;/th>
&lt;/tr>
&lt;/thead>
&lt;tbody>
&lt;tr>
&lt;td>Peak achievable temperature at focal plane&lt;/td>
&lt;td>54.2°C&lt;/td>
&lt;/tr>
&lt;tr>
&lt;td>Ultrasound pressure at focus&lt;/td>
&lt;td>3.43 mN (100 Hz, 12 mm radius)&lt;/td>
&lt;/tr>
&lt;tr>
&lt;td>Temperature stability (mean error)&lt;/td>
&lt;td>0.25% over 10 min&lt;/td>
&lt;/tr>
&lt;tr>
&lt;td>Interaction distance range&lt;/td>
&lt;td>12–22 cm&lt;/td>
&lt;/tr>
&lt;/tbody>
&lt;/table>
&lt;h3 id="unity-vr-integration">Unity VR Integration&lt;/h3>
&lt;ul>
&lt;li>&lt;strong>Unity 2020 LTS&lt;/strong> with SteamVR / OpenVR SDK (HTC Vive)&lt;/li>
&lt;li>Custom C# bridge communicates over USB serial to Arduino controller&lt;/li>
&lt;li>VR scene: virtual campfire with particle system; hand proximity triggers thermal ramp (farther = cooler, closer = warmer) while fire flicker drives vibrotactile modulation at 4–12 Hz&lt;/li>
&lt;li>Thermal latency from Unity event to onset at skin: ~120 ms (dominated by airflow thermal inertia)&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="user-evaluation">User Evaluation&lt;/h2>
&lt;h3 id="threshold-study--wdt-and-hpdt">Threshold Study — WDT and HPDT&lt;/h3>
&lt;ul>
&lt;li>&lt;strong>N = 14 participants&lt;/strong>&lt;/li>
&lt;li>&lt;strong>Protocol&lt;/strong>: method of limits (ascending/descending); 5 trials per direction, 3 interleaved staircases&lt;/li>
&lt;li>&lt;strong>Conditions&lt;/strong>: mid-air thermal only (no ultrasound) vs. mid-air thermal + ultrasound (thermo-tactile)&lt;/li>
&lt;li>&lt;strong>Measures&lt;/strong>: WDT (°C), HPDT (°C), response time to first detection&lt;/li>
&lt;/ul>
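&lt;p>A simplified method-of-limits estimate (ascending series only; the real protocol interleaves ascending and descending staircases as noted above):&lt;/p>

```python
# Sketch: for each ascending series, the transition temperature is the first
# temperature the participant reports feeling warm; the threshold estimate is
# the mean transition across series.

def limits_threshold(series):
    """series: list of (temps, responses) runs; responses are booleans
    (True = 'felt warm'), in ascending temperature order."""
    transitions = []
    for temps, responses in series:
        for t, felt in zip(temps, responses):
            if felt:
                transitions.append(t)
                break
    return sum(transitions) / len(transitions)
```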
&lt;h3 id="haptic-pattern-recognition-study">Haptic Pattern Recognition Study&lt;/h3>
&lt;ul>
&lt;li>&lt;strong>N = 14 participants&lt;/strong> (same cohort, separate session)&lt;/li>
&lt;li>&lt;strong>Task&lt;/strong>: identify 4 spatial haptic patterns (dot, ring, horizontal bar, vertical bar) presented mid-air&lt;/li>
&lt;li>&lt;strong>Conditions&lt;/strong>: non-thermal (room temp) vs. thermal-on (heated airflow active)&lt;/li>
&lt;li>&lt;strong>Measure&lt;/strong>: identification accuracy, confusion matrix&lt;/li>
&lt;/ul>
&lt;h3 id="vr-experience-study">VR Experience Study&lt;/h3>
&lt;ul>
&lt;li>&lt;strong>N = 10 participants&lt;/strong>&lt;/li>
&lt;li>&lt;strong>Task&lt;/strong>: 5-minute campfire scene; ratings on warmth realism, presence, comfort&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="results--key-findings">Results &amp;amp; Key Findings&lt;/h2>
&lt;ul>
&lt;li>&lt;strong>WDT&lt;/strong>: mean 32.8°C (SD=1.12) — consistent with contact-based thermal WDT literature (validates mid-air stimulation as perceptually equivalent)&lt;/li>
&lt;li>&lt;strong>HPDT&lt;/strong>: mean 44.6°C (SD=1.64) — also matches contact norms; no elevated pain threshold from airflow delivery&lt;/li>
&lt;li>&lt;strong>Pattern accuracy&lt;/strong>: 98.1% (non-thermal) vs. &lt;strong>97.2% (thermal)&lt;/strong> — no significant degradation (p=.38); thermal channel does not interfere with tactile perception&lt;/li>
&lt;li>Thermo-tactile condition received significantly higher VR realism ratings than tactile-only (p&amp;lt;.05)&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="lessons--evolution">Lessons &amp;amp; Evolution&lt;/h2>
&lt;p>This project established the &lt;strong>core technical finding&lt;/strong> that underpins the entire MI Lab thermal haptics research line: thermal and tactile cues can coexist mid-air without masking each other, enabling richer multi-modal VR experiences. Every subsequent project (Snow, Fabric Thermal Display, Fiery Hands) built on these baseline thresholds and the dual-channel architecture proven here.&lt;/p>
&lt;hr>
&lt;h2 id="impact">Impact&lt;/h2>
&lt;ul>
&lt;li>📄 Published: &lt;strong>ACM VRST 2021&lt;/strong> — &lt;em>Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology&lt;/em>&lt;/li>
&lt;li>First paper characterizing mid-air thermo-tactile thresholds; foundational reference for the lab&amp;rsquo;s subsequent wearable thermal haptics work&lt;/li>
&lt;/ul></description></item></channel></rss>