<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Mid-Air | Haokun Wang</title><link>https://wanghaokun.site/tags/mid-air/</link><atom:link href="https://wanghaokun.site/tags/mid-air/index.xml" rel="self" type="application/rss+xml"/><description>Mid-Air</description><generator>Hugo Blox Builder (https://hugoblox.com)</generator><language>en-us</language><lastBuildDate>Wed, 15 May 2024 00:00:00 +0000</lastBuildDate><image><url>https://wanghaokun.site/media/icon_hu_645fa481986063ef.png</url><title>Mid-Air</title><link>https://wanghaokun.site/tags/mid-air/</link></image><item><title>Let It Snow: Cross-Modal Cold &amp; Touch for VR Snowfall</title><link>https://wanghaokun.site/project/let-it-snow/</link><pubDate>Wed, 15 May 2024 00:00:00 +0000</pubDate><guid>https://wanghaokun.site/project/let-it-snow/</guid><description>&lt;h2 id="overview">Overview&lt;/h2>
&lt;p>&lt;strong>Let It Snow&lt;/strong> is a hands-free, wearable-free haptic experience: users hold their bare hands over a custom mid-air display that simultaneously fires focused ultrasound pressure points and directed cold airflow to simulate individual snowflakes landing — or raindrops splattering — on their palms.&lt;/p>
&lt;p>Published in &lt;strong>ACM IMWUT 2024&lt;/strong> (Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies), the project explores how cross-modal cold–tactile pairing creates emergent sensory illusions greater than either cue alone.&lt;/p>
&lt;hr>
&lt;h2 id="the-problem">The Problem&lt;/h2>
&lt;p>Simulating precipitation in VR is a classic immersion gap. Visually, snow and rain can look photorealistic. But without &lt;em>feeling&lt;/em> the cold, the wet, the gentle impact — users never quite believe it. Existing approaches require worn devices, which break the &amp;ldquo;bare hand in the weather&amp;rdquo; fantasy entirely.&lt;/p>
&lt;p>&lt;strong>Core Questions:&lt;/strong>&lt;/p>
&lt;ul>
&lt;li>Can cold airflow and ultrasound pressure co-localize in mid-air to synthesize a snowflake or raindrop percept?&lt;/li>
&lt;li>Do cold and tactile cues mask each other, or can they be independently perceived at the same skin location?&lt;/li>
&lt;li>How should aggregated stimuli be rendered for heavy snowfall / rainfall?&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="research-approach">Research Approach&lt;/h2>
&lt;p>We drew on &lt;strong>cross-modal sensory integration&lt;/strong> theory: cold and tactile channels are processed by separate neural pathways (thermoreceptors vs. mechanoreceptors), so two signals can coexist without mutual interference — unlike, say, two sounds at the same frequency.&lt;/p>
&lt;p>Key hypothesis: a brief cold puff + simultaneous pressure focus = snowflake percept; a sharp cold burst + faster pressure = raindrop percept.&lt;/p>
&lt;p>We also designed an &lt;strong>aggregated haptic scheme&lt;/strong> for particle-dense scenes: rather than rendering every particle individually (physically impossible), we modulate cold intensity and pressure density proportionally to particle count, exploiting temporal summation in both sensory channels.&lt;/p>
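&lt;p>As a minimal sketch of this idea (the constants and names below are illustrative assumptions, not the paper&amp;rsquo;s actual values):&lt;/p>
&lt;pre>&lt;code class="language-csharp">// Sketch of the aggregated-rendering transfer function: map the number
// of particles striking the palm this frame to normalized cold intensity
// and ultrasound amplitude. All constants here are hypothetical.
using UnityEngine;

public static class AggregatedTransfer
{
    const float MaxParticlesPerFrame = 30f; // assumed saturation point

    // Returns (coldIntensity, ultrasoundAmplitude), each in 0..1.
    public static Vector2 Evaluate(int particleCount)
    {
        // A compressive curve exploits temporal summation: perception
        // integrates stimuli over time, so output need not clip early.
        float x = Mathf.Clamp01(particleCount / MaxParticlesPerFrame);
        float cold = Mathf.Sqrt(x);  // cold channel ramps up early
        float amplitude = x;         // pressure density scales linearly
        return new Vector2(cold, amplitude);
    }
}&lt;/code>&lt;/pre>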
&lt;hr>
&lt;h2 id="system-design">System Design&lt;/h2>
&lt;h3 id="hardware">Hardware&lt;/h3>
&lt;ul>
&lt;li>&lt;strong>Cold array&lt;/strong>: 6 Peltier modules (20 × 20 mm) mounted in a ring, each with a micro-fan to direct cold air toward the focus point; temperature range: 5°C–15°C below ambient&lt;/li>
&lt;li>&lt;strong>Ultrasound haptic display&lt;/strong>: Ultrahaptics STRATOS Inspire — 256 transducers at 40 kHz, creating mid-air pressure foci exerting up to 200 mN of force at distances up to 22 cm&lt;/li>
&lt;li>&lt;strong>Depth tracking&lt;/strong>: Intel RealSense D435 hand tracking, integrated into Unity for palm position → focus point mapping&lt;/li>
&lt;li>&lt;strong>Control PC&lt;/strong>: Custom C++ driver for thermal timing; Unity handles audio, visuals, and hand tracking&lt;/li>
&lt;/ul>
&lt;h3 id="unity-vr-integration">Unity VR Integration&lt;/h3>
&lt;ul>
&lt;li>Built in &lt;strong>Unity 2021 LTS&lt;/strong>, standalone VR scene with Oculus Integration SDK&lt;/li>
&lt;li>Particle system drives two managers:
&lt;ul>
&lt;li>&lt;code>SnowRenderer&lt;/code>: handles visual particles with collision callbacks to trigger haptic events&lt;/li>
&lt;li>&lt;code>HapticAggregator&lt;/code>: accumulates per-frame particle counts, applies a transfer function to Peltier intensity and ultrasound amplitude (see the sketch after this list)&lt;/li>
&lt;/ul>
&lt;/li>
&lt;li>Snowflake percept: 150 ms cold puff + 40 Hz pressure burst; raindrop percept: 60 ms sharp cold burst + 200 Hz single pulse&lt;/li>
&lt;li>Scene contains interactive environments: snowy mountain valley, rainstorm on a city rooftop&lt;/li>
&lt;/ul>
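&lt;p>A hypothetical sketch of how the two managers could be wired in Unity (driver calls are stubbed out, since the project&amp;rsquo;s C++ driver and STRATOS SDK bindings are not public):&lt;/p>
&lt;pre>&lt;code class="language-csharp">using UnityEngine;

public class HapticAggregator : MonoBehaviour
{
    int hitsThisFrame;

    public void ReportHit() { hitsThisFrame++; }

    void LateUpdate()
    {
        // Transfer function from the earlier sketch (names are assumptions).
        Vector2 drive = AggregatedTransfer.Evaluate(hitsThisFrame);
        // Stand-ins for the real Peltier driver and ultrasound SDK calls:
        // SetPeltierIntensity(drive.x);
        // SetUltrasoundAmplitude(drive.y);
        hitsThisFrame = 0;
    }
}

public class SnowRenderer : MonoBehaviour
{
    public HapticAggregator aggregator;

    // Unity invokes this when particles with collision messages enabled
    // strike a collider (here, the tracked palm).
    void OnParticleCollision(GameObject other)
    {
        aggregator.ReportHit();
    }
}&lt;/code>&lt;/pre>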
&lt;hr>
&lt;h2 id="user-evaluation">User Evaluation&lt;/h2>
&lt;h3 id="perceptual-study--cold--tactile-independence">Perceptual Study — Cold × Tactile Independence&lt;/h3>
&lt;ul>
&lt;li>&lt;strong>N = 14 participants&lt;/strong>&lt;/li>
&lt;li>&lt;strong>Design&lt;/strong>: 2 (cold present/absent) × 2 (tactile present/absent) × 5 repetitions&lt;/li>
&lt;li>&lt;strong>Measure&lt;/strong>: detection accuracy per modality and a self-reported interference rating&lt;/li>
&lt;li>&lt;strong>Finding&lt;/strong>: No significant cross-modal masking — participants detected cold and tactile cues independently (d&amp;rsquo; &amp;gt; 2.5 for both modalities; see the note after this list)&lt;/li>
&lt;/ul>
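&lt;p>Here d&amp;rsquo; is the standard signal-detection sensitivity index, computed as the z-transformed hit rate minus the z-transformed false-alarm rate; values above 2 correspond to high discriminability, so each channel remained clearly detectable in the other&amp;rsquo;s presence.&lt;/p>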
&lt;h3 id="experience-study--aggregated-rendering-comparison">Experience Study — Aggregated Rendering Comparison&lt;/h3>
&lt;ul>
&lt;li>&lt;strong>N = 20 participants&lt;/strong>, within-subject&lt;/li>
&lt;li>&lt;strong>Conditions&lt;/strong>: (1) no haptics, (2) tactile-only, (3) cold-only, (4) Snow (cold+tactile sparse), (5) Snow (cold+tactile aggregated)&lt;/li>
&lt;li>&lt;strong>Measures&lt;/strong>: presence subscale (IPQ), realism rating, preference ranking&lt;/li>
&lt;li>&lt;strong>Task&lt;/strong>: 3-minute free exploration of snowy mountain scene, 3-minute rainstorm scene&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="results--key-findings">Results &amp;amp; Key Findings&lt;/h2>
&lt;ul>
&lt;li>&lt;strong>Aggregated scheme rated significantly more realistic&lt;/strong> than sparse individual-particle scheme (p&amp;lt;.01) for heavy snowfall&lt;/li>
&lt;li>Cold+tactile combination rated &lt;strong>+1.8 points&lt;/strong> on 7-pt presence scale vs. tactile-only (p&amp;lt;.001)&lt;/li>
&lt;li>18/20 participants preferred the full cross-modal condition; primary qualitative theme: &amp;ldquo;it actually felt cold and real, like being outside&amp;rdquo;&lt;/li>
&lt;li>System held cold delivery stable to within ±0.3°C across a 10-minute continuous session&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="impact">Impact&lt;/h2>
&lt;ul>
&lt;li>📄 Published: &lt;strong>ACM IMWUT 2024&lt;/strong> — &lt;em>Proc. ACM Interact. Mob. Wearable Ubiquitous Technol.&lt;/em>&lt;/li>
&lt;li>DOI:&lt;/li>
&lt;li>Framework for aggregated haptic rendering has been adopted in follow-on multi-particle VR haptics research&lt;/li>
&lt;/ul></description></item><item><title>Mid-Air Thermo-Tactile Fire: Ultrasound Haptic Display for VR</title><link>https://wanghaokun.site/project/mid-air-fire-haptics/</link><pubDate>Wed, 01 Sep 2021 00:00:00 +0000</pubDate><guid>https://wanghaokun.site/project/mid-air-fire-haptics/</guid><description>&lt;h2 id="overview">Overview&lt;/h2>
&lt;p>Imagine reaching toward a virtual campfire and actually feeling the heat wash over your hands — no gloves, no controllers, nothing on your skin. &lt;strong>Mid-Air Thermo-Tactile Fire&lt;/strong> is a proof-of-concept system that delivers both thermal warmth and vibrotactile pressure to a free hand hovering above a custom device, using a combination of heated airflow channels and a 40 kHz ultrasound haptic array.&lt;/p>
&lt;p>Published at &lt;strong>ACM VRST 2021&lt;/strong> (ACM Symposium on Virtual Reality Software and Technology), this was the first system to characterize thresholds for simultaneous mid-air thermo-tactile feedback and to demonstrate them in a real-time VR fire interaction scenario.&lt;/p>
&lt;hr>
&lt;h2 id="the-problem">The Problem&lt;/h2>
&lt;p>Mid-air haptics (ultrasound) had proven that focused pressure can be delivered without contact. Thermal mid-air feedback existed in industrial settings (heat lamps). But &lt;strong>simultaneously combining both&lt;/strong> — localized, controllable, synchronized — for real-time VR had not been demonstrated.&lt;/p>
&lt;p>Key unknowns at project start:&lt;/p>
&lt;ul>
&lt;li>What temperature range can be achieved mid-air at realistic interaction distances (15–25 cm)?&lt;/li>
&lt;li>Does the ultrasonic pressure signal interfere with thermal perception (or vice versa)?&lt;/li>
&lt;li>What warm detection threshold (WDT) and heat pain detection threshold (HPDT) apply to mid-air vs. contact thermal stimulation?&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="system-design">System Design&lt;/h2>
&lt;h3 id="hardware-architecture">Hardware Architecture&lt;/h3>
&lt;ul>
&lt;li>&lt;strong>Ultrasound display&lt;/strong>: 16×16 transducer array (256 elements), 40 kHz carrier, capable of focusing pressure at 10–25 cm above surface&lt;/li>
&lt;li>&lt;strong>Thermal channel&lt;/strong>: open-top acrylic chamber with 4 heating coils; a low-speed centrifugal fan directs warm air up through the focus zone&lt;/li>
&lt;li>&lt;strong>Temperature control&lt;/strong>: PID loop via Arduino — a thermocouple at the focal plane feeds back to the heater PWM, holding ±1°C stability (see the sketch after this list)&lt;/li>
&lt;li>&lt;strong>Integration&lt;/strong>: ultrasound focus point and warm airflow column co-aligned within ±5 mm&lt;/li>
&lt;/ul>
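&lt;p>The firmware itself is not published; the following is a minimal sketch of the PID step described above, written in C# for consistency with the Unity-side code (the real loop runs as Arduino firmware), with purely hypothetical gains:&lt;/p>
&lt;pre>&lt;code class="language-csharp">using UnityEngine;

public class HeaterPid
{
    const float Kp = 8f, Ki = 0.5f, Kd = 1f; // hypothetical gains
    float integral, lastError;

    // Called at a fixed rate with the thermocouple reading; returns a
    // PWM duty cycle in [0, 255] for the heating coils.
    public int Step(float targetC, float measuredC, float dtSeconds)
    {
        float error = targetC - measuredC;
        integral += error * dtSeconds;
        float derivative = (error - lastError) / dtSeconds;
        lastError = error;
        float duty = Kp * error + Ki * integral + Kd * derivative;
        return (int)Mathf.Clamp(duty, 0f, 255f);
    }
}&lt;/code>&lt;/pre>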
&lt;h3 id="measured-system-specs">Measured System Specs&lt;/h3>
&lt;table>
&lt;thead>
&lt;tr>
&lt;th>Parameter&lt;/th>
&lt;th>Value&lt;/th>
&lt;/tr>
&lt;/thead>
&lt;tbody>
&lt;tr>
&lt;td>Peak achievable temperature at focal plane&lt;/td>
&lt;td>54.2°C&lt;/td>
&lt;/tr>
&lt;tr>
&lt;td>Ultrasound pressure at focus&lt;/td>
&lt;td>3.43 mN (100 Hz, 12 mm radius)&lt;/td>
&lt;/tr>
&lt;tr>
&lt;td>Temperature stability (mean error)&lt;/td>
&lt;td>0.25% over 10 min&lt;/td>
&lt;/tr>
&lt;tr>
&lt;td>Interaction distance range&lt;/td>
&lt;td>12–22 cm&lt;/td>
&lt;/tr>
&lt;/tbody>
&lt;/table>
&lt;h3 id="unity-vr-integration">Unity VR Integration&lt;/h3>
&lt;ul>
&lt;li>&lt;strong>Unity 2020 LTS&lt;/strong> with SteamVR / OpenVR SDK (HTC Vive)&lt;/li>
&lt;li>Custom C# bridge communicates over USB serial to Arduino controller&lt;/li>
&lt;li>VR scene: virtual campfire with a particle system; hand proximity triggers a thermal ramp (farther = cooler, closer = warmer) while fire flicker drives vibrotactile modulation at 4–12 Hz (see the sketch after this list)&lt;/li>
&lt;li>Thermal latency from Unity event to onset at skin: ~120 ms (dominated by airflow thermal inertia)&lt;/li>
&lt;/ul>
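&lt;p>A hypothetical sketch of the proximity-to-warmth mapping and the serial bridge described above; the port name, message format, and temperature endpoints are assumptions, not values from the paper:&lt;/p>
&lt;pre>&lt;code class="language-csharp">using System.IO.Ports;
using UnityEngine;

public class CampfireThermalRamp : MonoBehaviour
{
    public Transform hand, fireFocus;
    SerialPort arduino;

    void Start()
    {
        arduino = new SerialPort("COM3", 115200); // assumed port and baud
        arduino.Open();
    }

    void Update()
    {
        // Closer to the focus zone = warmer PID setpoint, mapped across
        // the 12–22 cm interaction range reported above. The 42 °C upper
        // endpoint stays below the measured 44.6 °C pain threshold.
        float d = Vector3.Distance(hand.position, fireFocus.position);
        float t = Mathf.Lerp(42f, 30f, Mathf.InverseLerp(0.12f, 0.22f, d));
        arduino.WriteLine("T" + t.ToString("F1")); // e.g. "T38.5"
    }

    void OnDestroy() { if (arduino != null) arduino.Close(); }
}&lt;/code>&lt;/pre>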
&lt;hr>
&lt;h2 id="user-evaluation">User Evaluation&lt;/h2>
&lt;h3 id="threshold-study--wdt-and-hpdt">Threshold Study — WDT and HPDT&lt;/h3>
&lt;ul>
&lt;li>&lt;strong>N = 14 participants&lt;/strong>&lt;/li>
&lt;li>&lt;strong>Protocol&lt;/strong>: method of limits (ascending/descending); 5 trials per direction, 3 interleaved staircases&lt;/li>
&lt;li>&lt;strong>Conditions&lt;/strong>: mid-air thermal only (no ultrasound) vs. mid-air thermal + ultrasound (thermo-tactile)&lt;/li>
&lt;li>&lt;strong>Measures&lt;/strong>: WDT (°C), HPDT (°C), response time to first detection&lt;/li>
&lt;/ul>
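&lt;p>In the method of limits, ascending runs increase the stimulus temperature from a neutral baseline until the participant first reports warmth (or pain, for HPDT), while descending runs start above threshold and decrease until the sensation disappears; the threshold estimate is the mean of these transition points.&lt;/p>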
&lt;h3 id="haptic-pattern-recognition-study">Haptic Pattern Recognition Study&lt;/h3>
&lt;ul>
&lt;li>&lt;strong>N = 14 participants&lt;/strong> (same cohort, separate session)&lt;/li>
&lt;li>&lt;strong>Task&lt;/strong>: identify 4 spatial haptic patterns (dot, ring, horizontal bar, vertical bar) presented mid-air (see the sketch after this list)&lt;/li>
&lt;li>&lt;strong>Conditions&lt;/strong>: non-thermal (room temp) vs. thermal-on (heated airflow active)&lt;/li>
&lt;li>&lt;strong>Measure&lt;/strong>: identification accuracy, confusion matrix&lt;/li>
&lt;/ul>
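&lt;p>The paper&amp;rsquo;s rendering code is not shown here; mid-air shapes are commonly drawn by sweeping a single focal point along the outline fast enough to be felt as a continuous shape (spatiotemporal modulation). A hypothetical sketch of the focal-point paths for the four patterns:&lt;/p>
&lt;pre>&lt;code class="language-csharp">using UnityEngine;

public static class PatternPaths
{
    // Returns the focal-point position (meters, palm-local) at phase
    // t in [0, 1); dimensions here are illustrative assumptions.
    public static Vector2 Sample(string pattern, float t)
    {
        switch (pattern)
        {
            case "dot":
                return Vector2.zero;                 // single fixed focus
            case "ring":
                float a = 2f * Mathf.PI * t;         // circular sweep
                return new Vector2(0.012f * Mathf.Cos(a), 0.012f * Mathf.Sin(a));
            case "hbar":
                return new Vector2(Mathf.Lerp(-0.02f, 0.02f, t), 0f);
            default:                                 // vertical bar
                return new Vector2(0f, Mathf.Lerp(-0.02f, 0.02f, t));
        }
    }
}&lt;/code>&lt;/pre>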
&lt;h3 id="vr-experience-study">VR Experience Study&lt;/h3>
&lt;ul>
&lt;li>&lt;strong>N = 10 participants&lt;/strong>&lt;/li>
&lt;li>&lt;strong>Task&lt;/strong>: 5-minute campfire scene; ratings on warmth realism, presence, comfort&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="results--key-findings">Results &amp;amp; Key Findings&lt;/h2>
&lt;ul>
&lt;li>&lt;strong>WDT&lt;/strong>: mean 32.8°C (SD=1.12) — consistent with contact-based thermal WDT literature, indicating mid-air stimulation is perceptually comparable to contact stimulation&lt;/li>
&lt;li>&lt;strong>HPDT&lt;/strong>: mean 44.6°C (SD=1.64) — also matches contact norms; no elevated pain threshold from airflow delivery&lt;/li>
&lt;li>&lt;strong>Pattern accuracy&lt;/strong>: 98.1% (non-thermal) vs. &lt;strong>97.2% (thermal)&lt;/strong> — no significant degradation (p=.38); thermal channel does not interfere with tactile perception&lt;/li>
&lt;li>Thermo-tactile condition received significantly higher VR realism ratings than tactile-only (p&amp;lt;.05)&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h2 id="lessons--evolution">Lessons &amp;amp; Evolution&lt;/h2>
&lt;p>This project established the &lt;strong>core technical finding&lt;/strong> that underpins the entire MI Lab thermal haptics research line: thermal and tactile cues can coexist mid-air without masking each other, enabling richer multi-modal VR experiences. Every subsequent project (Snow, Fabric Thermal Display, Fiery Hands) built on these baseline thresholds and the dual-channel architecture proven here.&lt;/p>
&lt;hr>
&lt;h2 id="impact">Impact&lt;/h2>
&lt;ul>
&lt;li>📄 Published: &lt;strong>ACM VRST 2021&lt;/strong> — &lt;em>Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology&lt;/em>&lt;/li>
&lt;li>DOI:&lt;/li>
&lt;li>First paper characterizing mid-air thermo-tactile thresholds; foundational reference for the lab&amp;rsquo;s subsequent wearable thermal haptics work&lt;/li>
&lt;/ul></description></item></channel></rss>