4P-07
Dynamic FOV and Motion Parallax on 2D Displays by Temporal Head Pose Estimation
○張 哲銘, 小池崇文 (Hosei Univ.)
Realizing immersive motion parallax on standard 2D displays with a single RGB camera requires high-precision, low-latency head tracking. Conventional geometric methods suffer from jitter, while smoothing filters introduce latency that breaks the sense of interaction. To address this trade-off, we propose a Motion-Guided Cascade GRU Network, trained on a custom Blender-generated synthetic dataset optimized for seated scenarios. By explicitly modeling motion probability, our architecture mitigates drift while minimizing latency. Experimental results demonstrate superior stability (SD ≈ 0.005) and accuracy (MAE ≈ 6.9°). Finally, a Unity implementation demonstrates real-time Dynamic FOV rendering, validating the practical viability of our approach for glasses-free 3D viewing.