Dynamic Projection Mapping Technologies Pioneered by High-Speed Vision

Authors

    Leo Miyashita, Tomohiro Sueishi, Satoshi Tabata, Tomohiko Hayakawa, Masatoshi Ishikawa
    The University of Tokyo, Tokyo 113-8656, Japan; Masatoshi Ishikawa also with Tokyo University of Science, Tokyo 162-8601, Japan

Keywords:

High-speed image processing, Dynamics matching, Vision chips, High-speed 3D measurement, High-speed projectors

Abstract

Dynamic projection mapping (DPM), in which projection mapping is applied to objects that are subject to motion and deformation, has been the subject of much research and has seen a wide variety of developments. This paper points out that projection mapping and DPM are critically different in terms of the required system speed, and introduces high-speed vision technologies such as high-speed vision chips, high-speed projectors and high-speed image processing. Furthermore, the paper describes applied research on DPM, which continues to evolve towards unconstrained, multidimensional representation using these high-speed vision technologies.
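The abstract's claim that projection mapping and DPM differ critically in required system speed can be made concrete with a back-of-the-envelope calculation (an illustrative sketch, not taken from the paper; the speed and latency figures below are assumptions): for a target moving at speed v and an end-to-end latency t, the projected image lags the surface by roughly v * t.

```python
# Illustrative sketch: why DPM demands far lower latency than static
# projection mapping. A target moving at v (m/s) seen through a pipeline
# with end-to-end latency t produces a projection offset of about v * t.

def misalignment_mm(speed_m_per_s: float, latency_ms: float) -> float:
    """Approximate projection offset in millimetres caused by latency.

    1 m/s equals 1 mm/ms, so (m/s) * (ms) gives millimetres directly.
    """
    return speed_m_per_s * latency_ms

# A hand moving at an assumed 1 m/s, viewed through a conventional
# ~100 ms pipeline versus a high-speed ~3 ms pipeline:
print(misalignment_mm(1.0, 100.0))  # -> 100.0 (10 cm: clearly visible)
print(misalignment_mm(1.0, 3.0))    # -> 3.0 (3 mm: barely perceptible)
```

The two orders of magnitude between these offsets are why DPM systems are built around millisecond-order vision chips, projectors, and image processing rather than conventional 30 fps pipelines.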

Published

2022-12-31