
ECE Research Seminar: Holoscopic Micro 3D Gesture Database for Wearable Device Interaction

With the rapid development of augmented reality (AR) and virtual reality (VR) technology, human-computer interaction (HCI) has advanced considerably, particularly for gaming and AR/VR control. Finger micro-gestures are a major research focus owing to the growth of the Internet of Things (IoT) and wearable technologies, and Google has recently developed a radar-based micro-gesture sensor, Google Soli. A number of finger micro-gesture techniques have also been developed using Time-of-Flight (ToF) imaging sensors for wearable 3D glasses such as Atheer mobile glasses. Holoscopic 3D (H3D) imaging mimics the fly's-eye technique, capturing a true 3D optical model of the scene using a microlens array; however, progress on holoscopic 3D systems has been limited by the lack of a high-quality, publicly available database. In this work, a holoscopic 3D camera is used to capture high-quality holoscopic 3D micro-gesture video, and a new holoscopic 3D micro-gesture (HoMG) database is produced. The HoMG database contains image sequences of three conventional gestures recorded from 40 participants under different settings and conditions. For H3D micro-gesture recognition, HoMG provides a video subset of 960 videos and a still-image subset of 30,635 images. Initial micro-gesture recognition experiments on both subsets, using traditional 2D image and video features with popular classifiers, have achieved encouraging performance. The database will be made available to the research community to accelerate research on holoscopic 3D micro-gestures. Yi Liu is supervised by Dr Rafiq Swash.
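To make the recognition pipeline mentioned above concrete, the following is a minimal sketch of how still-image micro-gesture frames could be classified with a hand-crafted 2D feature and a conventional classifier. The specific feature (HOG), classifier (linear SVM), and directory layout are illustrative assumptions, not the exact components reported in the work.

```python
# Minimal sketch: hand-crafted 2D features fed to a conventional classifier.
# HOG + linear SVM and the dataset layout are assumptions for illustration only.
from pathlib import Path

import numpy as np
from skimage.feature import hog
from skimage.io import imread
from skimage.transform import resize
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC


def extract_features(image_path, size=(128, 128)):
    """Load a frame, resize it, and compute a HOG descriptor."""
    image = imread(image_path, as_gray=True)
    image = resize(image, size, anti_aliasing=True)
    return hog(image, orientations=9, pixels_per_cell=(16, 16),
               cells_per_block=(2, 2))


def load_dataset(root):
    """Assumes a hypothetical layout: <root>/<gesture_label>/<frame>.png."""
    features, labels = [], []
    for label_dir in sorted(Path(root).iterdir()):
        if not label_dir.is_dir():
            continue
        for frame in label_dir.glob("*.png"):
            features.append(extract_features(frame))
            labels.append(label_dir.name)
    return np.array(features), np.array(labels)


if __name__ == "__main__":
    X, y = load_dataset("HoMG/still_images")  # hypothetical path
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0, stratify=y)
    clf = LinearSVC(C=1.0).fit(X_train, y_train)
    print("Test accuracy:", clf.score(X_test, y_test))
```

The same pattern extends to the video subset by pooling per-frame descriptors (for example, averaging them over a clip) before classification.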

Presented by

Yi Liu

Key learning outcomes

Understand the implications of holoscopic capture for human-computer interfaces.
