nimble: mobile interface for a visual question answering augmented by gestures
Published 4 years ago • 394 plays • Length 0:30
Similar videos
- 0:31 · lbw330: a comparison of surface and motion user-defined gestures for mobile augmented reality
- 1:43 · helpviz: automatic generation of contextual visual mobile tutorials from text-based instructions
- 7:47 · helpviz: automatic generation of contextual visual mobile tutorials from text-based instructions
- 0:27 · kinemic wave: a mobile freehand gesture and text-entry system
- 8:08 · handycast: phone-based bimanual input for virtual reality in mobile and space-constrained setting...
- 0:52 · chi14 wip 10308: using 3d hand gestures and touch input for wearable ar interaction
- 11:28 · ultimate tutorial: projection mapping for immersive 3d interactive dining tables in restaurants
- 17:53 · magic mifare in a ring (and how to enable magic wake-up on a gen4 credential)
- 6:18 · invoke 5.3 introduces a new select object tool and flux support for global reference images
- 0:31 · handycast: phone-based bimanual input for virtual reality in mobile and space-constrained setting...
- 0:31 · [preview] demonstrating taptype for mobile ten-finger text entry anywhere
- 0:31 · counterpoint: exploring mixed-scale gesture interaction for ar applications
- 49:16 · nimble 101: general mobile overview
- 55:21 · ai tech talk from emza visual sense & alif: object detection with arm's ethos-u55