MultiModalHandGesture_dataset

Multi-Modal Hand Gesture Dataset for Hand Gesture Recognition

Description


This dataset was created to validate a hand-gesture recognition system for Human-Machine Interaction (HMI). It comprises 15 different hand gestures (4 dynamic and 11 static), built from 16 different hand-poses, acquired with the Leap Motion device. The gestures were performed by 25 different subjects (8 women and 17 men), with 20 instances (repetitions) of every gesture per subject, performed at different locations within the image. For each instance, both infrared images and skeleton information were collected.

Dynamic gestures:

This set contains 4 hand gestures, each composed of two different hand-poses:

  • A: go down
  • B: go left
  • C: go right
  • D: go up

Hand-poses:

This set contains 16 hand-poses, used in both the static and the dynamic hand gestures:

  • A: L
  • B: fist moved
  • C: index
  • D: ok
  • E: C
  • F: heavy
  • G: hang
  • H: two
  • I: three
  • J: four
  • K: five
  • L: palm
  • M: down
  • N: palm moved
  • O: palm up
  • P: up
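
The letter codes above can be transcribed directly into label maps, which is handy when converting the dataset's class letters into numeric indices for training. This is a minimal sketch; only the letter-to-name pairs come from the lists above, while the index scheme is an assumption:

```python
# Label maps transcribed from the lists in this document.
# The letter codes (A-D, A-P) are the dataset's own naming.
DYNAMIC_GESTURES = {
    "A": "go down",
    "B": "go left",
    "C": "go right",
    "D": "go up",
}

HAND_POSES = {
    "A": "L",
    "B": "fist moved",
    "C": "index",
    "D": "ok",
    "E": "C",
    "F": "heavy",
    "G": "hang",
    "H": "two",
    "I": "three",
    "J": "four",
    "K": "five",
    "L": "palm",
    "M": "down",
    "N": "palm moved",
    "O": "palm up",
    "P": "up",
}

# Assumed convention: assign numeric class indices in alphabetical order
# of the letter codes (A -> 0, ..., P -> 15).
POSE_INDEX = {letter: i for i, letter in enumerate(sorted(HAND_POSES))}
```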


Dataset structure:

The dataset is organized into two top-level folders, one for each type of information:

  • infrared imagery
  • skeleton information
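
A simple way to enumerate the files of one modality is to walk the corresponding top-level folder. In this sketch, only the two folder names come from the description above; any deeper layout (subject/gesture/instance subfolders) is an assumption and may need adjusting to the actual download:

```python
from pathlib import Path

def list_instances(root, modality="infrared imagery"):
    """Yield file paths for one modality of the dataset.

    'infrared imagery' and 'skeleton information' are the two top-level
    folders named in the dataset description; the subfolder structure
    beneath them is not specified here, so we recurse over everything.
    """
    modality_dir = Path(root) / modality
    yield from sorted(p for p in modality_dir.rglob("*") if p.is_file())
```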

Download


Citation


T. Mantecón, C.R. del Blanco, F. Jaureguizar, N. García, “A real-time gesture recognition system using near-infrared imagery”, PLOS ONE, vol. 14, no. 10, article e0223320, pp. 1-17, Oct. 2019.

For questions about this dataset, please contact the authors.