Just as we use enactment to support children's ideas and imagination under the larger scope of this research, we are investigating the same concept for adults. While adults don't typically engage in pretend play, they do embody ideas in their gestures, and this project aims to visualize those ideas from gestures and the speech that accompanies them. Imagine that, as you describe a story or idea, a supporting visualization portrays your descriptions for others to see and to collaborate with you on. That is the vision this project aims to achieve, starting with the complex challenge of designing a system that interprets gesture features into meaningful concepts for visualization.
After analyzing image-describing gestures, we built a prototype that visualizes size-describing gestures, as shown below.
Dimension Gesture Prototype
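To illustrate the kind of mapping such a prototype performs, here is a minimal sketch of turning a size-describing gesture into a visualization parameter. All of it is hypothetical: the feature (distance between the hands), the thresholds, and the linear mapping are assumptions for illustration, not the project's actual design.

```python
import math

def hand_span(left_hand, right_hand):
    """Euclidean distance between the two hand positions (e.g. in metres)."""
    return math.dist(left_hand, right_hand)

def size_from_gesture(left_hand, right_hand,
                      min_span=0.05, max_span=1.0,   # illustrative span limits
                      min_px=20, max_px=400):        # illustrative pixel range
    """Map the distance between the hands to an on-screen width in pixels.

    The span is clamped to [min_span, max_span] and interpolated linearly
    into [min_px, max_px]; a real system would calibrate these per user.
    """
    span = hand_span(left_hand, right_hand)
    t = (min(max(span, min_span), max_span) - min_span) / (max_span - min_span)
    return min_px + t * (max_px - min_px)

# Hands half a metre apart -> an object drawn 200 px wide
print(size_from_gesture((0, 0, 0), (0.5, 0, 0)))  # → 200.0
```

In practice the hand positions would come from a skeletal tracker rather than literal coordinates, and size is only one of the gesture features (alongside shape, motion, and placement) that a full system would need to interpret.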
PROJECT TEAM MEMBERS
- Dr. Sharon Lynn Chu (ELX Lab Director)
- Sarah Brown (Ph.D. Student, Computer Information Science & Engineering)
- Grace Nemanic (Undergraduate Student, University of Florida)
- Ranger Chenore (Undergraduate Student, University of Florida)
- Arnav Pangasa (Undergraduate Student, University of Florida)
PUBLICATIONS
- Brown, S. A., Chu, S. L., Quek, F., Canaday, P., Li, Q., Loustau, T., … & Zhang, L. (2019, November). Towards a Gesture-Based Story Authoring System: Design Implications from Feature Analysis of Iconic Gestures During Storytelling. In International Conference on Interactive Digital Storytelling (pp. 364-373). Springer, Cham. (Nominee for Best Short Paper)