A Model-Driven Method for Gesture-Based Interface Requirements Specification
Several software development suites currently include tools for user interface design and implementation (mainly through source-code programming). Some of these tools are multi-platform and multi-style; that is, they allow the specification of devices, e.g. desktop computer, notebook, smartphone, and of user interaction styles, e.g. those based on gestures, voice, or mouse and keyboard. Among these styles, gesture-based interaction is neglected, despite the proliferation of gesture-recognizing devices. Given the variety of human-computer interaction styles currently available, information about these styles must be included in the software requirements specification in order to obtain a complete specification prior to code generation.