[vimeo http://www.vimeo.com/7338692 w=450&h=360]
The above video shows hands with and without a book in them. Books have a defined user experience: there is a consistent and uniform way to open and ‘read’ them, and they have a particular structure that encourages certain familiar gestures – e.g. flicking through to the index or chapter headings.
Gestures, therefore, meet the three tests for a semiotic domain (implicitly) laid down by Gee (2003:23):
- They allow us to experience (see, feel and operate on) the world in new ways.
- Such gestures are shared with groups of people who carry them on as distinctive social practices.
- They allow us to gain resources that prepare us for future learning and problem-solving in the domain (and perhaps related domains).
If gestures are semiotic domains, explains Gee, then they must have ‘design grammars’:
> Each domain has an internal and an external design grammar. By an internal design grammar, I mean the principles and patterns in terms of which one can recognize what is and what is not acceptable or typical in a semiotic domain. By an external design grammar, I mean the principles and patterns in terms of which one can recognize what is and what is not an acceptable or typical social practice and identity in regard to the affinity group associated with a semiotic domain. (p.30)
The concept of ‘gestural literacy’, therefore, would seem to be something of a misnomer. Gesture may be a semiotic domain and have both internal and external design grammars, but to consider it separately from the act of reading would be to miss the point. We need holistic views of literacy that take into account embodied cognition.
Gee, J.P. (2003) *What Video Games Have to Teach Us About Learning and Literacy*. New York: Palgrave Macmillan.