Intentional and attentional dynamics of speech–hand coordination

Paul Treffner *, Mira Peter

Complex Active Visualization Laboratory, School of Information Technology, Griffith University, Gold Coast, Australia

Abstract

Interest is rapidly growing in the hypothesis that natural language emerged from a more primitive set of linguistic acts based primarily on manual activity and hand gestures. Increasingly, researchers are investigating how hemispheric asymmetries are related to attentional and manual asymmetries (i.e., handedness). Both speech perception and production have origins in the dynamical generative movements of the vocal tract known as articulatory gestures. Thus, the notion of a "gesture" can be extended to both hand movements and speech articulation. The generative actions of the hands and vocal tract can therefore provide a basis for the (direct) perception of linguistic acts. Such gestures are best described using the methods of dynamical systems analysis, since both perception and production can be described using the same commensurate language. Experiments were conducted using a phase transition paradigm to examine the coordination of speech–hand gestures in both left- and right-handed individuals. Results address coordination (in-phase vs. anti-phase), hand (left vs. right), lateralization (left vs. right hemisphere), focus of attention (speech vs. tapping), and how dynamical constraints provide a foundation for human communicative acts. Predictions from the asymmetric HKB equation confirm the attentional basis of functional asymmetry. Of significance is a new understanding of the role of perceived synchrony (p-centres) during intentional cases of gestural coordination.

© 2003 Elsevier Science B.V. All rights reserved.

PsycINFO classification: 2330; 2720; 2340

Keywords: Coordination; Dynamics; Speech; Hand; Gestures; Laterality; Attention

* Corresponding author. E-mail address: (P. Treffner).

0167-9457/02/$ - see front matter © 2003 Elsevier Science B.V. All rights reserved.
doi:10.1016/S0167-9457(02)00178-1
Human Movement Science 21 (2002) 641–697

1. Introduction

The last decade has witnessed a dramatic rise in research investigating the common neural and functional basis for speech perception, speech production, and manual gestures. Indeed, interest is rapidly growing in the hypothesis that natural language emerged from a more primitive set of linguistic acts based primarily on manual gestures (Corballis, 1998, 2002; Gallese, Fadiga, Fogassi, & Rizzolatti, 1996; Goldin-Meadow, 1999; Iverson & Thelen, 1999; McNeill, 2000; Noble & Davidson, 2001; Place, 2000; Rizzolatti & Arbib, 1998). Further, it is now recognized that both speech perception and production have origins in dynamical generative movements known as articulatory gestures. Thus, in articulatory phonology, the notion of a gesture is broader than its use for describing hand movements. Although traditional linguistics assumes that phonological symbols are represented in a static manner, the units of speech, at all levels from phonemes to sentences, are temporally continuous and cannot be captured in symbol-based representations (Gonzales, French, & Treffner, 1990; Port, Cummins, & Gasser, 1995; Saltzman, 1992; Schmidt, Treffner, & Turvey, 1991; Treffner, 1997). The view that generative articulatory gestures drive the perception and production of speech is closely related to the dynamical systems-based modelling approach known as task dynamics and was pioneered by researchers at Haskins Laboratories (New Haven, USA) (e.g., Browman & Goldstein, 1986, 1990; Kelso, Saltzman, & Tuller, 1986a,b; Kelso, Tuller, Vatikiotis-Bateson, & Fowler, 1984; Tuller & Kelso, 1984).
This perspective shows that the primitives of articulatory phonology are fully-fledged "gestures" of the vocal tract (sequences of articulatory openings and closures), rather than phonetic features. Hence, it is the dynamical movement patterns composing the task dynamics that specify the significant elements of speech (Saltzman, 1992; Saltzman & Byrd, 2000), and as such, speech may be thought of as another instance of a complex dynamical system (Kelso, 1995; Kugler, Kelso, & Turvey, 1980; Murray, 1990; Port & van Gelder, 1995; Turvey & Carello, 1995). From this perspective, studying gestural coordination is tantamount to examining the relation between the dynamical structure of gestural phonology and the dynamical structure of manual gestures.

However, issues remain as to how the classic problem of coarticulation in speech production is related to the "coarticulation" during speech–hand coordination. To address this issue, a dynamical systems approach to gestural coordination has recently been called for (Goldin-Meadow, 1999; Iverson & Thelen, 1999; McNeill, 2000). The strength of this approach is that it can provide insight into the temporal evolution of speech–hand coordination and its basis for language. Recent research suggests a common neural basis for perception–action coupling. It has been observed that Broca's area, although traditionally thought of as being the primary speech production centre for both overt and silent speech (Huang, Carr, & Cao, 2001), is also involved in speech perception (Price et al., 1996). Similarly, recent findings reveal that the neural systems supporting speech perception and production partially overlap in the left superior temporal lobe (Hickok, 2001; Hickok & Poeppel, 2000). Likewise, in speakers of American Sign Language (ASL), Broca's area becomes activated while seeing ASL gestures (Hickok, Bellugi, & Klima, 1998; Neville et al., 1998). These findings entail common neural support for the perception and production of linguistic gestures.

But how do speech and hand gestures interact? Recently it has been reported that stimulation of a site in the primary motor cortex of monkeys produced mouth opening and also caused the fingers to clench into a grip and move to the mouth (Graziano, Taylor, & Moore, 2002). It has also been shown that Broca's area for speech production is activated by non-linguistic hand movements (Gallese et al., 1996), and in aphasics picture perception and naming are improved by simultaneous hand movements (Hanlon, Brown, & Gerstman, 1990). The preceding implies that hand gestures facilitate speech gestures. Conversely, speech gestures facilitate hand gestures through increasing the excitability of corticospinal pathways acting upon muscles of the preferred hand (Tokimura, Tokimura, Oliviero, Asakura, & Rothwell, 1996). The mutual facilitation of Broca's area and hand movements may hold the key to understanding speech–hand coordination at a neural level. The remarkable discovery by Gallese et al. (1996) of "mirror neurons" in Broca's area that are selectively activated by either observing or performing hand movements suggests a common neural basis for the perception and production of speech and hand movements (for a review see Rizzolatti & Arbib, 1998). Recently it has been reported that Broca's area, which is larger in the left hemisphere of humans, has a similarly asymmetric counterpart in the left hemisphere of three great ape species. This was taken to suggest that the neuroanatomical substrate for left-hemisphere dominance in speech production was present early in human evolution and is not unique to humans (Cantalupo & Hopkins, 2001).
Further, the existence of mirror neurons emphasizes that the human language system may be based upon gestural information (Corballis, 2002; Rizzolatti & Arbib, 1998).

Historically, studies of concurrent speech and hand movements have suggested that speech has an "interfering" effect on motor performance (Hammond, 1990; Hiscock & Chipuer, 1986). The classic dual-task studies and data of Kinsbourne and Hicks on concurrent speech and manual activity (e.g., dowel balancing; Hicks, 1975; Kinsbourne & Hicks, 1978; van Hoof & van Strien, 1997), although controversial, have been interpreted as showing that speech can interfere with and degrade motor performance. In right-handers (RH) the effect of simultaneous speech on hand performance is typically more pronounced for right-hand performance than for left-hand performance (Bathurst & Kee, 1994; Schmidt, Oliveira, Krahe, & Filgueiras, 2000) and is interpreted as a consequence of both speech production and right-hand performance being controlled by the same left hemisphere. However, when the concurrent speech–hand task is performed using the left hand, given that the left hand is primarily controlled by the right hemisphere while speech is primarily controlled by the left hemisphere, it has been argued that the two activities may proceed in parallel, with a lesser effect of speech on the performance of the left hand (Kinsbourne & Hicks, 1978). However, a functional bidirectional coupling between speech and finger movement has been observed, suggesting that the interaction between concurrently active effectors might better be viewed as coordination rather than interference (Chang & Hammond, 1987; Kelso, Tuller, & Harris, 1983; Whitall, 1996). A further advantage of a dynamics perspective on inter-effector coupling is that it can encompass non-intentional movements such as the vegetative processes underlying the coordination of respiration and locomotion (Amazeen, Amazeen, & Beek, 2001; Goldfield, Schmidt, & Fitzpatrick, 1999), as well as a wide range of phenomena involving the coordination of perception and action (Treffner & Morrison, 2001).

Recently it has been shown that in right-handers (RH) both the right and the left hand are influenced by the activity of the left hemisphere, while the right hemisphere does not influence the right hand (Schluter, Krams, Rushworth, & Passingham, 2001). In left-handers (LH) such an asymmetry has not been observed (Singh et al., 1998), which may explain the lack of an asymmetric effect of speech on hand performance typically observed in LH (Bathurst & Kee, 1994). Importantly, these findings suggest that the left hemisphere is dominant not only for speech but for action in general.

Further evidence has shown that perception and manual production of rhythms are impaired in left-hemisphere lesioned patients, suggesting that the left hemisphere is specialized for the production of movements in the temporal domain (Alcock, Wade, Anslow, & Passingham, 2000; Hammond, 1982; Wittmann, von Steinbüchel, & Szelag, 2001). A large body of research supports the notion of a left-hemisphere advantage for temporal resolution in both language and fine movements (Nicholls, 1996). Recently, it has been reported that rhythm activated the left hemisphere's Broca's area (Platel et al., 1997). A strong influence of the left hemisphere has also been reported in tasks requiring identification of consonant–vowel syllables, although the advantage was not observed in identification of steady-state vowels (Shankweiler & Studdert-Kennedy, 1967).
Similarly, a strong left-hemisphere advantage was observed for stop consonants presented at a rapid rate, while for slower rates this advantage was not observed (Schwartz & Tallal, 1980). The property of left-hemisphere involvement in fine temporal resolution is taken to be the basis for language perception and production, given that the production of speech requires rapid movements of the articulators and that perception of speech requires recovery of fast changes in linguistic gestures.

Rather than emphasizing sensorimotor interference (e.g., Kinsbourne & Hicks, 1978), recent approaches to biological coordination have emphasized the coherence of perception–action coupling via the concepts of synergies, synergetics, self-organized coordinative structures, and coordination dynamics (e.g., Haken, 1996; Kelso, 1995; Port & van Gelder, 1995; Turvey, 1990). Thus, a coordinative structure or synergy consists of multiple biomechanically and dynamically constrained components acting as a single functional unit (Kugler et al., 1980). The many microscopic degrees of freedom (e.g., cells, muscles) become "enslaved" or controlled by an emergent property – the "order parameter" – such as the relative phase between two rhythmically moving components (Haken, 1996). Order parameter dynamics simplifies the overarching problem of having to explicitly control many degrees of freedom (e.g., by a motor program). For the last two decades such a theoretical and experimental approach has driven numerous studies of interlimb coordination – both ipsilateral and contralateral between-limb coordination, and even between-person coordination (e.g., Haken, Kelso, & Bunz, 1985; Kelso, 1995; Kelso et al., 1983; Schmidt & Turvey, 1994; Turvey, 1990; Turvey & Carello, 1995).
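To make the order-parameter notion concrete, relative phase between two rhythmic effectors can be estimated from event onsets alone. The sketch below uses the standard point-estimate of relative phase (each target event expressed as a fraction of the reference cycle it falls in); the onset times are invented for illustration and are not data from any study cited here.

```python
import math

def point_estimate_relative_phase(ref_onsets, target_onsets):
    """Point-estimate relative phase (degrees) of each target event
    within the reference cycle that contains it."""
    phases = []
    for i in range(len(ref_onsets) - 1):
        t0, t1 = ref_onsets[i], ref_onsets[i + 1]
        period = t1 - t0
        for t in target_onsets:
            if t0 <= t < t1:
                phases.append(360.0 * (t - t0) / period)
    return phases

def circular_mean_deg(phases):
    """Mean direction of circular data, in degrees (0-360)."""
    s = sum(math.sin(math.radians(p)) for p in phases)
    c = sum(math.cos(math.radians(p)) for p in phases)
    return math.degrees(math.atan2(s, c)) % 360.0

# Hypothetical metronome beats (500 ms period) and taps lagging by
# roughly 12-15 ms: near in-phase coordination with a small phase shift.
beats = [0.0, 0.5, 1.0, 1.5, 2.0]
taps = [0.014, 0.512, 1.015, 1.513]
mean_phase = circular_mean_deg(point_estimate_relative_phase(beats, taps))
# mean_phase is about 10 degrees: close to, but not exactly, in-phase.
```

The circular mean is used rather than an ordinary average because phase is periodic (a mean of 359° and 1° should be 0°, not 180°).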
Importantly, the scale-independent theory of synergetics has most recently been successfully applied at the neurophysiological level and has helped confirm that dynamically coherent patterns of cerebral activity constrain human movement phenomena (e.g., Bressler & Kelso, 2001; Haken, 1996).

In a complex motor task such as speech, the predominant way in which coherence manifests is as rhythm. Rhythm can be seen as serving a coordinative function, since it may be understood as a physical strategy whereby the parts of a system are constrained in their relative timing by generic principles of non-linear dynamics (Cummins & Port, 1998; Kelso, 1995; Port, 2002; Port et al., 1995; Port, Tajima, & Cummins, 1999; van Lieshout, 2003). In linguistics, prosodic stress refers to a temporal relation between syllables, and variation in this relation constitutes the rhythm of an utterance. With reference to the findings of Treffner and Turvey (1993) on multifrequency coordination, Port et al. (1995) and Cummins and Port (1998) indicated that a dynamical systems approach to speech rhythm can account for prosodic timing. Employing a "speech cycling" paradigm in which speakers had to repeat a short phrase in time with a metronome, it was demonstrated that speakers exhibit a "harmonic timing effect" whereby a stressed syllable occurred at harmonic points within the overall cycle of phrase repetition. Speakers exhibited a strong preference for three distinct phases (1/3, 1/2, 2/3) and were incapable of placing the stressed syllable elsewhere.
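The harmonic timing effect can be illustrated with a minimal sketch: stressed-syllable onsets, expressed as a phase within the phrase-repetition cycle (0 to 1), are assigned to the nearest of the three harmonic attractors. The observed phase values below are invented for illustration, not data from the speech-cycling studies cited.

```python
# Harmonic attractors reported in the speech-cycling paradigm.
ATTRACTORS = (1/3, 1/2, 2/3)

def nearest_attractor(phase):
    """Return the harmonic attractor closest to an observed phase (0-1)."""
    return min(ATTRACTORS, key=lambda a: abs(a - phase))

# Hypothetical stressed-syllable onset phases within the phrase cycle.
observed = [0.34, 0.31, 0.52, 0.49, 0.66, 0.35, 0.50]

# Group each observation under its nearest attractor.
clusters = {}
for p in observed:
    clusters.setdefault(nearest_attractor(p), []).append(p)
# Each observation falls near 1/3, 1/2, or 2/3 -- none near, say, 0.4,
# consistent with speakers being unable to place stress elsewhere.
```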
The constraints responsible have been interpreted from a coupled-oscillator dynamical systems perspective and may be identical to those found in research on within-person polyrhythm production (e.g., Kelso & DeGuzman, 1988; Peper, Beek, & van Wieringen, 1995; Schmidt, Beek, Treffner, & Turvey, 1991; Treffner & Turvey, 1993) as well as in between-person coordination experiments (Schmidt & O'Brien, 1997; Schmidt & Turvey, 1994) that indicate the influence of dynamic constraints described by integer ratios (Treffner, 1999). Cummins and Port (1998) took the emergence of preferred phases as evidence that the speech system is coordinated such that subordinate processes are governed by higher-order dynamical constraints.

A well-known example of experimentally exploring biological synergetic systems is the paradigm pioneered by Kelso, Turvey, and colleagues (e.g., Kelso, 1995; Amazeen, Amazeen, & Turvey, 1998b). As we plan to exploit this procedure in the proposed experiments, it is instructive to provide a summary through example. Consider an individual who is asked to rhythmically tap his or her right hand and right foot in synchrony (in-phase coordination) at the same tempo (1:1 frequency-locking) (e.g., Baldissera, Cavallari, & Tesio, 1994; Carson, Goodman, Kelso, & Elliot, 1995). Such "absolute coordination" is for most persons not difficult to achieve. However, if careful measurement were made, one would find that instead of perfect in-phase coordination (relative phase = 0°), one effector might be slightly ahead of the other (e.g., the hand might lead the foot) such that the phase difference or relative phase between limbs is no longer 0°. In this example a phase shift is said to have occurred from the potentially achievable state of perfect synchrony (e.g., relative phase = 0°) to a slightly asynchronous case (e.g., relative phase = 10°).
Further, if the individual was asked to maintain such in-phase coordination between hand and foot while a pacing signal increased in rate (e.g., a metronome was increased
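The coordination dynamics underlying this tapping paradigm are conventionally modelled by the asymmetric HKB relative-phase equation, dφ/dt = Δω − a·sin(φ) − 2b·sin(2φ), where the ratio b/a decreases as movement rate increases and Δω (detuning) captures asymmetry between the components. The sketch below integrates the equation numerically; all parameter values are illustrative, not fitted to any data from these experiments.

```python
import math

def simulate_hkb(phi0, delta_omega, a, b, dt=0.001, steps=20000):
    """Euler-integrate the asymmetric HKB relative-phase equation:
        dphi/dt = delta_omega - a*sin(phi) - 2*b*sin(2*phi)
    phi in radians; returns the final relative phase."""
    phi = phi0
    for _ in range(steps):
        phi += dt * (delta_omega - a * math.sin(phi)
                     - 2 * b * math.sin(2 * phi))
    return phi

# Slow movement (b/a = 0.5 > 0.25): anti-phase (phi = pi) remains stable,
# so a start near pi settles back onto anti-phase coordination.
slow = simulate_hkb(phi0=math.pi - 0.1, delta_omega=0.0, a=1.0, b=0.5)

# Fast movement (b/a = 0.1 < 0.25): anti-phase loses stability and the
# system switches to in-phase coordination (phi = 0) -- a phase transition.
fast = simulate_hkb(phi0=math.pi - 0.1, delta_omega=0.0, a=1.0, b=0.1)

# Non-zero detuning shifts the in-phase attractor slightly away from 0,
# i.e., the small phase shift (e.g., ~10 degrees) described in the text.
shifted = simulate_hkb(phi0=0.0, delta_omega=0.25, a=1.0, b=0.5)
```

Linearizing around φ = π shows why b/a = 0.25 is the critical value: anti-phase is stable only while a − 4b < 0, so speeding up (shrinking b/a) destabilizes it and forces the switch to in-phase.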