Face-to-face conversation is a type of “talk in interaction” (Sacks et al. 1974: 720) and a basic setting where language is used (Fillmore 1981; Chafe 1994; Clark 1996). People regularly produce meaningful, visible bodily actions, which may be combined with speech, to describe, depict, and index meanings across specific times and contexts as parts of composite utterances (Enfield 2009; see also Kendon 2014; Ferrara & Hodge 2018). There is a growing body of research that demonstrates the details of this multimodal integration (e.g., Goodwin 1981; 1986; Clark & Wilkes-Gibbs 1986; Bavelas 1990; Wilkes-Gibbs 1997; Abner et al. 2015; Cooperrider 2016; Keevallik 2018) as well as how people use such actions to regulate emerging interaction (e.g., Goodwin 1981; 1986; Parrill 2008; Enfield 2009; Jokinen 2009; Mondada 2014; Shaw 2019).1
The multimodal and semiotically diverse nature of face-to-face conversation has implications for language theory. However, in linguistics, theoretical work has generally disregarded this bodily aspect of conversation and has above all prioritized speech (and writing). In addition, there has been a preoccupation with only the most conventionalized and symbolic elements of composite utterances and their expression of referential, propositional meanings. The result has been theory building focused on the most symbolic instances of speech, while non-speech, bodily actions that are symbolic, as well as speech and bodily actions that express indexical or depictive meanings, have been ignored or selectively discussed (see Dingemanse 2017; Ferrara & Hodge 2018 for more discussion of this bias).
The current study challenges this theoretical bias by investigating the interactional (rather than propositional) meanings of pointing in Norwegian Sign Language conversations. Pointing is generally defined here as a meaningful bodily movement that directs attention toward an area of space (Clark 2003; Kendon 2004; Cooperrider et al. 2018b). This study focuses in particular on finger pointing, a practice known to be frequent and multifunctional in both signed and spoken language interaction. The referential (e.g., pronominal) meanings that finger pointing expresses are well documented across signed languages (see Section 1.2), but complementary interactional meanings have yet to be fully examined. Both the referential and interactive uses of finger pointing, however, affect how language conventions emerge over time and how these conventions intertwine with other semiotic actions. A survey of the interactional meanings, or functions, of finger pointing in face-to-face signed language conversations will be used here as further evidence that the language of conversation is highly indexical and that this indexicality contributes to the coordination of emergent interactions. In this way, the analysis here aligns with previous and current work that considers the “pragmatic” and context-dependent meanings ever present in human interaction as well as the role indexicality plays in language theory (see e.g., Malinowski 1923; Kress 1976; Silverstein 1976; Washabaugh 1981; Halliday 1985; Hanks 1992; Johnston 1992; Clark 1996; Langacker 2001; Hayashi 2005; Ginzburg & Poesio 2016; Keevallik 2018 for a selection of diverse examples of this position).
There is a large body of work that investigates manual pointing in signed languages, and this work shows that signers frequently point with their hands to refer to themselves and others, as well as other visible and invisible referents. In addition, finger points can serve locative and determinative functions (e.g., Engberg-Pedersen 2003; Liddell 2003; Nilsson 2004; Johnston 2013; Nordlund 2019; for thorough reviews of the literature see Cormier et al. 2013; Meier & Lillo-Martin 2013).2 For example, in a corpus investigation of manual pointing in Auslan (the signed language used in Australia)—the largest empirical study of signed language pointing to date—Johnston (2013) investigated 5,797 tokens of manual pointing, which occurred across a dataset of questionnaire responses, retellings, and personal narratives, and found that the primary functions of these points were to identify a referent, identify a location, or specify a signed referent as somehow given or known.
A few additional observations in the literature suggest that signers also point to index and regulate an emerging interaction. For example, one early study on turn-taking in American Sign Language mentioned that manual indexing can be used for turn management and conversational feedback (Baker 1977). In a later study on Flemish Sign Language, Van Herreweghe (2002) found that signers use pointing for turn-taking functions in meetings. The goal of the current study was to investigate these, as well as other, interactional meanings of finger pointing in more detail, as they occurred across a corpus of naturalistic Norwegian Sign Language conversations.
To further contextualize and frame the study, it is useful to consider research findings on the social meanings that visible, bodily actions produced by speakers and signers can express in interaction. In particular, the following sections will present examples of research into how people use manual (and other bodily) actions while they talk and sign to serve interactional (rather than referential) functions. These functions, which have also been labeled as “pragmatic,” will provide the starting point for the current study’s analysis of finger pointing, the method of which is detailed in Section 2.
Mirroring research priorities in linguistics, gesture researchers have also focused primarily on the referential meanings and uses of different kinds of manual and non-manual bodily actions (see Streeck 2009a and Kendon 2017 for useful reviews). Even so, some researchers have examined the “pragmatic” functions of gestures: for example, to mark topic/comment structure or directive speech (Seyfeddinipur 2004), to mark focal discourse or to emphasize (Neumann 2004), to present or receive units of discourse (Müller 2004), to express socio-affiliational meanings (e.g., Enfield et al. 2007), or to express epistemic meanings or stance (Kendon 1995; Streeck 2009a; b; Deppermann 2013; Cooperrider et al. 2018a). Speakers (and signers) also use their hands and bodies to manage emerging interactions (which will be detailed further in the following sections).
These select examples are complemented by additional literature, which represents an interdisciplinary (e.g., linguistics, anthropology, gesture research) interest in the “pragmatic” meanings signers and speakers express with their bodies, which engages a range of scientific and analytical approaches (e.g., experimental, corpus-based, conversation analysis). One of the key findings from this research is that speakers and signers engage their whole body to coordinate their conversations and interactions, especially in relation to turn-taking (e.g., Baker 1977; Schegloff 1984; Heath 2004; Hayashi 2005; Iwasaki 2009; Deppermann 2013; McCleary & Leite 2013; Mondada 2013). In this way, visible bodily actions help speakers maintain the orderliness of conversation in spoken language contexts (Mondada 2007; Streeck 2009a; b; Kamunen 2018) and signed language contexts (e.g., Coates & Sutton-Spence 2001; McCleary & Leite 2013; Girard-Groeber 2015; de Vos et al. 2015). In addition, such actions can be polyfunctional, expressing multiple pragmatic meanings at once or combining pragmatic with referential meanings (e.g., Johnston 1992; Jokinen 2009; Streeck 2009a; Healy 2012; Lepeut 2018; submitted; Gabarró-López 2020; see also Goodwin 1986).
As a way to begin surveying the wide range of interactional meanings of co-speech gesture as well as some signed language practices, early work by Bavelas and colleagues (see below) will be used as an initial framework. Additional literature will be used to expand and comment on this initial functional typology. This limited review will focus primarily on the interactional functions of co-speech manual gestures (and to some extent some particular signed language practices), as opposed to other types of functions, such as some of those mentioned in the previous paragraphs.
Bavelas and colleagues (Bavelas et al. 1992; Bavelas 1994; Bavelas et al. 1995) investigated a type of manual gesture that took the basic form of the finger(s) or palm(s) being oriented towards an interlocutor (and as such represented a type of manual pointing action). They found that speakers would use these gestures not to refer to referents in the discourse, but rather to relate to fellow interlocutors and the interaction itself. Experiments showed that speakers used these gestures more often when they were with other people who they could see (Bavelas et al. 1992) and that these gestures elicited responses from interlocutors (Bavelas et al. 1995). These experiments demonstrated that interactional gestures serve social functions related to the coordination of a conversation, and that they are different from topic gestures (i.e., referential gestures that might depict how a referent looks or acts). These gestures functioned to deliver information, cite previous contributions, seek responses, and manage turns. Each of these functions will be described and supported by additional literature in the following sub-sections. It is this literature, coupled with the annotation of the study corpus, that led to the interactional categories identified and investigated for this study (detailed further in Section 2 below).
Delivery gestures are those that hand over information from the speaker to an interlocutor. They also mark common ground and digressions, or signal information that an interlocutor should elaborate themselves (Bavelas 1994). These gestures were found to be more frequent in contexts with experimentally induced common ground (Holler 2009). Other research has shown that palm up actions also work to deliver information to interlocutors in spoken (e.g., Müller 2004; Streeck 2009a; Lepeut 2018; submitted; Shaw 2019) and signed language interaction (e.g., Lepeut 2018; submitted; Shaw 2019). However, the use of finger pointing for these delivery functions in signed language interaction has yet to be documented or described. This study begins to redress this gap in the literature by examining some examples of this type of pointing in Norwegian Sign Language.
Citing gestures refer back to previous contributions in the interaction. For example, interlocutors can either signal that the current point being made was made previously by another interlocutor or they can show that a response by an interlocutor to a turn at talk was understood. Finger pointing and palm up actions have been shown to be used for this function in both signed and spoken language interaction (Bavelas 1994; Kendon 2004; Lepeut 2018; Shaw 2019).
Seeking gestures aim to elicit a response from an interlocutor. They can be used to check if an interlocutor is following or agreeing with ongoing talk (Bavelas et al. 1992). Alternatively, such manual indexes can request help with finding what to say (Bavelas et al. 1992; Bavelas 1994; Streeck 2009a; Sikveland & Ogden 2012; Lepeut 2018; submitted; Shaw 2019). Similarly, additional studies have shown that speakers can also produce iconic, or representational, gestures (that somehow depict the referent) or word search gestures during moments of lexical retrieval (e.g., clicking or wiggling one’s fingers) (e.g., Goodwin & Goodwin 1986; Streeck 2009b; Holler et al. 2013). In the current study, the use of finger pointing as a way to seek responses from interlocutors is in focus.
A final function of interactional gestures identified in the early studies by Bavelas and colleagues (Bavelas et al. 1992; Bavelas 1994; Bavelas et al. 1995) related to the management of turn-taking. Turn-taking functions have received the most attention from researchers, especially within gesture research and multimodal conversation analysis. Studies have detailed how speakers and signers use manual and non-manual gestures to secure/self-select for a turn at talk (e.g., Streeck & Hartge 1992; Bavelas 1994; Mondada 2007; Streeck 2009b for spoken language interaction, and e.g., Baker 1977; McIlvenny 1995; Van Herreweghe 2002 for signed languages). Interlocutors also use bodily actions to hold turns, either to coordinate talk from other interlocutors or, for example, as a way to create opportunities for them to seek information (Streeck & Hartge 1992; Mondada 2007; Sikveland & Ogden 2012; Groeber & Pochon-Berger 2014; Ryttervik 2015).
In addition to taking turns and holding turns, speakers and signers also produce bodily actions as a way to complete turns or give next turns, which may involve various forms of open palm or pointing.3 These practices vary depending on the setting, for example, classroom interaction (Kääntä 2012) and instructional contexts (Keevallik 2014), meetings (Van Herreweghe 2002; Mondada 2013; see also Keevallik 2014), and everyday conversation (Baker 1977; Streeck 2009a; Li 2014; Ryttervik 2015; Lepeut 2018). This research examines how interlocutors combine speech, hand, body, and face movements to coordinate turn transitions. In the current study, the use of finger pointing for this function in signed language interaction is examined in more detail.
An additional function that needs to be introduced, but which was not observed in the Bavelas experiments (Bavelas et al. 1992; Bavelas 1994; Bavelas et al. 1995), relates to conversational feedback and backchanneling. In particular, speakers and signers not only seek following or agreement from others (outlined in Section 1.3.3), but they can also give feedback responses through various bodily actions. These actions visibly indicate that someone is an active participant in the ongoing talk and show that they are following or even agreeing with what is being said (e.g., Baker 1977; Healy 2012; Ryttervik 2015; Mesch 2016; Gabarró-López 2020). These visible actions can be manual (e.g., as co-speech gestures, or signed lexical phrases) or non-manual (e.g., in the form of head nods or eye gaze). Spoken response particles, lexical phrases, or other vocal activities such as laughing can also be used (e.g., Coates & Sutton-Spence 2001; see also Deppermann 2013 for such actions at turn beginnings). In this study, finger pointing and its function as conversational feedback is considered, thereby adding to this area of research.
As mentioned in Section 1.2, most research on manual pointing in signed languages has focused on referential functions. However, the review of the literature above demonstrated that signers and speakers also point to index aspects of the ongoing talk itself and to serve various interactional functions. This study adds to this literature and supports the anecdotal reports in Baker (1977) and Van Herreweghe (2002) by examining how finger pointing serves interactional functions in signed language conversation. Findings will be compared with those from research on spoken language interaction and will add to a broader understanding of how finger pointing works in signed languages. Before this, though, the following sections describe the data used for the study and provide details about data annotation and analysis.
The data for this study come from video recordings of 11 informal conversations in Norwegian Sign Language about a variety of everyday topics, e.g., stories of growing up, the history of the deaf community, and vacation travels. Recordings were made in various university locations (meeting rooms or classrooms) or at a local deaf association. The aim of the sessions was to elicit spontaneous, and as natural as possible, conversation in Norwegian Sign Language. Many of the participants knew each other and the research assistants. These recordings were collected during two earlier projects, which were approved by the Norwegian Centre for Research Data (#42133 and #55097). Participants consented to the use of their data and images in research and teaching activities.
The filmed conversations involved between two and five Norwegian signers each, and the analysis here examined 3.4 hours of signing by 21 different signers (15 women and 6 men). Table 1 provides a summary of the data in the study corpus.4
| Conversation | Type | Minutes of recording (minutes of signing annotated)5 | Location filmed | Participant | Gender | Age | Age of acquisition |
The data for this study represent in some ways the diversity inherent in the Norwegian Deaf community by including signers with various backgrounds, in terms of their age, where they live, and when they acquired Norwegian Sign Language (age of acquisition). Although this study includes signers who report learning Norwegian Sign Language between the ages of 8 and 12—and who are thus often considered late learners by linguists—all participants report that they consider themselves members of the deaf community and use Norwegian Sign Language in their daily private and public lives. They, along with the other participants who learned Norwegian Sign Language before the age of seven, are considered able to provide some preliminary insight into the use of finger pointing for interactional functions in the Norwegian deaf community.
The video recordings outlined above were reviewed and annotated in ELAN (Wittenburg et al. 2006). The use of ELAN facilitated the time-alignment of annotations, created on various user-defined tiers, with the video source, allowing the primary data to always remain in view (Crasborn & Sloetjes 2008).6 First, manual signs were identified and annotations were created on two tiers: a right and a left hand gloss tier. Labels within these annotations identified tokens of fingerspelling, pointing signs, depicting signs, or manual constructed actions. Empty annotations indicated signs presumed to be fully lexical, which await assignment of a unique identifier—or ID-gloss—from the Norwegian Sign Language lexical database, currently under development (see Johnston 2010; 2016 for more information about the use of ID-glosses in signed language annotation as well as using labels to identify different types of signs).
After the data were tokenized for manual signs, all tokens of pointing were revisited and assigned a subtype, when possible, based on the sign’s meaning in context. These subtypes were initially based on the types of points identified in the Auslan Corpus (Johnston 2008) and described in the Auslan corpus annotation guidelines (Johnston 2016). As annotation of the current dataset progressed, several other subtypes were added, including the interactional points that are the focus of this study. Importantly, tokens of pointing were tagged for multiple functions when warranted, or when the context was ambiguous. For example, a token point could be tagged as both pronominal and locative, or as both interactional and pronominal.
After tokens of pointing across the dataset were tagged for function, all the interactional points were again revisited and tagged for their particular interactional function(s) based on the context of the token. Additional tiers in ELAN were used to do this—two Main Function tiers and two Specific Function tiers (which accommodated tokens serving multiple interactional functions). The particular functions tagged on these tiers were based initially on the functions of interactional gestures in spoken English observed by Bavelas and colleagues (Bavelas et al. 1992; Bavelas 1994; Bavelas et al. 1995). These categories were then amended and expanded upon in the light of more recent literature (see review in Section 1.3) as well as the data encountered during annotation of the study corpus. This type of iterative and reciprocal interaction between the data annotation and literature/theory is common in studies employing corpus methods (McEnery & Hardie 2012: 158). The full list of interactional functions observed and tagged in the current dataset, grouped into main and specific functions, is provided in Table 2.
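The multi-tier tagging scheme described above can be sketched in simplified form. The representation below is a hypothetical in-memory stand-in for the ELAN tiers (the field names and tag labels are invented for illustration, not the study's actual annotation values):

```python
# Minimal sketch of the multi-tier tagging scheme: each pointing token
# carries time alignment plus one or more function tags, with separate
# main and specific interactional tags. All labels here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class PointToken:
    start_ms: int                   # time-aligned onset in the video
    end_ms: int                     # time-aligned offset
    functions: list = field(default_factory=list)               # e.g. ["pronominal", "interactional"]
    main_interactional: list = field(default_factory=list)      # e.g. ["citing"]
    specific_interactional: list = field(default_factory=list)  # e.g. ["general citing"]

def interactional_tokens(tokens):
    """Tokens tagged (possibly among other functions) as interactional."""
    return [t for t in tokens if "interactional" in t.functions]

def multifunctional_tokens(tokens):
    """Tokens tagged for more than one function (e.g. pronominal + interactional)."""
    return [t for t in tokens if len(t.functions) > 1]

# Example: one purely pronominal point and one dual-function point
tokens = [
    PointToken(1000, 1120, functions=["pronominal"]),
    PointToken(5000, 5110, functions=["pronominal", "interactional"],
               main_interactional=["citing"],
               specific_interactional=["general citing"]),
]
```

This mirrors how a token can surface in later counts both as a referential point and as an interactional one.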
Delivery finger pointing, as a group, refers to the delivery of information by a signer to an interlocutor:
- Shared information marks material that an interlocutor probably already knows—information that is part of their common ground (Clark & Brennan 1991). It means, essentially, ‘As you know.’
- Digression marks information that should be treated by an interlocutor as an aside from the main point. Analogous to ‘By the way’ or ‘Back to the main point.’

Citing finger pointing refers to a (previous) contribution by an interlocutor:
- General citing indicates ‘as you said earlier’/‘what you are saying,’ that is, the point the signer is now making had been contributed earlier by the interlocutor. This pointing action can also be produced by an interlocutor to respond to another signer: ‘right, as was (just) mentioned (earlier)’ (as a group or as an individual).
- Acknowledgement of an interlocutor’s response indicates that the signer saw or heard that an interlocutor understood what had been said. Paraphrased, ‘I see that you understood me.’

Seeking finger pointing aims to elicit a specific response from an interlocutor:
- Seeking help requests a word or phrase that the signer cannot find at the moment. A verbal paraphrase would be ‘Can you give me the word for…?’
- Seeking alone is a word-searching action that does not request help from an interlocutor.
- Seeking agreement (/confirmation) asks whether an interlocutor agrees or disagrees with the point being made. Analogous to ‘Do you agree?’
- Seeking following asks whether an interlocutor understands what is being said. Verbal equivalents include ‘you know?’ or ‘eh?’ at the end of a phrase.

Turn-regulating finger pointing refers to issues related to turn management:
- Giving turn hands a turn over to another interlocutor. As if to say, ‘Your turn.’
- Taking turn accepts a turn from an interlocutor. Paraphrased as ‘OK, I’ll take over.’ These points can also be produced to self-select for the next turn.
- Turn open indicates that it is anyone’s turn, as if to say, ‘Who’s going to talk next?’
- Look guides other interlocutors’ gaze/attention to the current signer (in case they are looking at the wrong person).
- Holding turn allows a current signer to continue their turn after a pause or while another signer adds a comment. It can also help guide eye gaze.

Feedback finger pointing shows involvement from non-current signing interlocutors (also known as backchanneling):
- Showing following indicates that an interlocutor understands what is being said.
- Showing agreement indicates that an interlocutor agrees with what is being said.
The analysis and tagging of interactional finger pointing in the data occurred over multiple parses of the data by one annotator. All data were reviewed at least twice, in some cases, three times (in addition to the initial identification parses). Each token was scrutinized in context against the backdrop of the literature for potential interactional functions. This involved examining the signer’s bodily actions that occurred simultaneously and immediately surrounding the token, as well as the actions and reactions of other interlocutors. If there was uncertainty around a token in regard to either its main or specific function, this was indicated in the tag with a question mark.
After the annotation of the interactional pointing and their functions was completed, a second annotator was trained and then given a randomized 10 percent sample of the interactional pointing tokens (n = 40).7 They tagged these tokens for main and specific interactional functions. Cohen’s kappa (a measure of inter-rater reliability) between the main annotator and this second annotator on the main categories was calculated at 0.58 (95% CI: 0.44–0.72). This level of moderate agreement was judged satisfactory here, given the exploratory, non-experimental nature of this study. It demonstrates that the interactional meanings coded here have some level of validity. As work on these types of interactional (rather than referential) meanings expands and develops, linguists will be able to collectively establish more robust standards for this type of annotation and analysis. This study presents but one contribution to this early effort.
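Cohen's kappa can be computed directly from two annotators' label sequences. The sketch below uses invented toy labels for illustration, not the study's annotations:

```python
# Hedged illustration of Cohen's kappa for two annotators' categorical
# tags. The toy label lists below are invented, not the study's data.
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two equal-length sequences of categorical labels."""
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    # Observed agreement: proportion of tokens where both annotators agree
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Expected chance agreement, from each annotator's marginal distribution
    count_a, count_b = Counter(labels_a), Counter(labels_b)
    p_e = sum((count_a[k] / n) * (count_b[k] / n) for k in count_a)
    return (p_o - p_e) / (1 - p_e)

# Toy example: agreement on 3 of 4 tokens, chance agreement 0.5 -> kappa 0.5
toy_a = ["turn-regulating", "turn-regulating", "feedback", "feedback"]
toy_b = ["turn-regulating", "feedback", "feedback", "feedback"]
kappa = cohens_kappa(toy_a, toy_b)
```

Note that kappa discounts agreement expected by chance, which is why moderate values (such as the 0.58 reported here) can still reflect substantial overlap in raw agreement.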
In the following sections, findings from an analysis of the annotations outlined above are reported. In addition, examples will be detailed to demonstrate how signers use these finger points to coordinate conversational moves. The frequency and function of these points indicate that they are an important feature of (Norwegian) signed language conversation. As a final note, the data, ELAN files, and supplementary materials used for this study are openly available via the Open Science Framework at https://osf.io/g8zv6/.
The annotation of the data resulted in the identification of 21,265 manual sign tokens (as annotated on the dominant hand ID-gloss tier), of which 19.62% (n = 4,172) were tokens of pointing. These manual points served a variety of functions, as described above, with most points (n = 2,318, 55.5%) indexing referents. In addition, however, a number of points (n = 345, 8.3%) were observed to serve the interactional functions addressed in this study. See Table 3 for a full summary of the frequency and distribution of different types of pointing across the study corpus. As mentioned in Section 2.2, a token point could be tagged for more than one function, and this is reflected in the figures reported. In total, 4,172 points were tagged for 4,305 functions. Importantly, the 133 tokens tagged for more than one function were often pairings of an interactional function with either a determiner (10 tokens), locative (7 tokens), or pronominal (64 tokens) function. This multifunctionality is reflected in Table 3 and will be revisited in the discussion (Section 4).
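The aggregates above are internally consistent, as a small arithmetic check shows (the figures are the reported totals, not raw annotation data):

```python
# Sanity check on the reported aggregates from the study corpus.
total_sign_tokens = 21265
pointing_tokens = 4172
interactional_points = 345
function_tags = 4305
multi_tagged = 133  # tokens tagged for more than one function

# Pointing made up 19.62% of all manual sign tokens
assert round(pointing_tokens / total_sign_tokens * 100, 2) == 19.62

# Interactional points were 8.3% of all pointing tokens
assert round(interactional_points / pointing_tokens * 100, 1) == 8.3

# The surplus of function tags over tokens equals the number of
# multiply-tagged tokens, consistent with each carrying exactly two tags
assert function_tags - pointing_tokens == multi_tagged
```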
| Function | Frequency | Percentage of pointing tokens (n = 4,172) |
While it is common practice in signed language corpus linguistics to calculate sign frequency using annotations on the dominant hand gloss tier, signers can of course produce signs independently on their non-dominant hand. Further examination revealed that signers do produce interactional points with their non-dominant hand, independently of the dominant hand. These tokens are included in the analysis presented below and raise the total number of interactional pointing tokens investigated here to 418.
In their studies of spoken English conversations, Bavelas and colleagues (Bavelas et al. 1992; Bavelas 1994; Bavelas et al. 1995) identified a variety of functions that interactional gestures can serve, without providing much detail about which functions were more or less common. While this is a preliminary study aimed at exploring some of the interactional meanings of finger pointing by signers in the Norwegian deaf community and their theoretical importance, it is still useful to see which types of functions were used more often than others. Table 4 presents the frequency of tokens for each of the main interactional functions. Note that these figures represent only the interactional pointing whose function was identified with certainty (as judged by the main annotator across multiple parses). Tokens whose main or specific function was uncertain represented 2.4% (10/418) of the interactional pointing tokens. These tokens have been removed from the remaining analysis until they can receive further scrutiny. It is also important to reiterate that each token could be tagged for multiple functions (in the same way that a point could be tagged as, for example, both pronominal and interactional). Of the 408 tokens of certainly identified interactional finger pointing, 58 tokens were tagged with two interactional functions, which explains the total of 466 main functions reported in Table 4. The most frequent pairings of interactional functions were turn-regulating with citing (13 tokens), with feedback (10 tokens), and with seeking (11 tokens). Some examples of such dual-function points are detailed in the following sections.
| Main function | Token frequency | Percentage (n = 408) |
The figures reported in Table 4 show that the Norwegian signers in the study corpus most often used interactional pointing to regulate turn-taking, e.g., taking and giving turns. However, a number of tokens also functioned as feedback, both to show agreement and to show that an interlocutor was following/understanding the signer. Compare the fairly numerous tokens of interactional pointing for turn-regulating functions (n = 211) and feedback (n = 132) with the relatively few citing tokens (n = 42). Furthermore, signers in this dataset rarely used pointing for delivery functions (n = 18). In the following sections, examples of these different functions of interactional pointing are detailed and discussed, presented in order from least to most frequent.
As mentioned above, interactional finger pointing serving delivery functions—shared (n = 16) and digression (n = 2)—was fairly rare in this dataset. However, in order to demonstrate the category, a token is described here from a conversation between two deaf interlocutors and one hearing interlocutor. The two deaf interlocutors work together, and they were replying to a question from the hearing interlocutor about how to get to the kindergarten from their offices (which are all located on the same campus). First, the male deaf signer had described his way of getting there (which entailed a path through buildings). Then the female deaf signer begins describing an alternate path that goes outside the buildings. She explains that the reason for this alternative path is that there had been renovations inside the buildings, so they are no longer easily passed through. She explains that where one used to be able to walk through two buildings, the way is now blocked off. The example in Figure 1 begins as the signer inserts an aside to mention that this particular part of her workplace used to be a carpentry workshop.
The signer TVG (the deaf, female signer on the left in the double video frame in Figure 1) signs PT:LOC ACTUALLY BEFORE IT-IS WORKSHOP, ‘actually, a long time ago that place was a workshop.’ During this utterance, she is looking at the hearing interlocutor (the woman in the black shirt in Figure 1). However, as she produces the sign WORKSHOP, she shifts her gaze to her male colleague and maintains this gaze direction as she produces a very brief (0.12 seconds) interactional finger point towards him (shown in the video frame in Figure 1). She then shifts her gaze back to the hearing interlocutor and continues without pause to clarify the initial part of her comment, CARPENTRY FACTORY BEFORE, ‘or a carpentry workshop—factory—before.’
The male colleague (abbreviated as TJ in Figure 1) appears to respond to this point directed at him by providing additional information (YES AND KITCHEN, ‘yes and a kitchen’), which suggests he does indeed know what and where TVG is talking about. He signs this while TVG is still signing, and his comment overlaps with her ongoing talk. TVG notices his movement and shifts her gaze back to him as she finishes signing BEFORE and he produces the signs YES AND. However, TVG does not pause her signing but instead shifts her gaze immediately back to the hearing interlocutor and thus does not see TJ finish his comment. TVG continues describing the way to the kindergarten.
The interactional finger point in this example was analyzed as indicating shared information between the two deaf interlocutors (TVG and TJ). TVG knew that her colleague TJ was familiar with the history of their workplace, and her point indicated as much. This token was not interpreted as a comprehension check, because this sequence was mainly directed at the hearing interlocutor who had originally asked the question (evidenced by TVG’s body orientation and gaze). Even in the brief moment that TVG re-directed her gaze towards TJ during her finger pointing, she did not wait to see if he responded and immediately directed her gaze back towards the hearing interlocutor. As this specific example shows, signers can index their interlocutors to indicate shared common ground, although it does not seem to be a very frequent behavior in this study corpus.
The Norwegian deaf signers in this study also used finger pointing to cite the contributions of other interlocutors, similar to what has been observed in some spoken language contexts (Bavelas 1994; Kendon 2004). Specifically, these finger points worked in most cases to cite things said earlier in the discourse (n = 29), or sometimes they were used to acknowledge a response from an interlocutor (n = 13). In some cases, these tokens simultaneously functioned as turn-initiators or even as feedback, as signers would link their upcoming talk with what had just been said or provide a backchannel response acknowledging what was being signed by another interlocutor.
An example of a point that cites previous discourse is illustrated in Figure 2 and occurs as part of a conversation with three deaf interlocutors. Earlier in the conversation, OIS (the deaf man sitting on the right in the double frame in Figure 2) asks the other two interlocutors if they had noticed differences in how different students signed when they were at a particular vocational training school for the deaf. EB (the deaf woman sitting in the middle in Figure 2) responded that the signing was very different indeed and that the students from Trondheim signed “ugly” while the students from Holmestrand signed “pretty.” The conversation continues, and after a while TR (the deaf man sitting on the left in the double frame in Figure 2) begins to recount what he remembers from that time. He says that he did see differences among the students’ signing at the vocational deaf school, because the students came from many different places. The example begins as TR explains how he reacted to seeing those strange signs, ‘There I remember seeing strange signs’ (top row in Figure 2). He then shifts his gaze to EB and signs ALSO PT:INT, ‘and also like you said.’ By pointing towards EB, TR effectively cites the comments that EB produced a minute earlier about some signing being ugly and some being pretty. Note that the finger point, although directed towards EB, is not interpreted to index EB as a referent, as a way to say ‘you.’ It mainly functions to index her previous comments. Then TR explains that after a while, the signing got mixed together and it was fine. He checks this assessment with EB by looking at her while signing FUNCTION OKAY. EB nods in response and agrees, ‘yes, fine, after a while’ (bottom row in Figure 2).
The interactional finger point produced by TR, coupled with a directed eye gaze towards EB, effectively indexed EB’s comment made a minute earlier in the conversation. In this way, TR’s finger pointing linked his stretch of talk with what had gone on before and allowed him to add to the discussion by emphasizing that, in the end, everyone was signing together.
Finger points produced by the participants in this study were sometimes used to seek information from themselves or from other interlocutors (n = 63). In most cases, these points functioned to solicit feedback on whether an interlocutor was following (n = 32) and/or agreed with what the signer was saying (n = 14). In only a few cases did signers point as a way to seek help with information (n = 5) or to indicate that they themselves needed time to think of what they wanted to say (n = 12).
In one example, a (deaf) signer is talking about a new highway that is being built (Ferrara & Ringsø 2017–2018, DPNTS_Tr_IGB.eaf, 6:58.457–6:59.582). Just after she signs, ‘Now, they are building the highway,’ she points to her (hearing) interlocutor to confirm that the interlocutor knows where and what she is talking about. This point is co-produced with a forward head nod and a squinting of the eyes. This action elicits an immediate confirmation from her interlocutor in the form of a head nod and a silently mouthed ja (‘yeah’). In this way, the signer confirms common ground with her interlocutor, which can be used to orient future conversational moves. These types of comprehension checks were fairly common in the data, as signers frequently checked to see if their interlocutors were following what they were saying.
Signers also pointed as a way to signal that they needed time to think of what to say. An example of this is provided in Figure 3, which begins with a question that TR2 (the woman on the right of the video frame, who is hearing) asks EMN (the woman on the left of the video frame, who is deaf) about where EMN works. EMN replies, ‘in Ranheim,’ which involves a locative point on her right hand (PT:LOC, shown in the top row of Figure 3). EMN holds this point while TR2 looks up as if thinking about where this location is in town, after which TR2 nods and mouths the Norwegian word for ‘yes.’ Perhaps because of TR2’s slightly delayed response, EMN produces an interactional point on her left hand to start a new turn and index the question TR2 had just posed (an example of citing as well as turn self-selection, see the leftmost image in the middle row of Figure 3).9
Then EMN shifts her gaze upwards and to the side while she signs KNOW and points again to her interlocutor (also with her left hand, see the middle and rightmost images in the middle row of Figure 3). This second point lasted 1.5 seconds and functioned to index her interlocutor while giving EMN time to think of how to explain the location of her workplace. The shift in gaze away from TR2 is analyzed here as a cue about how the information search process should be negotiated between EMN and TR2, namely that EMN is not seeking TR2’s help in the process (see Goodwin & Goodwin 1986 regarding interlocutor participation in word searches during spoken interaction). EMN continues looking away as she begins further clarification with the sign BY. Then she shifts her gaze back to TR2 and explains that her work is by the new Kiwi grocery store. EMN then asks if TR2 knows the big hill there, after which TR2 responds with an interactional point of her own (see the bottom row in Figure 3), while nodding, confirming that she now follows what EMN is talking about.
It should be noted that the interactional point in this example also functioned as a pronominal (i.e., the second person pronoun ‘you,’ glossed as PT:INT/PRO2 in the middle row of Figure 3) and was interpreted as part of the question ‘you know…?’ Here, however, the additional functions of seeking information and holding the turn were profiled: the finger pointing effectively allowed EMN to hold her turn while she thought of what to say. These indexical, interactional meanings were co-expressed with referential meanings. In this way, the finger point contributed to the coordination and advancement of the emerging talk.
Throughout the study corpus, signers produced interactional finger points that indicated that they were following (n = 75) and, in some cases, agreeing with (n = 57) what another interlocutor was signing. These points contrast with those produced by a signer to seek following or agreement, which can be summarized as ‘you know?’ or ‘do you agree?’ Instead, finger points showing following or agreement tell another signer ‘ah, I see’ or ‘yes, I agree with what you are saying,’ and are thus examples of conversational feedback or backchanneling. Interactional finger pointing serving these functions was common in the data (n = 132, 28.3%), second only to turn-taking functions (presented in the next section).
Figure 4 shows examples of finger pointing as feedback that occurred in a conversation with three deaf interlocutors, although this sequence involves only two of the signers. ES (the woman on the left in the video frames in Figure 4) produces a series of points that show she is following the signing of ULA (the woman on the right in the video frames in Figure 4). The example comes after ES recounts her experience learning to vocalize different sounds, and how some sounds would cause a paper placed in front of her mouth to blow over, while others would not. The example begins as ES signs ‘I remember that the letter p [would cause paper to blow over],’ at the end of which she looks at ULA. ULA had tried to add a clarification to ES’s story before the example begins, and once she has ES’s gaze, ULA waves her hand and begins again, ‘hey, the letter d doesn’t make the paper blow down’ (top row of Figure 4). During ULA’s production of the sign QUIET (top row of Figure 4), ES begins to produce two interactional finger points towards ULA, coupled with two head nods and a mouthing, ahh, showing that she understands ULA’s qualification of her story (see the first and second images on the left in the middle row of Figure 4).
While ES produces these two finger points, ULA looks at her hands, fingerspells the letter t, and then gazes back to ES to explain that the letter t will cause a paper to blow over. ES responds with YES and another interactional finger point (see the rightmost image in the middle row of Figure 4), again showing that she is following these additional clarifications. This utterance is accompanied by a series of head nods that continue while ULA explains that the letter p will result in the same effect. ULA then contrasts this with the letter b, which will not blow paper down. ES catches this contrasting example and responds with an overlapping interactional point (see the bottom row in Figure 4) and a waving action with the palm facing ULA (to indicate negation). Throughout this sequence, ES indicates that she is following ULA’s explanation and that she now understands/agrees with which letters (sounds) go with which effect (also signaled through the frequent head nodding that accompanies this sequence). This example demonstrates how finger pointing as conversational feedback guides the trajectory of moves across the interaction.
In the study corpus, signers most frequently used interactional finger pointing to coordinate turn-taking (n = 211, 45.3%), a function that has been mentioned only briefly in the signed language linguistics literature (Baker 1977; Van Herreweghe 2002). Specific turn-regulating functions included giving turns to other signers, often in question contexts (n = 85), and taking turns, both in response to another signer giving a turn and through self-selection (n = 76). In addition, signers used finger pointing to indicate that the turn was open for someone to take (n = 7) or that the current signer simply wanted to pause their turn (n = 23), for example, to allow for a small insertion or comment from another interlocutor (similar to an example from spoken French described in Mondada 2007). Finally, in some contexts, signers used finger pointing to help guide the gaze and attention of other signers (n = 20). For example, it might happen that interlocutor A is looking at interlocutor B, while another interlocutor (C) is the one signing. Interlocutor B might then point to interlocutor C as a way to indicate to interlocutor A that they should shift their gaze to interlocutor C.
An example of using finger pointing to regulate turn-taking occurs in a conversation with three interlocutors (two deaf and one hearing), although this specific sequence only involves the two deaf signers. Prior to the start of the example, TVG (the deaf woman sitting on the left in the double video frame in Figure 5) has been talking about the layout of offices at her work. She explains that the building is a square shape, with the interior being an open space. The example starts as TVG explains that the office of TJ (the deaf man sitting in the middle of the double video frame in Figure 5), is located along the hallway that goes around the open area (see the top row in Figure 5, where TVG signs ‘between your office, which is on the other side of the hall’).
As she immediately goes on to depict where the rooms are placed in relation to this open area (top row in Figure 5, DS:ROOM), TJ produces an interactional finger point (PT:INT) followed by the sign ACTUALLY. In this way, TJ indexes what TVG is saying by pointing to her and her signing, while simultaneously self-selecting for a turn. However, TVG does not see this interruption, possibly because her gaze in that moment is engaged in indexing her own signing rather than being directed at TJ. Because he has not yet received TVG’s gaze, TJ produces a larger finger point, which approaches TVG’s signing space and peripheral vision (compare TJ’s finger point in the image in the top row with the one in the middle row of Figure 5). This interactional finger point is followed by a false start and then the sign CORRECT. Only then does TVG shift her gaze to look at TJ (at 00:36:20.2). TJ then points once again to TVG and her signing space to index the topic she has just been discussing, signs ACTUALLY, and then shifts his gaze in front of him while signing that ‘before it used to be three offices, I think’ (see the bottom row of Figure 5). This example demonstrates how a signer can use finger pointing to self-select for a turn, while also creating time and space for other interlocutors to redirect their gaze and attention. Once TJ had secured TVG’s attention with his interactional finger pointing, he was able to continue with what he wanted to add to the conversation.
The findings and examples presented above surveyed the various interactional meanings of finger pointing observed within a set of Norwegian Sign Language conversations. They are in many ways similar to the interactional meanings expressed by different types of manual and non-manual bodily actions in various spoken language contexts (as reviewed in Section 1.3). Even though the data analyzed for this study had much in common with previous research, the functions identified should not be considered exhaustive (nor should those described for spoken language interaction, for that matter). Signers may well express other types of interactional meanings through finger pointing or other bodily actions. These may include, for example, ‘moving aside topics,’ as described by Streeck (2009a), or ‘interrupting’ ongoing discourse, as observed by Kamunen (2018). It is hoped that future work on more spoken and signed language interaction in different contexts will reveal how communities of speakers and signers use bodily actions for interactional purposes and how language is shaped by this use.
An important consideration for the current study is how finger points and the meanings they prompt fit into a theory of (signed) language. Analysis showed that signers finger point for a range of referential and interactional functions. Many of these functions align with previous descriptions in the literature and support findings demonstrating the essential nature of finger pointing in signed language (Engberg-Pedersen 2003; Liddell 2003; Cormier et al. 2013; Johnston 2013). For example, pointing made up a total of 19.6% of all manual sign tokens examined in this study, and many of these points served the three main functions discussed in the literature: pronominal (55.6%), locative (17.3%), and determinative (2.8%). Points outlining paths were also very common in the data (13.8%).
However, it became clear through multiple parses of the data that signers engaged in conversation also used finger pointing for a number of other functions, which served to index and regulate aspects of the interaction itself, and which represented 8.3% of all the finger pointing in the dataset (based on dominant hand glosses). These interactional functions were the fourth most frequent function of finger pointing in the study corpus, after pronominal, locative, and path points. These figures indicate that signers frequently leverage the indexicality of finger pointing to coordinate emerging interaction in Norwegian Sign Language, and that these functions should be given more consideration in future studies of signed language pointing more generally. Their frequent use here also underscores the importance of investigating diverse text types, and the caution warranted in using, for example, narrative re-tellings to make generalizations about signed language use.
The analysis presented here also highlighted the multifunctional nature of finger pointing in Norwegian Sign Language. One example of this was provided in Figure 3, where a finger point served both interactional (seeking and turn holding) and referential (second person pronominal) functions. In another example (illustrated in Figure 2), a signer’s finger pointing toward his interlocutor acknowledged her previous comments while also marking the common ground that had emerged over the course of their conversation. Furthermore, the fact that signers can point to express both referential and interactional meanings suggests that other types of signed language actions may also serve multiple functions. Future research could consider how other types of signed language actions express interactional meanings, e.g., in line with the research on palm-up actions (Lepeut 2018; submitted), or how signs are timed and coordinated across turns (e.g., Groeber & Pochon-Berger 2014).
General theories of language have yet to fully integrate the contextual and interactional nature of language, even though these aspects align with cognitive-functional, usage-based linguistics. There is also specific work that directly addresses how such aspects of language use are essential to linguistic theory (e.g., Kress 1976; Silverstein 1976; Washabaugh 1981; Halliday 1985; Johnston 1992; Bavelas 1994; Couper-Kuhlen & Selting 1996; Langacker 2001). Prioritizing one function (propositional, referential) while dismissing others (interactional) cannot provide a comprehensive account of language and distorts the complexity of language as it is actually used by speakers and signers. Thus, future work could focus more on these “pragmatic” functions of language so that they can be fully explicated and integrated into theoretical thinking. The pointing investigated in the current study is but one example of this type of work.
This paper has reported on a study of finger pointing in Norwegian Sign Language conversations that serves to index aspects of the emergent interaction, and not just discourse referents. Findings from a corpus of Norwegian Sign Language showed that signers frequently point as a way to 1) deliver information (e.g., indicate common ground), 2) cite previous contributions to the interaction, 3) seek a response from an interlocutor, 4) coordinate turn-taking, and 5) provide conversational feedback. These functions have not been previously considered in linguistic studies on pointing in signed languages, which have focused primarily on referential functions. However, work on co-speech gesture in spoken language interaction has shown that speakers express similar interactional meanings through various manual indexical actions, such as manual pointing and other gestures, including palm-up gestures (see Section 1.3). While direct comparisons between the findings of this study and the wider literature are difficult due to differing styles of reporting on the form of co-speech gestures and the focus of analysis, it is clear that finger pointing and other manual indexical actions, such as palm-up actions, are used for interactional functions in both spoken and signed language communities in very similar ways (see Table 5). Indeed, a few recent studies that directly compare the speakers and signers of a community have observed many similarities across groups (e.g., Shaw 2019 for English and American Sign Language; Lepeut submitted for Belgian French and French Belgian Sign Language).
| Language | Source(s) | Interactional functions attested |
|---|---|---|
| English | Bavelas et al. 1992; Bavelas 1994; Bavelas et al. 1995; Holler 2009; Streeck 2009a; Healy 2012; Kääntä 2012; Shaw 2019 | ✓ ✓ ✓ ✓ ✓ |
| French | Mondada 2007; Lepeut submitted | ✓ ✓ ✓ ✓ |
| Norwegian | Sikveland & Ogden 2012 | ✓ |
| Mandarin Chinese | Li 2014 | ✓ |
| American Sign Language | Baker 1977; Shaw 2019 | ✓ ✓ ✓ ✓ |
| Flemish Sign Language | Van Herreweghe 2002 | ✓ |
| French Belgian Sign Language | Lepeut 2018; Gabarró-López 2020 | ✓ ✓ ✓ ✓ |
| Catalan Sign Language | Gabarró-López 2020 | ✓ ✓ ✓ |
| Swedish Sign Language | Ryttervik 2015 | ✓ ✓ |
| Norwegian Sign Language | current study | ✓ ✓ ✓ ✓ ✓ |
These interactional functions are as important to meaning-making as pointing to indicate referents and locations as part of propositional meanings. In addition, finger pointing in signed language interaction can express both types of meaning, sometimes simultaneously, which makes it untenable to draw distinctions between tokens that fall under the purview of linguistics and those that do not.
Investigating (multimodal) language in conversation provides an opportunity to further develop a theory of language that accommodates the multimodal semiotic diversity and complexity inherent in face-to-face interaction. This diversity and complexity concern, on the one hand, semiotic mode, namely the interplay between description, depiction, and indication (Peirce 1955; Clark 1996; Dingemanse 2013; Kendon 2014; Hodge & Ferrara 2018; Keevallik 2018). On the other hand, they entail different kinds of meaning, e.g., ideational, interpersonal, and textual (Halliday 1985); meaning exchange and presence manipulation (Washabaugh 1981); referential and pragmatic (Silverstein 1976). This study has contributed to this goal by providing a preliminary description of how signers are able to use finger pointing (a type of indication) not only to index discourse referents (referential meaning), but also to (sometimes simultaneously) index aspects of the conversation itself (interactional meaning).
The data, ELAN files, and supplementary materials used for this study are openly available via the Open Science Framework at https://osf.io/g8zv6/.
1Please note that in some of this research the term ‘gesture’ is used to refer to the various meaningful visible bodily actions, such as manual co-speech gestures, produced as part of spoken language utterances (à la Kendon 2004). When discussing such work, original terminology is maintained. However, the term ‘gesture’ is problematic in signed language research and often invokes connotations of a sign’s linguistic status (see Kendon 2008 for comment on the differential treatment of the term ‘gesture’ in signed language research and gesture studies). Thus, this paper will adopt alternative terminology throughout in order to avoid this conceptual baggage, e.g., bodily action, manual action, finger pointing, etc.
2Please note that in this article, the terms manual pointing and finger pointing are both used. Manual pointing is a more general term that includes all pointing done with the hand, which may be configured in various handshapes. Finger pointing is reserved for instances when all fingers of the hand are flexed except one, often the index finger but sometimes the thumb, which is extended. Of course, people can and do point with other bodily articulators (Enfield 2009; Cooperrider et al. 2018). However, in this study, finger pointing is in focus.
3While not attested in the current study’s data, giving turns can also result from imperative utterances or various types of commands. In some recent research on several signed languages, it was shown that these types of utterances can involve both manual indexing actions and particular non-manual actions (Brentari et al. 2018; Donati et al. 2017).
4It should be noted that conversations 1–8 involve both deaf and hearing signers. A hearing signer was recruited to facilitate the collection of these eight conversations. She is a near-native signer, has deaf parents, and is an active member of the deaf community. As initial data annotation focused on the signing practices of deaf signers (in relation to other projects), the hearing signer has not been annotated, and so any interactional pointing actions she produces are not included in this study (which explains why she is not listed as a participant in Table 1). However, in the sections of this paper where examples are provided, she is included and any relevant actions on her part are part of the analysis. Conversations 9–11 were facilitated by deaf signers, OIS and LMN, and their data and signing are included in the analysis reported here (their multiple contributions to the study corpus are italicized in Table 1).
5Only parts of conversations 1–8 have been tokenized for manual signs as part of previous projects. The total time annotated, and thus analyzed in the current study, for each of these conversations is supplied in parentheses. Conversations 9–11 are fully tokenized for manual signs.
6Please see http://www.lat-mpi.eu/tools/elan for more information regarding this free annotation software, developed by the Max Planck Institute for Psycholinguistics, The Language Archive, Nijmegen, The Netherlands.
7For training, the second annotator first read Bavelas (1994) and Table 2 here. Then they were given a random sample of 40 tokens to annotate and discuss with the main annotator. After this, they were given another 40 tokens (10% of the data) to tag independently.
8In order to simplify the figures, only active interlocutors are given lines in the transcript. RH and LH indicate signs produced on the right and left hands. Each of these tiers is associated with a signer by including their initials (e.g., RH-TVG indicates the signs TVG produces on her right hand). Translations in English are also provided. Images in the examples are connected to the ELAN timeline and transcript with a solid (red) line. This line shows when in the example the still shot was captured.
9In the preceding utterance, EMN’s right hand was used to produce a locative point. Her hand then relaxed and was lowered during the utterance shown in Figure 3. She holds this relaxed position until she signs NEAR. Thus, this locative point is not analyzed here as contributing to the meaning of the utterance in Figure 3. Instead, focus is on the pointing actions produced on the left hand.
In the figures, Norwegian Sign Language (manual) signs are represented through capitalized English glosses, as is customary in signed language research. Pointing signs are annotated with the prefix PT, depicting signs are annotated with the prefix DS, constructed actions (bodily enactment) are annotated with the prefix CA, and fingerspelling is annotated with the prefix FS.
I would like to thank all the Norwegian signers who shared their language by participating in this project. I also would like to thank Alysson Lepeut for her comments and discussion on an earlier draft of this paper. Finally, I thank the anonymous reviewers who graciously gave their time through constructive and critical comments. They helped me greatly to improve this paper. All remaining errors are my own.
The author has no competing interests to declare.
Abner, Natasha, Kensy Cooperrider & Susan Goldin-Meadow. 2015. Gesture for linguists: A handy primer. Language and Linguistics Compass 9(11). 437–451. DOI: https://doi.org/10.1111/lnc3.12168
Baker, Charlotte. 1977. Regulators and turn-taking in American Sign Language discourse. In Lynn A. Friedman (ed.), On the other hand: New perspectives on American Sign Language, 215–236. New York: Academic Press.
Bavelas, Janet B. 1990. Nonverbal and social aspects of discourse in face-to-face interaction. Text – Interdisciplinary Journal for the Study of Discourse 10(1–2). 5–8. DOI: https://doi.org/10.1515/text.1.1990.10.1-2.5
Bavelas, Janet B. 1994. Gestures as part of speech: Methodological implications. Research on Language & Social Interaction 27(3). 201–221. DOI: https://doi.org/10.1207/s15327973rlsi2703_3
Bavelas, Janet B., Nicole Chovil, Linda Coates & Lori Roe. 1995. Gestures specialized for discourse. Personality and Social Psychology Bulletin 21(4). 394–405. DOI: https://doi.org/10.1177/0146167295214010
Bavelas, Janet B., Nicole Chovil, Douglas A. Lawrie & Allan Wade. 1992. Interactive gestures. Discourse Processes 15(4). 469–489. DOI: https://doi.org/10.1080/01638539209544823
Brentari, Diane, Joshua Falk, Anastasia Giannakidou, Annika Herrmann, Elisabeth Volk & Markus Steinbach. 2018. Production and comprehension of prosodic markers in sign language imperatives. Frontiers in Psychology 9. 770. DOI: https://doi.org/10.3389/fpsyg.2018.00770
Clark, Herbert H. & Susan E. Brennan. 1991. Grounding in communication. In Lauren B. Resnick, John M. Levine & Stephanie D. Teasley (eds.), Perspectives on socially shared cognition, 127–149. Washington, DC: American Psychological Association. DOI: https://doi.org/10.1037/10096-006
Clark, Herbert H. & Deanne Wilkes-Gibbs. 1986. Referring as a collaborative process. Cognition 22. 1–39. DOI: https://doi.org/10.1016/0010-0277(86)90010-7
Coates, Linda & Rachel Sutton-Spence. 2001. Turn-taking patterns in Deaf conversation. Journal of Sociolinguistics 5(4). 507–529. DOI: https://doi.org/10.1111/1467-9481.00162
Cooperrider, Kensy. 2016. The co-organization of demonstratives and pointing gestures. Discourse Processes 53(8). 632–656. DOI: https://doi.org/10.1080/0163853X.2015.1094280
Cooperrider, Kensy, Natasha Abner & Susan Goldin-Meadow. 2018a. The palm-up puzzle: Meanings and origins of a widespread form in gesture and sign. Frontiers in Communication 3. 23. DOI: https://doi.org/10.3389/fcomm.2018.00023
Cooperrider, Kensy, James Slotta & Rafael Núñez. 2018b. The preference for pointing with the hand is not universal. Cognitive Science 42. 1375–1390. DOI: https://doi.org/10.1111/cogs.12585
Cormier, Kearsy, Adam Schembri & Bencie Woll. 2013. Pronouns and pointing in sign languages. Lingua 137. 230–247. DOI: https://doi.org/10.1016/j.lingua.2013.09.010
Couper-Kuhlen, Elizabeth & Margret Selting. 1996. Towards an interactional perspective on prosody and a prosodic perspective on interaction. In Elizabeth Couper-Kuhlen & Margret Selting (eds.), Prosody in conversation, 11–56. Cambridge: Cambridge University Press. DOI: https://doi.org/10.1017/CBO9780511597862.003
Crasborn, Onno & Han Sloetjes. 2008. Enhanced ELAN functionality for sign language corpora. In Onno Crasborn, Eleni Efthimiou, Thomas Hanke, Ernst D. Thoutenhoofd & Inge Zwitserlood (eds.), The third workshop on the representation and processing of sign languages: Construction and exploitation of sign language corpora (a workshop given at the Sixth International Conference on Language Resources and Evaluation, 26 May–1 June 2008, Marrakech, Morocco), 39–43. Paris: European Language Resources Association.
Deppermann, Arnulf. 2013. Turn-design at turn-beginnings: Multimodal resources to deal with tasks of turn-construction in German. Journal of Pragmatics 46(1). 91–121. DOI: https://doi.org/10.1016/j.pragma.2012.07.010
de Vos, Connie, Francisco Torreira & Stephen C. Levinson. 2015. Turn-timing in signed conversations: Coordinating stroke-to-stroke turn boundaries. Frontiers in Psychology 6. 268. DOI: https://doi.org/10.3389/fpsyg.2015.00268
Dingemanse, Mark. 2013. Ideophones and gesture in everyday speech. Gesture 13(2). 143–165. DOI: https://doi.org/10.1075/gest.13.2.02din
Dingemanse, Mark. 2017. On the margins of language: Ideophones, interjections and dependencies in linguistic theory. In Nick J. Enfield (ed.), Dependencies in language, 195–203. Berlin: Language Science Press.
Donati, Caterina, Gemma Barberà, Chiara Branchini, Carlo Cecchetto, Carlo Geraci & Josep Quer. 2017. Searching for imperatives in European sign languages. In Daniël Van Olmen & Simone Heinold (eds.), Imperatives and directive strategies, 111–155. Amsterdam: John Benjamins. DOI: https://doi.org/10.1075/slcs.184.04don
Enfield, Nick J. 2009. The anatomy of meaning: Speech, gesture, and composite utterances. Cambridge: Cambridge University Press. DOI: https://doi.org/10.1017/CBO9780511576737
Enfield, Nick J., Sotaro Kita & Jan De Ruiter. 2007. Primary and secondary pragmatic functions of pointing gestures. Journal of Pragmatics 39(10). 1722–1741. DOI: https://doi.org/10.1016/j.pragma.2007.03.001
Engberg-Pedersen, Elisabeth. 2003. From pointing to reference and predication: pointing signs, eyegaze, and head and body orientation in Danish Sign Language. In Sotaro Kita (ed.), Pointing: Where language, culture and cognition meet, 269–292. Mahwah, NJ: Lawrence Erlbaum Associates.
Ferrara, Lindsay & Gabrielle Hodge. 2018. Language as description, indication, and depiction. Frontiers in Psychology 9. 716. DOI: https://doi.org/10.3389/fpsyg.2018.00716
Ferrara, Lindsay & Torill Ringsø. 2017–2018 (collection date). Depicting Perspective in Norwegian Sign Language. Norwegian University of Science and Technology. Unpublished video recordings and annotation files.
Gabarró-López, Sílvia. 2020. Are discourse markers related to age and educational background? A comparative account between two languages. Journal of Pragmatics 156. 68–82. DOI: https://doi.org/10.1016/j.pragma.2018.12.019
Ginzburg, Jonathan & Massimo Poesio. 2016. Grammar is a system that characterizes talk in interaction. Frontiers in Psychology 7. 1938. DOI: https://doi.org/10.3389/fpsyg.2016.01938
Girard-Groeber, Simone. 2015. The management of turn transition in signed interaction through the lens of overlaps. Frontiers in Psychology 6. 741. DOI: https://doi.org/10.3389/fpsyg.2015.00741
Goodwin, Charles. 1986. Gestures as a resource for the organization of mutual orientation. Semiotica 62(1–2). 29–49. DOI: https://doi.org/10.1515/semi.1986.62.1-2.29
Goodwin, Marjorie Harness & Charles Goodwin. 1986. Gesture and coparticipation in the activity of searching for a word. Semiotica 62(1–2). 51–75. DOI: https://doi.org/10.1515/semi.1986.62.1-2.51
Groeber, Simone & Evelyne Pochon-Berger. 2014. Turns and turn-taking in sign language interaction: A study of turn-final holds. Journal of Pragmatics 65. 121–136. DOI: https://doi.org/10.1016/j.pragma.2013.08.012
Hanks, William. 1992. The indexical ground of deictic reference. In Alessandro Duranti & Charles Goodwin (eds.), Rethinking context: Language as an interactive phenomenon, 43–76. Cambridge: Cambridge University Press.
Hayashi, Makoto. 2005. Joint turn construction through language and the body: Notes on embodiment in coordinated participation in situated activities. Semiotica 156(1–4). 21–53. DOI: https://doi.org/10.1515/semi.2005.2005.156.21
Healy, Christina. 2012. Pointing to show agreement. Semiotica 192. 175–196. DOI: https://doi.org/10.1515/sem-2012-0073
Heath, Christian. 2004. Analysing face-to-face interaction: Video, the visual and material. In David Silverman (ed.), Qualitative research: Theory, method and practice, 266–282. London: Sage Publications.
Holler, Judith. 2009. Speakers’ use of interactive gestures as markers of common ground. In Stefan Kopp & Ipke Wachsmuth (eds.), Gesture in embodied communication and human-computer interaction: GW 2009 (Lecture notes in computer science), 11–22. Berlin: Springer Berlin Heidelberg. DOI: https://doi.org/10.1007/978-3-642-12553-9
Holler, Judith, Kylie Turner & Trudy Varcianna. 2013. It’s on the tip of my fingers: Co-speech gestures during lexical retrieval in different social contexts. Language and Cognitive Processes 28(10). 1509–1518. DOI: https://doi.org/10.1080/01690965.2012.698289
Iwasaki, Shimako. 2009. Initiating interactive turn spaces in Japanese conversation: Local projection and collaborative action. Discourse Processes 46(2–3). 226–246. DOI: https://doi.org/10.1080/01638530902728918
Johnston, Trevor. 1992. The realization of the linguistic metafunctions in a sign language. Language Sciences 14(4). 317–353. DOI: https://doi.org/10.1016/0388-0001(92)90021-6
Johnston, Trevor. 2008. The Auslan Archive and Corpus. In David Nathan (ed.), The Endangered Languages Archive. London: Hans Rausing Endangered Languages Documentation Project, School of Oriental and African Studies, University of London. http://elar.soas.ac.uk/languages.
Johnston, Trevor. 2010. From archive to corpus: Transcription and annotation in the creation of signed language corpora. International Journal of Corpus Linguistics 15(1). 104–129. DOI: https://doi.org/10.1075/ijcl.15.1.05joh
Johnston, Trevor. 2013. Formational and functional characteristics of pointing signs in a corpus of Auslan (Australian sign language): Are the data sufficient to posit a grammatical class of ‘pronouns’ in Auslan? Corpus Linguistics and Linguistic Theory 9(1). 109–159. DOI: https://doi.org/10.1515/cllt-2013-0012
Johnston, Trevor. 2016. Auslan corpus annotation guidelines. Manuscript. Sydney: Macquarie University. Retrieved from http://auslan.org.au/about/annotations/.
Jokinen, Kristiina. 2009. Pointing gestures and synchronous communication management. In Anna Esposito, Nick Campbell, Carl Vogel, Amir Hussain & Anton Nijholt (eds.), Development of multimodal interfaces: Active listening and synchrony, 33–49. Berlin: Springer Berlin Heidelberg. DOI: https://doi.org/10.1007/978-3-642-12397-9_3
Kääntä, Leila. 2012. Teachers’ embodied allocations in instructional interaction. Classroom Discourse 3(2). 166–186. DOI: https://doi.org/10.1080/19463014.2012.716624
Kamunen, Antti. 2018. Open hand prone as a resource in multimodal claims to interruption: Stopping a co-participant’s turn-at-talk. Gesture 17(2). 291–321. DOI: https://doi.org/10.1075/gest.17002.kam
Keevallik, Leelo. 2014. Turn organization and bodily-vocal demonstrations. Journal of Pragmatics 65. 103–120. DOI: https://doi.org/10.1016/j.pragma.2014.01.008
Keevallik, Leelo. 2018. What does embodied interaction tell us about grammar? Research on Language & Social Interaction 51(1). 1–21. DOI: https://doi.org/10.1080/08351813.2018.1413887
Kendon, Adam. 1995. Gestures as illocutionary and discourse structure markers in Southern Italian conversation. Journal of Pragmatics 23(3). 247–279. DOI: https://doi.org/10.1016/0378-2166(94)00037-F
Kendon, Adam. 2004. Gesture: Visible action as utterance. Cambridge: Cambridge University Press. DOI: https://doi.org/10.1017/CBO9780511807572
Kendon, Adam. 2008. Some reflections on the relationship between ‘gesture’ and ‘sign’. Gesture 8(3). 348–366. DOI: https://doi.org/10.1075/gest.8.3.05ken
Kendon, Adam. 2014. Semiotic diversity in utterance production and the concept of ‘language’. Philosophical Transactions of the Royal Society B 369. 20130293. DOI: https://doi.org/10.1098/rstb.2013.0293
Kendon, Adam. 2017. Pragmatic functions of gestures: Some observations on the history of their study and their nature. Gesture 16(2). 157–175. DOI: https://doi.org/10.1075/gest.16.2.01ken
Langacker, Ronald W. 2001. Discourse in cognitive grammar. Cognitive Linguistics 12(2). 143–188. DOI: https://doi.org/10.1515/cogl.12.2.143
Lepeut, Alysson. 2018. Gesture and sign on common ground: the social functions of manual and gaze behavior in older Belgian French (BF) speakers and French Belgian Sign Language (LSFB) signers’ interactions. Paper presented at the 8th Conference of the International Society for Gesture Studies (ISGS): Gesture and Diversity, 4–8 July 2018, Cape Town, South Africa.
Li, Xiaoting. 2014. Multimodality, interaction and turn-taking in Mandarin Chinese. Amsterdam: John Benjamins. DOI: https://doi.org/10.1075/scld.3
Liddell, Scott K. 2003. Grammar, gesture, and meaning in American Sign Language. New York: Cambridge University Press. DOI: https://doi.org/10.1017/CBO9780511615054
McCleary, Leland & Tarcísio de Arantes Leite. 2013. Turn-taking in Brazilian Sign Language: Evidence from overlap. Journal of Interactional Research in Communication Disorders 4(1). 123–154. DOI: https://doi.org/10.1558/jircd.v4i1.123
McEnery, Tony & Andrew Hardie. 2012. Corpus linguistics: Method, theory and practice. Cambridge: Cambridge University Press. DOI: https://doi.org/10.1017/CBO9780511981395
McIlvenny, Paul. 1995. Seeing conversations: Analyzing sign language talk. In Paul Ten Have & George Psathas (eds.), Situated order: Studies in the social organization of talk and embodied activities, 129–150. Washington, DC: International Institute for Ethnomethodology and Conversation Analysis & University Press of America.
Meier, Richard & Diane Lillo-Martin. 2013. The points of language. Humanamente 6(24). 151–176. Retrieved from http://www.humanamente.eu/index.php/HM/article/view/154.
Mesch, Johanna. 2016. Manual backchannel responses in signers’ conversations in Swedish Sign Language. Language and Communication 50. 22–41. DOI: https://doi.org/10.1016/j.langcom.2016.08.011
Mondada, Lorenza. 2007. Multimodal resources for turn-taking: Pointing and the emergence of possible next speakers. Discourse Studies 9(2). 194–225. DOI: https://doi.org/10.1177/1461445607075346
Mondada, Lorenza. 2013. Embodied and spatial resources for turn-taking in institutional multi-party interactions: Participatory democracy debates. Journal of Pragmatics 46. 39–68. DOI: https://doi.org/10.1016/j.pragma.2012.03.010
Mondada, Lorenza. 2014. The local constitution of multimodal resources for social interaction. Journal of Pragmatics 65. 137–156. DOI: https://doi.org/10.1016/j.pragma.2014.04.004
Müller, Cornelia. 2004. Forms and uses of the Palm Up Open Hand: A case of a gesture family? In Cornelia Müller & Roland Posner (eds.), The semantics and pragmatics of everyday gestures: The proceedings of the Berlin conference April 1998, 233–256. Berlin: WEIDLER Buchverlag.
Neumann, Ragnhild. 2004. The conventionalization of the ring gesture in German discourse. In Cornelia Müller & Roland Posner (eds.), The semantics and pragmatics of everyday gestures: The proceedings of the Berlin conference April 1998, 217–224. Berlin: WEIDLER Buchverlag.
Nilsson, Anna-Lena. 2004. Form and discourse function of the pointing toward the chest in Swedish Sign Language. Sign Language and Linguistics 7(1). 3–30. DOI: https://doi.org/10.1075/sll.7.1.03nil
Nordlund, Sanna. 2019. Agent defocusing in two-participant clauses in Finnish Sign Language. Glossa: A journal of general linguistics 4(1). 1–27. DOI: https://doi.org/10.5334/gjgl.801
Parrill, Fey. 2008. Form, meaning, and convention: A comparison of a metaphoric gesture with an emblem. In Alan Cienki & Cornelia Müller (eds.), Metaphor and gesture, 195–217. Amsterdam: John Benjamins. DOI: https://doi.org/10.1075/gs.3.11par
Ryttervik, Magnus. 2015. Gesten PU i svenskt teckenspråk: En studie i dess form och funktion [The PU gesture in Swedish Sign Language: A study of its form and function]. Stockholm: Stockholm University Master’s thesis. Retrieved from https://www.diva-portal.org/smash/get/diva2:862407/FULLTEXT02.pdf.
Sacks, Harvey, Emanuel A. Schegloff & Gail Jefferson. 1974. A simplest systematics for the organization of turn-taking for conversation. Language 50(4). 696–735. DOI: https://doi.org/10.2307/412243
Schegloff, Emanuel A. 1984. On some gestures’ relation to talk. In J. Maxwell Atkinson & John Heritage (eds.), Structures of social action: Studies in conversation analysis, 266–295. Cambridge: Cambridge University Press. DOI: https://doi.org/10.1017/CBO9780511665868.018
Seyfeddinipur, Mandana. 2004. Meta-discursive gestures from Iran: Some uses of the ‘pistol hand’. In Cornelia Müller & Roland Posner (eds.), The semantics and pragmatics of everyday gestures: The proceedings of the Berlin conference April 1998, 205–216. Berlin: WEIDLER Buchverlag.
Sikveland, Rein Ove & Richard Ogden. 2012. Holding gestures across turns: Moments to generate shared understanding. Gesture 12(2). 166–199. DOI: https://doi.org/10.1075/gest.12.2.03sik
Silverstein, Michael. 1976. Shifters, linguistic categories, and cultural description. In Keith H. Basso & Henry A. Selby (eds.), Meaning in anthropology, 11–55. Albuquerque, NM: University of New Mexico Press.
Streeck, Jürgen. 2009a. Gesturecraft: The manu-facture of meaning. Amsterdam: John Benjamins. DOI: https://doi.org/10.1075/gs.2
Streeck, Jürgen. 2009b. Forward-gesturing. Discourse Processes 46(2–3). 161–179. DOI: https://doi.org/10.1080/01638530902728793
Streeck, Jürgen & Ulrike Hartge. 1992. Previews: Gestures at the transition place. In Aldo Di Luzio & Peter Auer (eds.), The contextualization of language, 135–157. Amsterdam: John Benjamins. DOI: https://doi.org/10.1075/pbns.22.10str
Van Herreweghe, Mieke. 2002. Turn-taking mechanisms and active participation in meetings with deaf and hearing participants in Flanders. In Ceil Lucas (ed.), Turn-taking, fingerspelling, and contact in signed languages, 73–103. Washington, DC: Gallaudet University Press.
Washabaugh, William. 1981. Sign language in its social context. Annual Review of Anthropology 10. 237–252. DOI: https://doi.org/10.1146/annurev.an.10.100181.001321
Wilkes-Gibbs, Deanne. 1997. Studying language use as collaboration. In Gabriele Kasper & Eric Kellerman (eds.), Communication strategies: Psycholinguistic and sociolinguistic perspectives, 238–274. London/New York: Addison Wesley Longman Limited.
Wittenburg, Peter, Hennie Brugman, Albert Russel, Alex Klassmann & Hans Sloetjes. 2006. ELAN: A professional framework for multimodality research. In Proceedings of the fifth international conference on language resources and evaluation (LREC 2006), 1556–1559. http://hdl.handle.net/11858/00-001M-0000-0013-1E7E-4.