Dancing around the subject with robots: ethical communication as a “triple audiovisual reality”
Eleanor Sandry, The University of Western Australia
Communication is often thought of as a bridge between self and other, supported by what they have in common, and pursued with the aim of further developing this commonality. However, theorists such as John Durham Peters and Amit Pinchevski argue that this conception, connected as it is with the need to resolve and remove difference, is inherently ‘violent’ to the other and therefore unethical. To encourage ethical communication, they suggest that theory should instead support acts of communication for which the differences between self and other are not only retained, but also valued for the possibilities they offer. As a means of moving towards a more ethical stance, this paper stresses the importance of understanding communication as more than the transmission of information in spoken and written language. In particular, it draws on Fernando Poyatos’ research into simultaneous translation, which suggests that communication is a “triple audiovisual reality” consisting of language, paralanguage and kinesics. This perspective is then extended by considering the way in which Alan Fogel’s dynamic systems model also stresses the place of nonverbal signs. The paper explores and illustrates these theories by considering human-robot interactions because analysis of such interactions, with both humanoid and non-humanoid robots, helps to draw out the importance of paralanguage and kinesics as elements of communication. The human-robot encounters discussed here also highlight the way in which these theories position both reason and emotion as valuable in communication. The resulting argument – that communication occurs as a dynamic process, relying on a triple audiovisual reality drawn from both reason and emotion – supports a theoretical position that values difference, rather than promoting commonality as a requirement for successful communicative events. 
In conclusion, the paper suggests that this theory can form a basis for ethical communication between all kinds of selves and others, not just between humans and robots.
Successful communication is often thought of as dependent upon the transmission or exchange of information and/or the construction of, and a dependence upon, shared social understandings between individuals (Carey, 1992; Chang, 1996; Peters, 1999; Pinchevski, 2005). Described in these terms, communication comes to be understood as a bridge between interlocutors: a bridge that is not only founded on their commonalities, but that also seeks to develop those commonalities further. However, a number of theorists, including John Durham Peters and Amit Pinchevski, have argued that this reliance on resolving or eliminating the differences between communicators makes these conceptions of communication inherently 'violent' to the other and therefore unethical (Peters, 1999; Pinchevski, 2005). This paper, following Peters' and Pinchevski's lead, acknowledges ethical communication as an important goal for which to strive, and develops an understanding of communication that reassesses the difference between self and other as a valuable part of communicative processes. Its argument is supported by an analysis of the communication that occurs between humans and various forms of robot, both in fiction and in real life.
Since the term 'robot' was first used to refer to the artificial humans in the 1920 play R. U. R. (Rossum's Universal Robots), written by Karel Čapek, the figure of the robot has provided a complex site within which many different perspectives about reason and emotion are juxtaposed. In addition, robots in both fiction and fact are often created with human-robot interactions in mind, and while in many cases the robots taking part in these interactions are humanlike in form, there are also examples in which communication occurs with robots whose difference is overtly represented. In choosing to write about communication with machines, animals and aliens, Peters suggests that "[b]y exploring our strangest partners" it is possible "to illuminate the strangeness that occurs in the most familiar settings" (1999, p. 231). While this idea is at the heart of this paper, what makes a consideration of the communication occurring between humans and robots particularly productive is the way in which robots are positioned on some occasions as familiar and humanlike, but on other occasions as strange, non-humanoid partners. An analysis of human-robot interactions can therefore be used to interrogate the placement of commonality and sameness, as well as to highlight the importance of difference and otherness, in communication and communication theory.
In this paper, two fictional robot examples and one real-life example are analysed as a means to draw out conceptions of communication theory. The first example involves a humanoid robot and illustrates the effects of the pursuit of commonality as a basis for communication, as well as a particular way of positioning reason and emotion. The subsequent examples of interactions with non-humanoid robots allow a broader exploration of the complex relationship between communication, emotion and reason. In particular, the value of nonverbal aspects of communication becomes clearly visible in these exchanges, as verbal language becomes less of a focus. Non-humanoid robots therefore draw attention to the opportunities offered by understanding communication in terms of a triple structure, as suggested by Fernando Poyatos (1983). Considering the overt otherness of communicative non-humanoid robots supports the argument that, whatever their form, the bodily expressions of communicators are valuable and should not be overlooked as an important aspect of their communication. Although this paper considers human-robot communication, its argument is not primarily concerned with the need to develop an ethical response to robots per se. Instead, it draws on examples of human interactions with robots - in which difference is sometimes less, and sometimes more, overtly represented - to suggest alternative ways of thinking about attending to reason and emotion, mind and body, as important in promoting ethical communicative encounters between all different kinds of interlocutors.
Humanoid robots, commonality and communication as a bridge
My first example is that of Lieutenant Commander Data, an officer on the starship Enterprise, who appears in the Star Trek: The Next Generation television series and feature films. Data is a fully autonomous humanoid machine: he reasons and makes decisions for himself in real time, and is therefore able to work alone or as part of a team in the same way as any other member of the crew. As his name suggests, he would seem best regarded as a technologically advanced, embodied computer system. He is very nearly human in appearance, although he has some physical anomalies, such as his rather unusual eye colour and the metallic sheen of his skin. In addition, there are some quirks in his behaviour that set him apart; for example, he rarely uses verbal contractions, saying "is not" instead of "isn't". Most notably, Data is described as not having humanlike feelings, and therefore as not easily expressing humanlike emotions. This means that his face retains a rather blank expression most of the time, although, as I discuss below, this is not always the case. In addition, Data is often unusually precise in what he says and shows a particular concern to provide detailed and accurate information. He uses a systematic approach to his interactions with humans, following turn-taking conventions very carefully to avoid interrupting others. Nevertheless, he is generally portrayed as communicating and interacting in very human ways with other members of the crew. It follows from this description that possibly the most appropriate communication theories with which to consider a machine like Data are particular readings of the cybernetic and semiotic traditions: those that focus on the precise coding of information using language, together with the accurate transfer or exchange of that coded information from one person to another.
The cybernetic tradition has its foundations in the work of scholars such as Claude Shannon and Warren Weaver (1948), Norbert Wiener (1948) and Alan Turing (1950), being drawn out of, and feeding back into, research into systems and information science, artificial intelligence and cognitive theory. The driving philosophical assumptions behind this tradition are concerned with materialism, rationalism and functionalism, the result being theory that regards all communication in terms of information processing and exchange within systems (Craig, 1999). Of course, Data's construction as an embodied computer system only serves to link his design with cybernetic theory, and with what is known as the cybernetic tradition of communication, even more closely. As Robert T. Craig notes, the cybernetic tradition shares some "common ground" with other communication traditions, and in particular he highlights the similarity between the cybernetic and the semiotic traditions (1999, p. 142). For Craig, this similarity is based in the way that semiotics collapses "human agency into underlying or overarching symbol-processing systems" (1999, p. 142). A similar link, between semiotic communication as a means to "purge semantic dissonance", and cybernetic communication as a process of "information exchange", is also made by Peters (1999, pp. 12 & 24). However, this paper suggests that if one is to explain human communication as a process of information transfer or exchange, then it is useful to interweave semiotic and cybernetic theories of communication even more closely than either Craig or Peters does. Semiotic theory describes communication in terms of the use of particular signs whose meaning is shared between interlocutors. It can therefore be argued that such intersubjective understandings of signs provide the means by which information is coded and decoded, and therefore accurately conveyed, in cybernetic models of the process of information exchange.
In discussing human communication, and also communication with Data as a sophisticated humanlike machine, it is therefore suggested that a combined cybernetic-semiotic theory can be seen at work.
Data is able to accurately collate, analyse and disseminate many different kinds of information in familiar humanlike ways, and this would seem to make him a near perfect human-computer interface from the perspective of this cybernetic-semiotic model of communication. It can be suggested that Data's precision, and his tendency to rely on reason as opposed to any emotional content, produce human language that has to an extent been 'machine coded'. This is also in keeping with the way that cybernetic-semiotic theory overlooks the possibilities of ambiguity or undecidability in human language, considering these to be forms of noise (Chang, 1996; Hayles, 2005). However, Data's abilities still lead Craig to draw the conclusion that, from the perspective of the cybernetic tradition, Data "might be truly the most 'human' member of the Enterprise crew" (1999, p. 141). Craig's comment emphasises the way that cybernetic, and therefore also cybernetic-semiotic, theories of communication are able to regard Data as "most 'human'", even in the absence of humanlike emotional expression (1999, p. 141). Thinking of Data in this way therefore validates the importance of reason, mind and language over emotion, face and body in humans and human communication.
Of course, what is missing from this analysis of Data, and Data's communication, is the fact that he does show emotional expressions in many of the series' episodes and films. It might be argued that Data's occasional use of expressions simply shows how difficult it is for a human actor to appear consistently as an emotionless robot. However, the way in which the camera focuses on Data's face on many different occasions indicates that, while he is understood not to have feelings, he is regarded as perfectly capable of learning the appropriate human emotional responses to particular situations. Therefore, although Data relies on his cybernetic-semiotic precision to communicate much of the time, he is also depicted as needing to be able to express some emotion in order to 'connect' with the rest of the crew as individuals and as a social group. These ideas draw on other areas of communication theory that are more open to the value of feeling and emotional expression, in particular the sociopsychological and sociocultural traditions.
Craig describes the sociopsychological tradition as theorising communication "as a process of expression, interaction and influence" (1999, p. 143). This theory therefore positions expression, not only semiotic but also emotional expression, as an important part of interpersonal communication. From this perspective, the semiotic precision of the message and the reduction of extraneous noise are no longer enough to ensure successful communication. Instead, in the sociopsychological tradition the success of communication also depends on the message's influence on the receiver in light of their predispositions. Therefore even Data, whose communication is focused on cybernetic-semiotic excellence, demonstrates some ability to read social cues from others, although, as discussed above, his expressions in reaction to situations are often rather limited.
The sociocultural tradition broadly describes communication as a "symbolic process through which reality is produced, maintained, repaired, and transformed" (Carey, 1992, p. 23). Sociocultural communication is regarded both as depending upon, and as a means to maintain, the existing order of a social and cultural environment. As a part of his 'life' on the Enterprise, Data's experiences and thoughts about emotions and feelings are often explored by the various storylines. At times his lack of emotion is highly valued; on other occasions, however, it is Data's attempts to understand and to express human emotion that help him to be more easily accepted as part of the community onboard the starship. For example, Captain Picard clearly values Data's unemotional nature when he remarks: "I only wish we were all so well balanced" (Roddenberry, 1988). However, storylines such as that concerning Data's attempts to tell jokes emphasise the ways in which such efforts help to bring him closer to those around him (Roddenberry, 1987).
Peters suggests that it was broadly in the late nineteenth century that communication acquired "its grandeur and pathos as a concept", with the coining of the words "solipsism" in 1874 and "telepathy" in 1882 (1999, p. 5). As he notes, both terms "reflect an individualist culture in which the walls surrounding the mind were a problem" (1999, p. 5). For solipsism the walls were "terrifyingly impermeable", whereas for telepathy they were "blissfully thin" (Peters, 1999, p. 5). Since then, communication has been characterised in various forms, but most share the aim of somehow bridging the chasm between self and other. All of the theories discussed above suggest that commonality offers the means to create this bridge. Thus, while the question of Data's humanness is complex, and his form and behaviour do at times only serve to stress his difference from the humans around him, more often emphasis is placed on the need for him to be as humanlike as possible. Interactions with Data therefore illustrate the idea of communication relying on commonality, whether that commonality is based in precision language, in accuracy of transmission, in mimicking expression and being part of a social system, or in the idea that reason should be used to influence others with the aim of achieving agreement. However, Peters offers a strong critique of "the dream of communication as the mutual communion of souls" and the "pervasive sense that communication is always breaking down" (1999, p. 1). In particular, he argues that if communication "is taken as the reduplication of the self (or its thoughts) in the other", an idea directly illustrated in the creation of Data, then "it deserves to crash, for such an understanding is in essence a pogrom", an organised massacre, "against the distinctness of human beings" or indeed, as I'll argue below, against the distinctness of any form of being (1999, p. 21).
Instead, Peters suggests that a better course might be to "renounce the dream of communication while retaining the goods it invokes", with the aim of finding "an account of communication that erases neither the curious fact of otherness at its core nor the possibility of doing things with words" (1999, p. 21).
Peters' approach to developing ethical communication is to set aside the need for reciprocity, and therefore the valorisation of ideal communication as dialogues based in "the capacity to communicate on an even footing" (1999, p. 62). Instead, he revisits the concept of communication as dissemination, since it offers the chance to "meet others with some fairness and kindness" (1999, p. 62). However, while he clarifies that "[o]pen scatter is more fundamental than coupled sharing", Peters nonetheless regards it as "the stuff from which, on rare splendid occasions, dialogue may arise", a comment which suggests that dialogue retains its position as an exceptional form of communication (1999, p. 62). Peters is clearly sensitive to the idea of valuing difference between interlocutors, but he does not conclude that communication is impossible: that interlocutors are so divided by their differences that they cannot come to understand one another at all. Instead, by embracing a pragmatist approach drawn in the main from John Dewey, he is encouraged to characterise communication as "the project of reconciling self and other" (1999, p. 9).
While this is a positive way to proceed in considering communication and difference, Peters' formulation would seem to be undermined somewhat by the ease with which dissemination can be understood to favour the most powerful voice, as well as the relatively fine line between "reconciling self and other" and the "mutual communion of souls" (1999, pp. 9 & 1). In response to the need to encourage a positive appraisal of difference between interlocutors, this paper argues that, rather than just trying to do "things with words", it is also vital to accept a broader sense of communication for which nonverbal as well as verbal signs are valuable. In particular, it is useful to draw on Poyatos' conception of the "triple audiovisual reality" of communication, "what we say - how we say it - and how we move what we say" (1997, p. 249). For Poyatos, a researcher into the subtleties of simultaneous translation, communication consists of: verbal language, speech itself; paralanguage, the non-verbal voice qualities, modifiers and sounds used to support meaning; and kinesics, the body language of face, eye and hand movements, together with overall body movements, postures and manners (1997, p. 249). This paper therefore combines an attention to the triple structure of communication, offered by Poyatos' theory, with a consideration of communicative acts as moments of encounter between selves and others during which difference is seen as an integral part of communication, as opposed to a problem to be overcome. An understanding of communication as this kind of encounter, or event, is associated with the phenomenological tradition, which is described by Craig as being concerned with the "experience of otherness" in authentic relationships "founded on the experience of direct, unmediated contact with others" (1999, p. 138).
From a phenomenological perspective, recognising the importance of difference, or otherness as it is termed in many phenomenological theories, as well as the presence of bodies, is a key part of ethical communication. It is with this attentiveness to otherness and the presence of bodies, along with an understanding of the triple structure of communication, that the second example of human encounters with robots, robots radically different from Data, is introduced.
Non-humanoid robots, difference and the dynamic 'dance' of communication
The Fish-Bird project is the result of a long-term collaboration between the artist Mari Velonaki and roboticists at the Centre for Social Robotics in Sydney. This robotic art installation consists of two robots in the form of wheelchairs, which interact with one another and also with people who enter the installation space.
Fig. 1: The Fish-Bird project
Each wheelchair has been given an individual personality, based on two characters from a Greek myth, Fish and Bird, and in keeping with their ancient namesakes, the modern-day Fish and Bird have fallen in love, "but cannot be together due to 'technical' difficulties" (Velonaki, 2010, p. 3). The robots have various patterns of behaviour that control their movements. They also develop specific "artificial 'emotional' states" in response to their perceptions of the movements of the other robot and of humans that enter the installation space (Fish-Bird: Background, 2006, no page). A large part of their communication can therefore be understood through the term kinesics. In addition, each robot prints notes with a thermal printer using a different handwriting style, and a robot's emotional state also affects which texts are chosen for printing. The fragments of text are taken from donated love-letters, the works of the poet Anna Akhmatova and a text written by Velonaki herself. Their use of text means that these robots also, to an extent, use language.
When they are together without visitors the robots take part in a constant stream of interaction with one another, indicated both by their movements, and their choice of the more personal texts. When a visitor enters their installation space the interaction between Fish and Bird is interrupted and the robots turn to 'face' the person that has entered. Visitors are therefore understood to "disturb the intimacy of the two characters", and in this initial moment of encounter people become aware that the robots are "responding to them in real time, rather than moving in a repetitive automatic manner" (Velonaki, 2010, pp. 3 & 5). The robots cease to exchange personal messages and start to talk about trivialities such as the weather, in what might be understood as a common reaction to the presence of a stranger. As visitors begin to explore different ways of interacting with the robots their movements cause varied responses, the intention and mood of the robots being shown through their speed and direction of movement. For example, "[a] robot indicates dissatisfaction or frustration during interaction by accelerating to a distant corner, where it remains facing the walls until its 'mood' changes" (Velonaki, 2010, p. 3). This description of Fish and Bird indicates that these robots have been designed with a sense of emotion and reason firmly embedded together and operating in a connected way. Emotion is present in the notes and the choice of those notes, and also, possibly even more clearly, in the movements of the robots around one another and their human visitors. What is also clear from considering these robots is the way in which the kinesic channel works so well to communicate these emotional cues, while also continually stressing the differences between the robots and humans.
Velonaki suggests that dialogues are able to develop between the robots and their visitors as the wheelchairs move around based on their "'perception' of the body language" of the humans, who then proceed to react in their turn "to the 'body language' of the wheelchairs" (2006, p. 74). However, these interactions might be better thought of in more dynamic terms, because their reliance on the nonverbal kinesic channel, together with the production of notes rather than spoken language, promotes an understanding of communication that is less about turn-taking in dialogue and more about a continuous process in which signs overlap even as they are produced by the participants. This setting aside of turn-taking rules does not mean that the participants are not paying attention to one another; instead, the moments of attention and response intermingle in a more dynamic and flowing way. As Donna Haraway suggests, the result is that this kind of "embodied communication is more like a dance than a word" (2006, p. 111; 2008, p. 26).
Haraway develops her understanding in part from the work of Barbara Smuts, in particular Smuts' observations of greeting rituals in baboon society and, closer to home, of communications taking place with her dogs. Smuts herself similarly describes her relationships with her dogs as "a perpetual improvisational dance, co-created and emergent, simultaneously reflecting who we are and bringing into being who we will become" (2006, p. 115). Smuts' description is drawn from the work of Shanker and King, who suggest that the "dance metaphor", and the "dynamic systems paradigm" that underpins this description of communication, have become prevalent in a number of areas of communication research, citing ape language research, nonverbal communication research and infant development research (2002, p. 605). However, the dance metaphor is also an appropriate way to explain Fish and Bird's communications, both with one another and with human visitors. Not only have these robots been programmed with "the capability to perform detailed 'choreographed' sequences" as part of their movements, but there is also a clear sense of dance present more broadly in the patterns that humans and robots make as they move around one another in the installation space (Fish-Bird: Background, 2006, n.p.).
Stuart Shanker and Barbara King identify the central focus of the dance metaphor as "co-regulated interactions and the emergence of creative communicative behaviors within that context" (2002, p. 605). This perspective on communication therefore concentrates more on the interaction and the development of the relationship, than on the individual participants. In particular, Shanker and King draw upon the work of Alan Fogel, who has developed a conception of co-regulation as a process that "occurs whenever individuals' joint actions blend together to achieve a unique and mutually created set of social actions" (1993, p. 6). Fogel stresses that "[c]o-regulation arises as part of a continuous process of communication, not as a result of an exchange of messages borne by discrete communication signals" (1993, p. 6). It is therefore only possible to consider co-regulation from a perspective for which communication consists of information moving in a "continuous process system", or "dynamic system" to use Shanker and King's terminology, as opposed to a "discrete state system" (Fogel, 1993, p. 65).
Drawing on the descriptions of communication presented in this paper, it is the cybernetic-semiotic conception of communication that is most clearly placed within a discrete state system. In such a system "there are senders and receivers" and "[t]he purpose of communication is for the sender to alter the behavior of the receiver by transmitting informative messages" (Fogel, 1993, p. 65). In contrast, as Fogel clarifies, in a dynamic system:
[w]ords, gestures, and expressions can be altered in their shape, intonation, size, explicitness, duration, clarity, force, and on many other dimensions depending upon the ongoing and simultaneous flow of communicative actions. (1993, p. 13)
Robots such as Fish and Bird, in their interactions with one another and with human visitors, can be understood to demonstrate this type of "ongoing and simultaneous flow of communicative actions" and, although they do not use "[w]ords, gestures, and expressions" in the same ways as humans, they nonetheless express themselves through their own forms of language, paralanguage and kinesics. In addition, while Fish and Bird initially greet their human visitors as strangers, indicated by their somewhat wary movements and discussions of the weather, if their visitors remain calmly present within the installation space then the robots begin to communicate about their relationship once again, and may even write messages directly addressed to the humans. Encounters with Fish and Bird therefore illustrate not only a dynamic situation of communication, but also the way in which relationships change over time. The behaviour of these robots indicates the reappraisal of human 'strangers' as somewhat familiar. However, even as this occurs it is also quite clear that the differences between these robots and their visitors remain, such that the robots and humans are still other to one another.
The dance metaphor and dynamic systems theory enable the development of a conception of communication that is far less like a bridge, and is instead better understood as a complex navigation of self and other in a dynamic process of meaning creation. This understanding of communication does not rely on sameness or commonality; instead, it is the constantly evolving relationship that forms during the interaction that supports communication and offers the possibility of coordinated action, of "doing things" as Peters asks (1999, p. 21). However, it may still seem that this theory provides a less precise idea of communication than that discussed in relation to Data, and it is also difficult to see how such a conception of communication explains how things might be done in the real world. The final example in this paper is therefore chosen to help describe more clearly how otherness, and the coordination of diverse individuals working together to complete joint tasks, might coexist successfully.
Communicating and working with non-humanoid robot others
R2-D2 is a non-humanoid robot, or droid, that plays a central role within the Star Wars films, most often in the company of the humanoid robot C-3PO. R2 is described as an "astromech droid" or an "astro-droid" within the films, a class of robot used to service and repair all different kinds of machinery, including starships. The astromech droids, R2 included, do not speak using a human language, but instead produce their own language that consists of a range of electronic noises that 3PO, as a "protocol droid" specialising in translation, is able to interpret for the other characters in the films. Many different examples of both protocol and astromech droids are shown at work in various scenes, but in the same way that 3PO is the only protocol droid to play a major speaking role, R2 is the only astromech droid whose character is fully developed by the storylines.
In contrast with the placement of Data as an expert cybernetic-semiotic communicator, it is unclear how easily R2's language of beeps and other noises could ever be understood fully by humans. R2's inability to speak any form of humanlike language means that he cannot be regarded as skilled in cybernetic-semiotic communication with humans, yet nonetheless he is shown to play a major part in the rebel alliance's fight against the empire. The actions and role of R2-D2 therefore continually raise the question of how such a decidedly non-humanoid astromech droid's communicative abilities can support its seemingly vital role throughout the Star Wars films. To explain this mismatch, between R2's production of nonhuman language instead of human-recognisable words and his importance to the rebels' plans, it should first be noted that while 3PO translates R2's electronic beeps for humans, he always speaks to R2 using human language. R2 may not be able to produce speech, but he clearly understands human language with ease. This ability to understand helps to explain why R2 is so readily charged with carrying out specific missions, and why he knows of detailed rebel plans about which 3PO often knows nothing.
However, if R2's communication is analysed in terms of Poyatos' conception of paralanguage and kinesics, as opposed to concentrating wholly on language, it is easier to see how this robot can act as an effective colleague and companion. Early in the film Star Wars: A New Hope, 3PO and R2 are captured by Jawa traders while trekking across the Tatooine desert in an attempt to find Obi-Wan Kenobi. Luke Skywalker and his Uncle Owen need to buy some new droids, and 3PO is swiftly chosen, along with a different model of astromech droid. As 3PO and the other droid move off, R2 shuffles from side to side, producing plaintive beeps as he is left behind. Luckily the other astromech droid malfunctions, and R2 takes his chance to attract attention by beeping loudly and bobbing up and down. After a short altercation, and some encouragement from 3PO, R2 is picked as a replacement and is able to join his humanoid companion. However, when 3PO suggests that he should be grateful, R2's paralinguistic response, a rudely blown raspberry, clearly expresses that he is very unimpressed with this idea.
From this example, it seems that the aspect of R2-D2's communication most readily understood, not only by other characters in the films but also by the audience, is its emotional content. R2-D2 can therefore be understood as an "emotional" robot, whose emotion is conveyed very effectively through paralanguage and kinesics, in the absence of a humanlike face and body. This interchange also begins to demonstrate the way in which movement and sound provide cues to support coordinated action with R2, since he is able to draw attention to himself when required through sound and movements of his body and "head". Reading R2's communication in this and other situations gives film viewers the sense that his emotions play a role in the decisions he makes, and that they form a major part of his communication. It is his communicative actions that establish R2 as a well-developed character or personality: vulnerable, courageous and sometimes rude (in particular to 3PO). The cues that R2 produces through paralanguage and kinesics provide the same kind of dynamic and overlapping communicative effect as was discussed for Fish and Bird. Working with R2 might therefore also be best understood in terms of the dance metaphor, rather than through a strict turn-taking structure.
It should be noted that this positioning of R2, as an essential member of the rebel alliance, stands in stark contrast to the assumption, made by many real-life roboticists, that robots with non-humanoid form, movement and communication abilities would be less valuable as workers in human-robot teams. Only a few roboticists think otherwise. Matthew Johnson, Paul Feltovich and Jeffrey Bradshaw, for example, citing R2 in particular, note that examples from science fiction "suggest how effective simple robots can be in interaction with humans" (2008, p. 6). Johnson et al. also note (indeed, their paper is entitled "R2 Where Are You?") that the translation of this non-humanoid conception of the robot helper into real life has been slow. I would suggest that this is because roboticists are more comfortable with the clear-cut idea of communication as the transmission of information than with the dynamic systems approach, and with the idea of coding in language as opposed to the overlapping, less structured use of nonverbal signs. However, while acknowledging that it might well be particularly useful to be able to give a robot such as R2 spoken instructions, Johnson et al. appear open to the idea that human-robot interaction is not dependent on designing robots with humanlike form and the ability to use human language. Their paper focuses instead on the idea that, in order to work with humans, a robot needs to be able to collaborate flexibly in many different situations, with its nonverbal communications seen as sufficient to support coordinated action. This need for flexibility would be well catered for by a robot able to take part in a dynamic process of communication, enabling a co-regulated tailoring of joint action between human and robot as they work together towards a particular goal.
What is also clear is that such joint action is possible even when self and other are considerably different from one another, as in the case of R2 and Luke Skywalker, who are portrayed as developing a close partnership in the films.
The first example presented in this paper illustrated the ways in which various traditions of communication theory are assumed to rely upon the similarities between communicators, an idea overtly illustrated by the creation of humanoid robots. In contrast, the second and third examples, involving non-humanoid robots, have stressed the ways in which otherness and difference can be valued as important aspects of communication, by focusing on the eloquent use of nonverbal communication channels in human-robot interactions. Geoffrey Bennington suggests that otherness, difference and difficulty should have a more valued place in all human communication, and offers two scenarios as examples. In the first:
Someone comes and says something. Without really needing to think, I understand what is said, refer it without difficulty to familiar codes, would assign meaning and intention confidently if questioned about them, and possibly I even reply. This situation of 'communication', in its banality, is one in which nothing much happens: information may be transmitted, contact maintained, an order given and received, but nowhere is the established normality of language use or its associated 'forms of life' called into question. (1994, p. 1)
Here, Bennington describes a moment of communication that can be explained by recourse to the cybernetic-semiotic transmission of information, the sociocultural maintenance of social order, or the sociopsychological understanding of influencing others to do one's will. In this scenario difference may not be present, and if it is, it is destined to be eliminated because, as Bennington remarks, nothing is "called into question". In the second scenario, however:
Someone comes and says something. This time I do not quite understand, or am not entirely sure of having understood. Something in what is said, or the manner of its saying jars, doesn't quite fit, seems perhaps to break a rule or transgress a norm, be it phonetic, grammatical, semantic, sociolectal, paralinguistic, behavioural. Something appears to have been meant or intended, but I am less confident about what exactly it is. In this situation, something has happened: an event, however small, has occurred, an uncertainty opens up: maybe it isn't just that this utterance breaks the rules - maybe this follows other rules, and if it does, might not those rules aspire to replace my own? Maybe these rules are better rules? Or is this a tricky attempt to talk me into something? (1994, p. 1)
Here, while the communication that takes place is still linked to the use of human language, the scenario is more open to the effects of nonverbal communication, and to the presence of both emotion and reason. Otherness is overtly present, notably in ways that include the "paralinguistic" and "behavioural", and Bennington describes a dynamic situation, one in which questions are raised over the difference that has been brought to light, with the suggestion that "[m]aybe these rules are better rules".
This paper therefore suggests that viewing communication as a dynamic process - involving the development of a triple audiovisual reality drawn out of both the reason and emotion of interlocutors - supports a theoretical position that values difference, rather than promoting commonality as a requirement for successful communicative events. Although the conception of communication suggested here arose from considering human-robot interactions, as the brief analysis of Bennington's scenarios shows, it can be extended to offer new ways to promote ethical communication in many different self-other encounters, including those in which robots are not participants.
Bennington, G. (1994). Legislations: The politics of deconstruction. London; New York: Verso.
Čapek, K. (2006). R. U. R. Adelaide: eBooks@Adelaide, The University of Adelaide Library.
Carey, J. (1992). Communication as culture: Essays on media and society. New York: Routledge.
Centre for Social Robotics. (2006). Fish-Bird: Background. Retrieved from http://www.csr.acfr.usyd.edu.au/projects/Fish-Bird/Background.htm.
Chang, B. G. (1996). Deconstructing communication: Representation, subject, and economies of exchange. Minneapolis: University of Minnesota Press.
Craig, R. T. (1999). Communication theory as a field. Communication Theory, 9(2), 119-161.
Fogel, A. (1993). Developing through relationships: Origins of communication, self, and culture. New York: Harvester Wheatsheaf.
Haraway, D. (2006). Encounters with companion species: Entangling dogs, baboons, philosophers, and biologists. Configurations, 14(1), 97-114.
Haraway, D. (2008). When species meet. Minneapolis; London: University of Minnesota Press.
Hayles, N. K. (2005). My mother was a computer: Digital subjects and literary texts. Chicago: University of Chicago Press.
Johnson, M. J., Feltovich, P., & Bradshaw, J. M. (2008). R2 where are you? Designing robots for collaboration with humans. In ICRA 2008 Workshop on Social Interaction with Intelligent Indoor Robots (pp. 7-12). Pasadena, CA: ICRA.
Peters, J. D. (1999). Speaking into the air: A history of the idea of communication. Chicago; London: University of Chicago Press.
Pinchevski, A. (2005). By way of interruption: Levinas and the ethics of communication. Pittsburgh, PA: Duquesne University Press.
Poyatos, F. (1983). New perspectives in nonverbal communication studies in cultural anthropology, social psychology, linguistics, literature, and semiotics. Oxford; New York: Pergamon Press.
Poyatos, F. (1997). The reality of multichannel verbal-nonverbal communication in simultaneous and consecutive interpretation. In Poyatos, F. (Ed.) Nonverbal communication and translation: new perspectives and challenges in literature, interpretation and the media (pp. 249-282). Amsterdam; Philadelphia: J. Benjamins.
Roddenberry, G. (Creator). (1987). Code of Honor [Television series episode]. In Star Trek: The Next Generation. Hollywood, CA: Paramount Home Entertainment.
Roddenberry, G. (Creator). (1988). Datalore [Television series episode]. In Star Trek: The Next Generation. Hollywood, CA: Paramount Home Entertainment.
Shanker, S. G., & King, B. J. (2002). The emergence of a new paradigm in ape language research. Behavioral and Brain Sciences, 25, 605-656.
Shannon, C., & Weaver, W. (1948). The mathematical theory of communication. Urbana: University of Illinois Press.
Smuts, B. (2006). Between species: Science and subjectivity. Configurations, 14(1), 115-126.
Smuts, B. (2008). Embodied communication in non-human animals. In A. Fogel, B. J. King, & S. G. Shanker (Eds.), Human development in the twenty-first century: Visionary ideas from systems scientists (pp. 136-146). Cambridge: Cambridge University Press.
Turing, A. M. (1950). Computing machinery and intelligence. Mind, 59, 433-460.
Velonaki, M., Rye, D., Scheding, S., & Williams, S. (2006). Sharing spaces: Risk, reward and pragmatism. Paper presented at the New Constellations: Art, Science and Society Conference (pp. 74-79), Sydney.
Velonaki, M., & Rye, D. (2010). Human-robot interaction in a media art environment. In What do collaborations with the arts have to say about HRI? (pp. 16-20). Osaka, Japan: HRI.
Wiener, N. (1948). Cybernetics. New York: John Wiley.