Author's note: This essay is one of the responses I wrote last fall for my PhD exams for the RS&TC program here at the University of Minnesota. The question I was responding to was as follows:
Discuss the rhetorical nature of technology and technology transfer. How are these two concepts constructed rhetorically? What role(s) does rhetoric play in "transferring" technology?
Examining the relationships among technology, technology transfer, and rhetoric reveals that rhetoric aids in technology transfer by helping to conceal technological complexity through "black boxing." Black boxing is the principal agent that makes technology transfer possible, and it is also the method by which technologies are often used as means of control. Generally speaking, attempts to use technologies and rhetoric for purposes other than control will depend on the extent to which the communities using those technologies and rhetorics are able to contribute to the epideictic orthodoxies that help govern both technology and rhetoric.
In this paper, I rely on a variety of sources to help create a working definition of technology. I then attempt to use that definition to untangle the relationship between rhetoric and technology and to better understand rhetoric as technology. I then discuss the role that both concepts have played in the creation of systems and networks to encourage hierarchy, control, and creativity. I argue that the principal mechanism rhetoric uses in this function is to help conceal complexity--a concept Latour labels "black boxing." I then discuss the role that rhetoric plays in the transfer of technology from the lab back into the world. Finally, I conclude by arguing that epideictic discourse and the rhetorical notion of commonly held beliefs, or endoxa, may help us better understand technology in terms of the impact that it has on people and the environment.
Before attempting to unravel and support the thesis that begins this paper, it is necessary to establish working understandings of key terms such as technology, tools, machines, and networks. To begin with, Hughes (2004) sees technology as a "creative process involving human ingenuity" (p. 3), and later as a "creative means to a variety of ends" (p. 5). Key components of Hughes's definition include creativity and process. Technology is not a thing; it is a way of dealing with the material world. Furthermore, by focusing on the ends of technology, Hughes makes it clear that technology cannot be considered in isolation from the effects that it has on the social and material worlds--on people and the environment. Winner (1978) views technology as composed of a variety of sub-components: in Winner's definition, technologies can be broken down into apparatuses, such as individual tools or weapons; techniques, that is, skills, crafts, and other human activities; organizations, which Winner defines as technological social arrangements; and networks: "large scale system[s] that [combine] people and apparatuses linked across distances" (p. 12). Similarly, Hughes also notes that machines can be connected into systems, but he adds that systematization is often thought of as a dehumanizing force (think urban planning), while networks denote hope and human connections (think Vannevar Bush's vision for the Memex device) (Hughes, p. 97). Latour (1987) describes a machine as a device that holds otherwise disparate forces together: "This makes a machine different from a tool which is a single element held directly in the hand of a man or a woman. Useful as tools are, they never turn Mr or Mrs Anybody into Mr or Mrs Manybodies!" (p. 129).
Similarly, Mumford (1967) claims that the first machine was the bow and arrow, which is the first human-made device to do more than simply extend the function of an existing organ (such as a club extending the function and force available in the naked human hand). Mumford also describes the systems of human capital and control used to build the pyramids as a "megamachine," which he justifies as more than just an "idle play on words" (p. 191). "If a machine be defined ... as a combination of resistant parts, each specialized in function, operating under human control, to utilize energy and to perform work, then the great labor machine was in every respect a genuine machine" (p. 191). Therefore, for the purposes of this paper, I will consider technology as including creative combinations of tools, machines, processes and/or people that are connected by networks into systems, with the purpose of making, creating, or changing things in the social and material worlds.
Rhetoric and Technology
One thing that hopefully becomes immediately apparent is the role that rhetoric and communication must play in any technological process. Doheny-Farina (1992) discusses this explicitly in terms of technology transfer, but in order to fully understand the role that rhetoric plays in technology transfer, it is necessary to have a sense of the role that rhetoric plays in the creation of technology itself. Mumford, Latour, and Winner make this clear as well. For example, Mumford points to cave paintings and other artistic artifacts as proof that pre-historic people were concerned with communicating for reasons beyond coordinating the hunt--in fact, a central thesis of Mumford's book is that ostensibly non-essential communication such as myth, metaphor, and the pondering of dreams played a larger role in humans' development of ingenuity and larger brains than did practical endeavors such as gathering food or hunting (Chapter 2). Latour places the definition of a "good machine" directly in the hands of rhetoric and persuasion. He claims that a machine is only good once the bugs have been worked out of it and people are convinced that it works. To make his point, he offers the following two statements as coming from opposite sides of a two-faced Janus figure: "Once the machine works, people will be convinced," and "The machine will work when all the relevant people are convinced" (Latour, 1987, p. 10). Latour also describes machines as collections of allies; that is, to build a machine that can work in the world (i.e., be transferred from the lab to outside the lab), you have to do more than get all the parts to work--you have to collect allies to help you effect "a machination, a strategem, a kind of cunning, where borrowed forces keep one another in check" (p. 129).
But whereas Latour sees these complex sociological forces in purely descriptive terms, Winner views this lack of control as a loss of human agency: "In complex, large-scale systems that characterize our time, it is seldom the case that any single individual or group has access to a technological process along the whole of its conception, operation, or result" (Winner, 1978, p. 228). For Winner, this amounts to technology that is in some sense autonomous of human control. But for rhetoricians, the tension between Latour and Winner points to the role that rhetoric plays in controlling technology through technical communication. The tension between technology and human agency becomes even more apparent if rhetoric is thought of as a technology in and of itself.
Rhetoric as Technology
If technology is defined to include creative combinations of processes and people, constituted in such a way as to effect changes in the world, then rhetoric can clearly be considered a technology. While this may come as no great surprise to rhetoricians, it is interesting to see how this concept manifests itself in the work of Mumford (1967), Foucault (1970), and Winner (1978). Once again dealing with pre-history, Mumford challenges the idea that "man" as tool user is more important to human development than man as talker. He argues that the biological changes that moved the task of gathering food from our mouths to our hands had their greatest effect in freeing our mouths for speech and creativity. Citing Dutch historian J. Huizinga, Mumford presents the concept of "Homo Ludens": "the idea that play, rather than work, was the formative element in human culture: that man's most serious activities belonged to the realm of make-believe" (p. 7). Furthermore, Mumford argues in Chapter 4 that language cannot have simply been a response to a task as simple as coordinating a hunt, since simple coordination (i.e., "Go to the edge of the forest and wait for me.") does not require the rich metaphors of human language. Instead, Mumford claims that people developed metaphorically rich language to help themselves understand their dreams and to nurture their loved ones (Chapter 4). Additionally, Mumford points out something so obvious it can easily be overlooked: language is a prerequisite for every other kind of technology (p. 97).
Foucault (1970) picks up this thread in his description of the role that language plays as part of the epistemic warehouse: "All knowledge is rooted in a life, a society, and a language that have a history, and it is in that very history that knowledge finds the element enabling it to communicate with other forms of life" (p. 372). According to Foucault, language's role in the epistemic warehouse is rooted in custom and the human mind:
Having become a dense and consistent historical reality, language forms the locus of tradition, of the unspoken habits of thought, of what lies hidden in a people's mind; it accumulates an ineluctable memory which does not even know itself as a memory. (Foucault, p. 297)
Setting aside the Foucauldian mystery of how language could involve itself in "unspoken habits of thought," the key insight is clear. In fact, it is reminiscent of Mumford's observation that the mind is something quite different from the brain, and quite impossible without language (Chapter 2). The crux of Foucault's project in The Order of Things is to explain how rhetorical changes have impacted that epistemic warehouse. Foucault claims that before the modern era, language and knowledge were centered around categorizing and taxonomies. Indeed, Foucault insightfully points out that categories rely on the natural ambiguity of language, since a perfect one-to-one naming system would be completely flat, and therefore meaningless. In other words, language is not simply about naming things; language concerns itself with putting like with like and thereby attempting to understand differences (Foucault, pp. 96-103). Foucault argues that during the modern era, language (and consequently knowledge) moved from taxonomies to grids: "The centre of knowledge in the seventeenth and eighteenth centuries is in the table" (p. 75, emphasis in original). This reflects changes in writing and even printing technologies; it is easier to communicate using grids and tables if these devices can be reproduced consistently. In essence, these rhetorical changes impact the nature of knowledge itself--changes in language and writing affect the way that knowledge is discovered, understood, created, and transferred.
By way of comparison, Winner places the proof of modern science not in knowledge but in action. Winner argues that science no longer proves itself by what it knows; instead, science today is interested in what it can do--through technology: "Science, then succeeded first as a way of knowing ... Only later as a scientific technology did it triumph as a means of control and manipulation." In this way, technology makes us "masters" of both nature and other people--a central purpose of Winner's work is to challenge that notion and to question our role as ostensible masters of technology.
Systems and Networks as Agents of Control and Community
From its earliest uses, writing functioned as an agent of control. It did this by strengthening networks. Mumford describes the megamachine as the network of social and human capital that made monumental works such as the pyramids possible. He argues that the new technology of writing helped make the networks and systems that enabled the megamachine to be more efficient: "Action at a distance, through scribes and swift messengers, was one of the identifying marks of the new megamachine" (p. 192). Likewise, Latour is interested in this concept when he describes the necessity for science to "act at a distance." He argues that in order to do this, science must transform events, places, and people into mobile, stable, and combinable entities. Modern science does this by creating calculable data out of events, places, and people and then bringing that data back to the lab for analysis. He claims that labs function as "calculation centers," which are essentially nodes on the networks connecting "insiders," which Latour defines as the actual scientists, and "outsiders," which are other science workers such as managers who secure funding. Latour could be echoing Dewey (1946), who decades earlier was also concerned with the interactions that researchers make with each other over networks. Dewey too tries to frame these interactions in terms of calculations: "The power of physical facts to coerce beliefs does not reside on the bare phenomena. It proceeds from methods, from the technique of research and calculation" (Dewey, p. 3).
However, Dewey is less interested in how this observation impacts science and technology and more interested in what it means for conceptions of "the public": "Transactions between singular persons and groups bring a public into being when their indirect consequences--their effects beyond those immediately engaged in them, are of importance" (p. 64). In other words, a public will be formed if a group such as a school board and individuals such as parents begin a transaction such as levying a property tax on all of the residents in a neighborhood. In essence, Dewey is arguing that any network or system that impacts people outside of itself will result in a public being formed over concerns about those impacts. But a key point in Dewey's analysis is that technological networks alone cannot create community: "Till the Great Society is converted into the Great Community, the Public will remain in eclipse. Communication alone can create a great community" (p. 142).
Nevertheless, Licklider and Taylor (1968) were optimistic about the role that networked computers could play in the creation of community, even as the earliest computer networks were being developed. They paint a hopeful future for the communities that may be made possible by such networks, predicting that networked computers will lead to communities "not of common location, but of common interest" (p. 38). Yet about midway through the Internet's development, Herring (1993) found that communities built around computer-mediated communication were similar to traditional communities in that conversations tended to be dominated by males, which constitutes a form of censorship, "thus an essential condition for democratic discourse [was not met]" (Herring, n.p.). The conclusion that Doheny-Farina (1998) reaches in The Wired Neighborhood is similarly leery of the community-building potential of the Internet. This book disputes optimistic claims that the Internet will by its very nature increase human community and interaction. For example, in his analysis of the virtual school, Doheny-Farina finds that the Internet can offer much in the way of "connectivity plenty" but very little in the way of "pedagogical plenty" (p. 116). Instead he worries that the Internet itself will lead to an "ever more centralized core of expertise" (p. 116), which is anathema to learning. Furthermore, he argues that "a community is always bound by place, which always includes complex social and environmental necessities. It is not something you can easily join. You can't subscribe to a community as you subscribe to a discussion group on the net" (p. 37). In the following section, I investigate the concern over hierarchy and control that is implicit in each of these analyses of the effects that technology and communities can have on each other. I believe that hierarchy and control are as important to understanding technology transfer as any other factor.
Hierarchy and Control
As mentioned above, Mumford describes how writing helped serve as an agent of control: "If one single invention was necessary to make the larger mechanism operate for constructive tasks as well as coercion, it was probably the invention of writing" (p. 192). A statement like this may cause shudders in writing studies departments, where a great deal of thought and energy is rightfully spent on giving students the ability to use writing as a means of empowerment and liberation. Nevertheless, a key component of Mumford's insight points to the complexity of writing's role in the story of human creativity. He argues that although the megamachine represents, in one sense, the ruling class's willingness to profit at the expense and poverty of the masses, it also represents the first time that people could imagine and accomplish things much bigger than themselves--the first time we could build anything we could imagine. Similarly, Dewey sees other technologies as central to creativity through control. He argues that the lens and the level made possible more precise machines, which made possible individual labor, without which the very ideas of individualism and liberal democracy are much harder to imagine (Dewey, pp. 88-90). Nevertheless, Dewey also makes it clear that no amount of technology can ensure freedom without positive human action: "No man and no mind was ever emancipated merely by being left alone. ... Positive freedom is not a state, but an act which involves methods and instrumentalities for control of conditions" (p. 168). Again, note the connection between control and creativity--writing enables control, but control itself can be used for creative or for coercive purposes.
Yates (1989) offers a related analysis in her case studies of changes in communication in nineteenth-century American industry. She finds that three major technological changes arose in response to the systems management theories taken up by industries such as the railroads: the typewriter, copying technologies, and vertical filing. Although these three technologies seem mundane and obsolete today, Yates argues convincingly that they represented and enabled a fundamental shift in the way that companies did business--making possible the change from informal spoken coordination of business activities to formal coordination and control through new written genres such as instructions, manuals, and forms, as well as statistical reporting that made use of tables and graphs (further reinforcing the shift in the epistemic warehouse begun in the seventeenth century and described by Foucault, as noted above).
The sources discussed in this section make it clear that technology and writing both play crucial roles in the creation of control and the creativity that control sometimes enables. But in order to better understand the mechanisms by which this control is effected, it is helpful to look for ways in which technology has been seen as possessing agency of its own.
Technology as Agent
Of all the items on this reading list, Langdon Winner's Autonomous Technology (1978) spends the most effort dealing with the potential that technology has for operating outside of human control. Although Winner shies away from overly deterministic views of technology that predict an unavoidable point in time when our technologies will be "smarter" than we are, he also disputes the common assumption that human beings can know and control their technological creations, or even that technology can be considered a neutral tool under any circumstances: "Autonomous technology is nothing more or less than the question of human autonomy held up to a different light" (p. 43). Winner connects the core concepts at the center of autonomous technology with Marx's idea of alienated technology (p. 36), but as Winner is quick to point out, even for Marx, technology was never completely alienated from humanity, because in a Marxian analysis, technology is controlled by an elite "capitalist class" (pp. 39-40). Winner also connects autonomous technology to traditional notions of technological determinism, since "in a fundamental sense, ... determining things is what technology is all about" (p. 75). But Winner points out that technological determinism goes further, arguing that technology not only changes society but is the most important factor in those changes. Furthermore, Winner counsels that although technological determinism may have its problems, it ought not be rejected outright, since, for example, Marx has shown how changes in the processes of production can lead to social changes. In the final analysis, Winner takes what amounts to a more or less deterministic standpoint, summed up well on p. 278: "Technology is now a kind of conduit, such that no matter which aim or purpose we decide to put in, a particular kind of product inevitably comes out."
To the extent that we have shaped and continue to shape that conduit, the deterministic nature of this perspective may be mitigated.
It is important to note that Winner seems not to consider deterministic technology an entirely bad thing. For example, he argues that loss of control can lead to technological drift, a concept which holds that "technology is most productive when its ultimate range of results is neither foreseen nor controlled" (p. 98). Nevertheless, Winner also acknowledges that the loss of human control over technology will not lead to ends that will be appreciated by all, pointing to concepts such as "reverse adaptation," in which technological systems control markets, politics, and the very needs they are created to serve. At the center of this problem lies what Winner labels "complexity and the loss of agency"--the fact that "relative ignorance is growing" (p. 283). In other words, even though more and more knowledge is being produced, each of us understands a smaller proportion of that knowledge. Even worse, we have become "pathologically dependent" on science and technology, which leads to the state that Winner (citing Koestler) calls "urban barbarism" (p. 283). But for Winner, this lack of knowledge is not as bad as the reaction that we have to it--he argues that even as we know "less and less about the fundamental structures and processes that sustain" us, we do not really care, choosing instead to be entertained by the spectacles offered by technology itself (pp. 295-296). Similarly, Dewey notes that the ability to record what has happened and report the news has outstripped society's ability to do anything with that information (Dewey, Chapter 5). Winner argues that the prevalence of urban barbarism combined with computer technology has led to a world of "concealed electronic complexity" (p. 285). This "concealed complexity" helps explain the method by which technology can continue to grow in both scale and complexity. Latour describes the same phenomenon as a "black box" (p. 2). I discuss black boxes in the following section.
Black Boxes
Latour describes a black box as a conceptual object originally developed by "cyberneticians whenever a piece of machinery or set of commands is too complex" to understand fully (pp. 2-3). In such cases, cyberneticians construct a black box, "about which they need to know nothing but its input or output." Latour observes that black boxes have expanded to be used for other technologies of excess complexity as well, since black boxing a technology allows scientists to ignore the complexity of the science and technology upon which their own work is built. Significantly, the examples Latour offers in his explanation of black boxing both depend on rhetoric as scientific or technical writing: "If [the genetic scientist] wants to know anything about the DNA structure or about the Eclipse [the computer he is using to conduct his research], [he] opens Molecular Biology of the Gene or the User's Manual, books that he can take off the shelf" (p. 4). In addition to these transparently rhetorical mechanisms for building and maintaining black boxes, Latour explicitly describes the creation of black boxes in general as a product of persuasion. He argues that putting a structure or a machine in a black box means that everyone agrees to be convinced about it, while at the same time, agreeing to be convinced is a prerequisite for black boxing the structure or machine in the first place. In other words, black boxing is at the center of another pair of dictums offered by Latour's two-faced Janus: "When things are true they hold," and "When things hold, they start becoming true" (p. 12). However, Turkle (1997) makes it clear that this two-sided relationship between truth and concealment does not come naturally to human understanding.
She claims that children often struggle to understand objects and toys that are based on computing technology; she argues that this difficulty stems from the lack of transparency in how such objects function--since they are black boxes, children are not able to take them apart to understand their workings (p. 79). This has led subsequent generations of children to treat such objects as existing on the boundary of life--they understand that their toys are not alive, but the toys seem to be somewhat more than mere machines in that they appear to think and have other human characteristics. As Doheny-Farina (1992) makes clear, rhetoric and the role that it plays in black boxing are at the center of technology transfer as well.
Technology Transfer as a Rhetorical Process
Doheny-Farina (1992) is the only item on this list that investigates technology transfer from an explicitly rhetorical perspective. Doheny-Farina's goal in this collection is to investigate the role that technical communication plays in technology transfers. As such, he offers a helpful definition of the term itself: "Technology transfer is an umbrella term that refers to an entire range of activities involved in developing new technologies and their applications for the marketplace" (p. 3). In other words, technology transfer is the completion of the cycle of stabilization, collection, and calculation described by Latour above. In Latour's analysis, the scientist must stabilize people, places, or things in the material world in order to bring them back to the lab for scientific inscription and calculation. These inscriptions and calculations are then further stabilized through rhetorical processes into black boxes, and in Doheny-Farina's (1992) account, sent back into the world as technologies, to act upon people, places, and things. Doheny-Farina also sees technology transfer as "highly rhetorical in nature. That is, at their core these processes involve individuals and groups negotiating their visions of technologies and applications, markets, and users in what they hope is a common enterprise" (p. 4). Furthermore, the rhetoric involved in such transfers is socially constructed: "This means that the reality of a transfer does not exist apart from the perceptions of the participants" (p. 4). These participants can likewise be seen as constituting a community, which now takes us to questions involving the relationships among technology, rhetoric, and community. Sullivan (1991) argues that such communities can be understood through the lens of epideictic rhetoric.
Community and Epideixis
Sullivan describes epideictic rhetoric as the rhetoric of orthodoxies (p. 232). He explains the main features of orthodoxy, which include (among others) "the threat of heresy, [which] provides opportunity for building solidarity" (pp. 231-232). With this in mind, Sullivan turns to Perelman and Olbrechts-Tyteca's description of epideictic, which Sullivan claims comes "closest to defining epideictic in terms of its social functions. [Perelman and Olbrechts-Tyteca] emphasized epideictic's role in maintaining cultural values, suggested that it established a sense of communion, and made education part of the genre" (Sullivan, p. 231, citing The New Rhetoric, pp. 51-53). Sullivan argues that understanding epideictic's role in the creation of the scientific community is essential to understanding that community's discourse at all. But he concludes his essay by hinting at the potential that epideictic holds for understanding other rhetorical communities: "Much of what has been said about science can be adapted to other cultures, such as engineering or business" (p. 242). This seems to point to technology transfer itself as another area where epideictic discourse may play a role in establishing and maintaining the orthodoxies that make such transfers possible.
Winner suggests "epistemological Luddism" (p. 325) as a solution to the conundrum of concealed electronic complexity. The idea is that we should take things apart, not with the intent of destruction, but with the goal of understanding. However, I believe that the complexity of the technological world in 2011 has outstripped our capacity to "know" it. For example, Turkle (1997) describes ways in which the Internet is changing fundamental concepts of identity and self (see, for example, Chapter 8). Likewise, Haskins (2007) describes ways in which the overwhelming amount of archival information available online continues to outstrip the abilities of scholars to analyze it--a state of affairs that has resulted in shifting the task of interpretation from scholars to users. But the blurring of the line between official memory and lived memory described by Haskins may change the way orthodoxies are developed. Indeed, as far back as the Sophists and Isocrates, rhetoricians have questioned our capacity for (forgive the redundancy) "epistemic knowledge," the ability to know things about others or about the material world. However, as Sullivan suggests, rhetoric may offer a more realistic solution: if we set our sights not on knowledge but on commonly held beliefs (endoxa), we may be able to study the epideictic genre to better understand these beliefs and the ways in which they shape orthodoxies. Fortunately, such a project need not be cut from whole cloth--indeed, rhetoricians could build such work on the classical rhetorical tradition found in the work of Gorgias, Isocrates, and to some extent Aristotle, as well as on the modern work of Kenneth Burke and Perelman and Olbrechts-Tyteca. We may never be able to take technology out of the black boxes that currently enshroud it, but by focusing on epideictic endoxa, we may be able to better understand how concealed technologies fit into our communities and how we might best utilize such technologies for the common good.
References
Dewey, J. (1946). The public and its problems: An essay in political inquiry. Chicago: Gateway Books.
Doheny-Farina, S. (1992). Technical communication and technology transfer. In Rhetoric, innovation, technology: Case studies of technical communication in technology transfers (pp. 1-30). Cambridge, MA: MIT Press.
Doheny-Farina, S. (1998). The wired neighborhood. New Haven: Yale University Press.
Foucault, M. (1970). The order of things: An archaeology of the human sciences. London: Tavistock Publications.
Haskins, E. (2007). Between archive and participation: Public memory in a digital age. Rhetoric Society Quarterly, 37, 401-422. doi:10.1080/02773940601086794
Herring, S. C. (1993). Gender and democracy in computer-mediated communication. Computerization and controversy, 3(2). Retrieved from http://ella.slis.indiana.edu/~herring/ejc.txt
Hughes, T. (2004). Human-built world: How to think about technology and culture. Chicago: University of Chicago Press.
Latour, B. (1987). Science in action: How to follow scientists and engineers through society. Cambridge, MA: Harvard University Press.
Licklider, J. C. R., & Taylor, R. W. (1968). The computer as a communication device. Science and Technology, 76, 20-41.
Mumford, L. (1967). The myth of the machine: Technics and human development. Harcourt Brace Jovanovich.
Sullivan, D. L. (1991). The epideictic rhetoric of science. Journal of Business and Technical Communication, 5(3), 229-245. doi:10.1177/1050651991005003001
Turkle, S. (1997). Life on the screen: Identity in the age of the Internet. New York: Simon & Schuster.
Winner, L. (1978). Autonomous technology: Technics-out-of-control as a theme in political thought. Cambridge, MA: MIT Press.
Yates, J. A. (1989). Control through communication: The rise of system in American management. Studies in Industry and Society (Vol. 6). Baltimore: Johns Hopkins University Press.