Graham Lally

Langdon Winner asked: Do artefacts have politics? Provide an answer to that question, and illustrate your answer by referring to specific technological developments, and indicate the extent to which those links might be important for policy-makers.

This essay consists of two main parts. In the first, I will attempt to describe the landscape in which technology and the political process find themselves situated together, outlining a map that gives an overview of the numerous dimensions of influence that exist between the two spheres. Much of this map will be based on views prevalent in academic and contemporary literature, providing a steady backdrop for the second part, which considers the implications of these influences for the policy making process. In this latter part, I will argue that policy makers need to understand a greater range of the effects of technology if they are to successfully grapple with many issues that they currently consider to be relatively independent. By setting out a map of these effects, I hope to place the relationship between technology and decision-making in a slightly wider context, answering the original question not so much in terms of whether technology exists separately from politics, but as a study of what the current political approach to technology takes into account, and what it omits.

In order to sketch an initial frame of reference for the links and parallels drawn later in this essay, I will first clarify what I mean by certain terms. In doing so, I also hope that the variety and scope of concepts involved will be clarified a little.

Artefacts and Systems. The title of Winner's essay refers specifically to "artefacts" rather than to "technology" in general. This implies that he is attempting to deal with items that take some tangible form, rather than any more abstract ideas or processes that the field of technology may also involve - the "operational schema" of the human sciences that Foucault describes, for example (Foucault, 1991:185).

This separation between physical objects and more conceptual ideas is essential, as it serves to highlight the realist side of technology (in that there is something that exists apart from human projections), as well as to frame the discussion. In this essay, I assume that such objects exist in and of themselves (without necessarily or absolutely assuming which human influences they may or may not embody). Furthermore, I wish to highlight the difference demonstrated by Winner in the two parts of his essay - namely the distinction between "a specific technical device" and "man-made systems". Indeed, while Winner refers to both as "technologies", one can clearly draw a contrast between, on the one hand, use of words such as "tools", "machines, instruments" and "devices" for the former, and language such as "given technical system", and "systems of production, transportation, and communication" on the other hand for the latter (Winner, 1985:27-33). This difference in scale needs to be recognised, and I shall refer to the specific devices as "artefacts", and to the more complex - yet still of a technical nature - structures growing out of these artefacts as "systems".

Users and Crowds. In the same way that we can think of a technology as an artefact or a system, we can also draw a contrast between two perspectives on people. At the micro level, we can think of individuals (or groups of individuals) that interact directly with artefacts. These I shall refer to as "users", and they can be considered as similarly tangible entities that have a physical relationship with a piece of technology. At the other, macro end of the scale, we also need to take into account the interactions that take place with or amongst a large mass of people - for example, a market, an electorate or a population. The important difference in scale here highlights the focus on a more complex network of actors rather than on predictable, direct interactions. I shall refer to such a mass of people here as a "crowd".

Having outlined the different actors (both technological and human) involved in this theme, it is now necessary to describe the relationships that exist between them. By relationships, I mean that there is a causal link wherein the properties or activity of one actor induces a particular behaviour or outcome in another. This form of power should also be recognised as being consistent - that is, the behaviour produced by a particular stimulus will not change unless forced to do so - and it can be considered inherent to an artefact, a system, a person or a crowd at any particular point in time. For example, an artefact will not arbitrarily change its behaviour towards different people unless configured to do so, and a person will not use an artefact in a different way without good reason. This power to influence consistency in behaviour can be considered a normalisation factor.

However, the questions we are concerned with are: what are these influences, where do they emerge from, and can they be influenced in turn? I therefore wish to structure this part of the essay around a number of main ideas, or dimensions. Each of these is intended to describe one of these influences, as an easily-identifiable contrast or dichotomy that can be used to conceptually categorise the variety of effects being examined. It is important to note that the ranges identified here are indicative of the breadth of possibilities, and that any particular artefact or system may embody several of these ranges at once. The distinction made between the three dimensions is intended more as a convenient cognitive division to group like effects together than as an attempt to rigidly classify specific impacts.

I will look at three dimensions here, each with its own sub-dimensions. These three are: the intent of effects; the level of complexity involved; and compatibility and control for the purposes of expansion. Naturally, each of these axes will weigh differently on the components defined above, and thus contain differing implications for the policy maker, as we shall see in the final section.

It should be noted that the direction of influence examined under these three dimensions is primarily from technology to society, rather than vice versa. By highlighting these three fields in particular, I do not claim that these are the only lines of influence between politics and technology. However, study of the political factors affecting the development of technology must surely include aspects outside the scope of this essay, and so the issues chosen here are intended to limit the discussion to one direction of influence only.

1. Intent of effects. The first dimension I wish to describe is the link between human design of a technology (whether, say, low-level hardware blueprinting or policy-level strategy planning) and the predictability of its outcomes. Since the industrial revolution, the focal point of much of society's efforts has been on understanding ourselves and our environment, with the aim of being able to control both.

It is obvious, though, that our ability to predict the consequences of our actions is incomplete. Moves in the 1980s towards a policy strategy based on risk rather than deterministic prediction highlight the tendency to concentrate increasingly on what we do not know rather than on what we do. There is, then, a constant split between those consequences that we have foreseen and those that we have not - in other words, between certainty and uncertainty.

Note that we do not differentiate here between results that are desired and results that are not. In designing any technology, one works within limitations and restrictions and, as a result, there will constantly be trade-offs to be made between what we hope the technology will achieve and what we expect its effects will be. A consequence may therefore be fully expected but undesired, if it has been taken into account but is offset by other, desired consequences. It is also, of course, possible that a consequence planned for does not become a reality, but it is one of the shortcomings of this essay that such non-occurrences are not considered here.

The impact of technology can therefore firstly be split along a dimension of intent - whether or not an effect is expected during the design of a given technology. Joerges draws a similar contrast, referring to the "two Cs" of control and contingency, in which theories of a controlling nature - "comprising, in particular, all theories of planned change" - are the polar opposite of "blind" consequences that are "not guided by some overall design". He notes, however, that the latter may nevertheless be "intentional" (Joerges, 1999:422). This indicates his focus on the idea of control rather than on predictability as observed here.

To illustrate the concept of planned outcomes, I could refer back to the example given by Winner of Robert Moses' Long Island bridges, or at least his interpretation of them. However, as the authenticity of this account is subject to some debate (Joerges, 1999), I will instead turn to the benefits cited by American planners for the introduction of automobiles as a mass transit system in the early 20th century. As Pitkin puts it:

"Planners conjectured that health risks from horse manure would decrease, and costs of street cleaning would go down. Nelson Lewis, Chief Engineer for New York's Board of Estimate, proclaimed ... that cars would bring many benefits to urban life, such as ... increasing the independence and mobility of residents"
(Pitkin, 2001:44)

I will return to the obvious drawbacks of automobiles that we attempt to deal with today. However, it would be difficult to argue against the proposition that far fewer people in cities are, these days, in danger of suffering from diseases caused directly or indirectly by a proliferation of horse-related waste (namely manure and carcasses). Typhoid, for example, often transmitted by the dungheap-loving housefly, was ubiquitous in the first decade of the twentieth century, with around 3,000 cases reported in New York State in 1906 (Mikkelson, 2002). By 2004 this figure was down to 30-50 cases (New York State Department of Health). Tarr describes the manner in which the planned benefits of switching to this new technology became reality:

"Streets were cleaner, particle pollution resulting from ground-up manure and the diseases thereby produced were diminished, the number of flies was greatly reduced, goods were transported more cheaply and efficiently, traffic travelled at a faster rate, and the movement of people from crowded cities to suburbs was accelerated by the automobile."
(Tarr, 1971)

Against these predicted effects, it is also clear that there are many instances of unforeseen effects arising from the same implementation of technology. It is this level of complex unknowability that a modern tendency towards risk management attempts to stem, but in the light of "improvement through progress", our ability to determine the unseen extent of our actions is often hampered by our ongoing efforts to keep track of the goals we have set in advance. To return to Pitkin's example, we have seen - not much more than half a century after their introduction - that the health and economic benefits of cars have increasingly been sidelined by the dangers and costs they impose.

"Over the past several decades, many transportation planners have tempered their enthusiasm for the automobile, instead focusing their efforts on strategies to decrease Americans' collective dependence on the car. This shift in thinking began to take shape during the 1950s and 1960s, primarily as a response to material conditions, namely traffic congestion, air pollution, and recurring social inequities."
(Pitkin, 2001:45)

A second example would be the introduction, mass usage, and subsequently observed effects of chlorofluorocarbons (CFCs). These were originally developed as non-toxic, non-reactive replacements for toxic cooling substances such as ammonia and sulphur dioxide (Midgley and Henne, 1930). However, in 1974, research found that CFCs could be responsible for destroying stratospheric ozone (Molina and Rowland, 1974). As with the introduction of automobiles, the discovery and mass diffusion of CFCs were born of a desire to reduce certain, somewhat immediate health risks - a task that has certainly been accomplished. Again, however, this introduction has itself entailed further unforeseen consequences which need to be taken into account.1

This is not to say that all unforeseen consequences are necessarily unwanted, of course. We should remember to include here also any effects that may be considered beneficial by some measure, but that have emerged purely by chance. Such results are often the centre of disputes over responsibility just as much as any undesired oversights.

2. Level of complexity. The second dimension to take into account is an aspect of technology that has been observed by economists for some time now (e.g. Galbraith, 1964:42-43). It is important here to realise the effects that the complexity of a given technology can have. What do we mean by complexity, though? Or rather, what types of complexity affect each actor?

It may be the case that systems, rather than artefacts, can be thought of as possessing complexity of their own - in terms of their structure and homogeneity. However, I wish to concentrate here on the more specific notion of "practical" complexity, as it is this channel that I feel has more applicability to the domain of policy decisions.

In terms of the actors we have laid out above, we can think of two kinds of complexity. Firstly, there is the complexity inherent to an artefact or system itself - i.e. a technological complexity. A given technology can therefore, from the point of view of someone initially trying to design it or thereafter understand how it works, be more understandable (simple) or more opaque. The implications of a complex design are numerous - not only is an overall increase in society's understanding of technology needed to reach the level of comprehension required, but as the artefact or system comes to involve an ever larger number of components, so the collaboration and organisation needed between a number of specialised workers becomes increasingly vital. (Galbraith, 1969:24-27)

On the other hand, if the amount of knowledge and collaboration needed to design, understand or produce a particular technology is less, we can consider the technology to be "accessible" in terms of production. If the concepts need no particular specialist knowledge (and given sufficient raw materials and manpower), more actors may enter the market for producing such a technology. To draw on Galbraith again:

"The modern large corporation is adapted to the needs of advanced technology and the large amount of capital and comprehensive planning which this requires. It reflects the need of its technostructure for freedom from outside interference. ... But if technology is simple, the capital supply need not be large. Since markets then function more reliably, there is less opportunity and less need for planning. And for these reasons there is less need for specialized intelligence and associated organization. As a result, the firm can be small."
(Galbraith, 1969:94)

We can see, then, that the complexity of any technology will influence the market structure (and vice versa). The question for policy makers is therefore whether this force is equal to or greater than the efforts of those wishing to determine the market structure for other purposes, or whether it can be subordinated to them.

Secondly, we need to consider the complexity involved when a technology interacts with a person or, in other words, the user interface complexity. In a period in which new devices are emerging constantly, and where - it seems - a new way of working replaces the current way just as everyone is at last used to it, the idea of simple interaction has emerged as key. We should note that the ease with which a particular user can adopt a given technology must be weighed against factors such as: the required functionality of the technology (more complicated functionality may require a more complicated interface); the industrial position of the technology (and hence the background knowledge of the user that can be assumed); and the extent to which diffusion of the technology is desired (e.g. specialist technology versus that intended for a mass market).

All of these factors combine to determine the expected ease-of-use that is "designed into" an item. Whilst this is not the sole factor in determining how widely used that item turns out to be, it is reasonable to expect that take-up can be heavily encouraged or discouraged according to such accessibility.

Based on this, we can expect a macro-scale effect upon the crowd of users that is partially influenced (although to what extent is arguable) by micro-scale accessibility. Assuming an average trade-off point, at which a potential user decides (consciously or unconsciously) to adopt or disregard a particular technology based on expected outcomes, we can plausibly expect a more complicated interface to attract fewer users. Not only that, but we can also expect the type of user to vary according to the artefact's complexity - perhaps as intended by the original planning, but frequently an unexpected section of society with some alternative use for the technology. This exclusive form of usability (along with other factors) can create a split between those who adopt the technology and those who do not.

For example, in their 2005 report Connecting the UK: the Digital Strategy, the UK Cabinet Office and the Department of Trade and Industry found that:

"By far the biggest barrier to accessing ICT is interest and motivation, followed by a lack of perceived need. 53% of adults who do not use the internet state that they 'Do not want to/need to/have an interest'."
(Cabinet Office, 2005:24. Emphasis in original.)

Furthermore, they find that "35% of all Internet non-users lack the knowledge or confidence to use the Internet", while at the same time "PC packages are too complex which reinforces the view that 'PCs are not for me'". (Cabinet Office, 2005:24) These two points indicate the importance of the complexity-competency trade-off that has so far been designed into computing systems. The flexible and, as a result, often confusing nature of Information Technology is a response to (and a cause of) purposefully complex and variable functionality. This translates into a requirement for a certain level of training or personal competency before a user is able to understand the benefits (or flaws) of a technology, a feature reflected in the Government's findings.

Some might claim that it would be useful, or more comprehensive, to consider how the size of the intended target market affects the extent to which a technology is "pushed" (via marketing, etc), and therefore how much it is adopted. In my own view, though, this is an economic consideration as opposed to an inherent property of technology per se, based as it is on a more quantitative assessment of potential profit and loss. I include reference to it here only to draw attention to the conflict between this economic aspect and the level of complexity deemed to be required, or acceptable. However, the links between consumer demand, production push and technological progress could be explored in more depth, given a broader scope.

As has been seen here, then, the level of interactive complexity involved in a particular technology will affect its adoption, and as such the intended effects of that technology may be confined to a particular social group or section of society.

3. Compatibility and Control. We move now to the final dimension of technological impacts being looked at here, that of the quest for expansion of a system, and the compatibility (and hence control) it entails. These effects draw partially upon the dimensions we have already looked at, but concentrate more on the large-scale, emergent consequences of systems, and the way in which they are inclined to reflect and goad behaviour outside the technology itself. In many ways, an element of this has been present in the previous two dimensions; what we look at here concerns the more implicit aspects that arise within a ubiquitous, technology-driven society.

The primary method of normalisation and coercion employed by technical systems is that of compatibility. In a general sense, this constitutes a common medium arrived at by two parties in order to facilitate some form of communication. However, I argue here that this issue of compatibility is not restricted solely to technological interfaces, but also extends in a number of ways to include social interaction and social cues.

The concept of success as measured from the point of view of a technological system differs from the concept as perceived from an economic or social point of view. Whereas under the latter, success may be defined as an overall increase in wealth and material prosperity, or in some qualitative measure of happiness or health, technological success can be defined as an optimal, efficient performance of the system. This gap in purposes and values is what leads Winner to note that "the originally established end of a system may turn out to be a restraint upon the system's ability to grow or to operate properly" (Winner, 1977:241).

Galbraith describes the way in which this "technical rationality", following increased product complexity and development times, leads to an expansion in the amount of planning and control that must be exerted in order to optimise technical efficiency:

"That, as more time elapses and more capital is committed, it will be increasingly risky to rely on the untutored responses of the consumer needs no elaboration. And this will be increasingly so the more technically sophisticated the product."
(Galbraith, 1969:34)

Not only does this control over market reliability extend to end-users, but in an ever more connected and interdependent industrial domain, growth amongst - and compatibility between - infrastructure providers must also be considered in the same way. This can be seen in the push towards interoperable systems via a series of standards laid out by the UK government:

"The value of interoperable systems, and the benefits of the standards-based approach exemplified in the e-GIF, are becoming widely recognised by the private sector. This has led to calls to adopt the policy, or a similar tool, to further increase efficiency and enable new and exciting services to be developed across different sectors."
(Foreword to e-Government Interoperability Framework, 2005:3)

The result of this technical rationality, then, is an attempt to ensure adoption of a system, most likely via either a market process or a political process.

Alongside the technical imperative for compatibility we can also place a "discriminatory" compatibility between a technology and a user. While the issue of complexity examined above also discriminates in a sense, it does so according to personal ability. The compatibility here, on the other hand, involves the in-built physical properties and dimensions of an artefact. In other words, even if a user is able to access training and improvement in terms of understanding and/or using an item, is there still a feature of the artefact's design that prevents them from using it?

The obvious example of this kind of normalisation is Winner's interpretation of the Long Island bridges, as mentioned above (Winner, 1985). Regardless of whether or not Moses' intent was to discriminate against a particular section of society, it is clear that, due to their physical dimensions, a particular kind of behaviour associated with the objects is excluded - access by tall buses, in this case. This battle between design, functionality and physical accessibility continues today, most visibly in the form of discussion regarding disabled access to buildings, transport, etc.

The other side of the compatibility coin has little to do with the technical functionality or physical aspects of a particular technology, but relates instead to the social impact of a techno-economic society based on knowledge and progress. In a society where "rapid technological adaptation is the hallmark of a successful developed economy" (Blair, 2005), not only do skills become an essential component in maintaining a competitive international position, but within a country this adaptation to technology becomes a measure of ability for an individual. The hierarchy of social status becomes decided, to an extent, by the ability to keep up with change, and by familiarity with what others are not familiar with. In this context, artefacts become the symbols of this status. Physical items, whatever their inherent political inclinations may otherwise be, become endowed with a projection of position as understood by society, solely for non-functional reasons such as their availability or "up-to-date-ness".

This projection of values has two related aspects. Firstly, as we have noted, there exists an indication of relative status within a perceived hierarchy, such that particular technologies or items represent a form of superiority or inferiority between people. Secondly, within this hierarchy the same technologies also act as indicators of possible social connectivity, as people begin to form groups around a shared understanding of a technology, its use, or its effects. Skuse calls this projection "enlivenment", and provides us with two examples that are significantly removed from the world of rapid technological adaptation as portrayed by today's politicians.

Quoting Stallybrass, Skuse refers firstly to a historical perspective:

"... Stallybrass points out that for the poor in 19th-century Britain the wealth that they struggled to accumulate was generally 'not stored as money in banks but as things in the house' and that one's economic and social standing tended to be 'measured by the coming and going of those things'"
(Skuse, 2005:124. Emphasis added.)

He also conducts a more in-depth examination of the role of technology within "chronically impoverished" Afghanistan in 1996-1998. Skuse outlines the powerful symbolic nature of the radio in an increasingly connected world:

"[Radio] is a symbol of modernity and a key repository of household wealth that may be reconverted into cash as and when required. Radio continues to have a close association with modernity and global connectivity, an association that has long been eroded in many other countries by more technical, miniaturized and status-defining objects such as colour televisions, personal stereos, computers or mobile phones (du Gay et al., 1997)."
(Skuse, 2005:125)

Similar relations between usage or understanding of technology and social recognition (or perceived recognition) can be seen in workplaces as well. Indeed, Venkatesh and Davis find evidence to support their hypothesis that use of technology is seen as being causally linked to job performance and career progress, and that there is a perceived image that accompanies this use. Furthermore, they find that this perception of technology as a tool of status persists as familiarity with the technology increases, lending support to the proposal that such values exist independently of any technical functionality the technology may provide (Venkatesh and Davis, 2000).

We can see therefore that despite (or in addition to) the realist, physical properties of technology, there also exist properties projected onto artefacts and systems by entirely sociological factors. From a political point of view, these properties and the behaviour they influence should be considered as being just as important as behaviours arising directly from interaction with technology itself.

These three sections cover a broad range of major effects that technology can have on society. We now turn to the second part of this essay, in which we consider how the policy maker should react to them. Or - perhaps more relevantly - where does the responsibility of the policy maker lie and what is their position in terms of influence, given the apparently "independent" effects outlined in this essay?

Under the widely-adopted "Red Book" model, wherein the policy making process is centred on scientific risk assessment and management, there is a concentration on the link between technology and progress. Economic prosperity and international competitiveness are now the yardsticks by which the "health" of a nation is measured. Of the causal links described above, only the first - the predictability of the effects of technology - is given any significant consideration in the policy process, manifesting itself as this focus on risk and uncertainty. The sources of motivation, and the political accountability involved in this issue, are outside the scope of this essay, however.

The other two dimensions, of complexity in "our" technology and of compatibility and normalisation, are courted only impatiently by policy makers. Perhaps due to the non-scientific nature of these effects, they tend to remain in the realm of speculative discussion rather than concerted investigative effort. For example, government efforts to attract more women into engineering and computing fields have been slow going, and have even gone backwards:

"...the proportion of information technology workers who are female is down to 21% ...[and]... had declined steadily since the 1960s."
(BBC News, November 2005)

The DTI also notes that its "target of 40% female representation on [Science, Engineering and Technology] related boards and councils remains challenging, and has been rolled forward to 2008" (DTI Departmental Report, 2005:46). It is clear that merely setting a target, and hoping that the statistics will meet it at the point of assessment, is not in itself an effective way of bringing about the desired change.

Similarly, efforts to transform more disadvantaged sections of the population through access to technology, or to develop particular areas of the country (or, indeed, small businesses) through a nationwide programme of innovation, will depend largely upon how readily particular technologies can be adopted for use, production, and further development. The dimensions laid out above should serve as reminders that, rather than merely fulfilling one main pillar of national strategy, adopting a technology-led approach can and does affect a wide variety of areas.

When it comes to scientific and technological advancement, it seems apparent that policy makers are ready to adopt a relatively laissez-faire approach, allowing the technological industry to develop as it feels it needs to, with only casual guidance applied. However, if these policy makers are to fully come to terms with improving the aspects of life they consider to be external to the field of science and technology, it would be helpful to start assessing the less direct, and less obvious, impacts that a "knowledge-based society" implies. I hope that by taking a wider perspective on the effects of technology, I have at least narrowed this gap between what is perceived as "technological" and what is not.

References

Blair, T. (2005) Speech to the CBI conference. Online. 10 Downing Street, 29 November 2005. Available: http://www.number-10.gov.uk/output/Page8606.asp, 7 January 2005

Department of Trade and Industry. (2005) Departmental Report, 2005. Available online at: http://www.dti.gov.uk/expenditureplan/report2005/, 7 January 2005

Foucault, M. (1991) Discipline and Punish: The Birth of the Prison. London: Penguin Books.

Galbraith, J. K. (1964) 'The Economics of Technical Development', in E. Mansfield (ed) Monopoly Power and Economic Performance, New York: Norton, pp. 39-47

Galbraith, J. K. (1969) The New Industrial State. Middlesex: Penguin Books

Joerges, B. (1999) 'Do politics have artefacts?', Social Studies of Science, Vol. 29, 3, pp. 411-431

Midgley, T. and Henne, A. (1930) 'Organic fluorides as refrigerants', Industrial and Engineering Chemistry, Vol. 22, pp. 542-547

Mikkelson, B. (2002) Typhoid Mary. Online. snopes.com, 26 September 2002. Available: http://www.snopes.com/medical/disease/typhoid.htm, 7 January 2005

Molina, M. J. and Rowland, F. S. (1974) 'Stratospheric sink for Chlorofluoromethanes: Chlorine atom catalyzed destruction of Ozone', Nature, Vol. 249, pp. 810-814

New York State Department of Health. Typhoid Fever. Online. Available: http://www.health.state.ny.us/nysdoh/communicable_diseases/en/typh.htm, 7 January 2005

Pitkin, B. (2001) 'A historical perspective of technology and planning', Berkeley Planning Journal, Vol. 15, pp. 32-55

Skuse, A. (2005) 'Enlivened objects: The social life, death and rebirth of radio as commodity in Afghanistan', Journal of Material Culture, Vol. 10, 2, pp. 123-137


Smith, A. (2005) Girls 'put off technology jobs'. Online. BBC News, 9 November 2005. Available: http://news.bbc.co.uk/1/hi/education/4421712.stm, 7 January 2005

Tarr, J. A. (1971) 'Urban pollution - many long years ago', American Heritage Magazine, Vol. 22, 6. Available online at: http://www.americanheritage.com/articles/magazine/ah/1971/6/1971_6_65.shtml, 7 January 2005

UK Cabinet Office. (2005) Connecting the UK: the Digital Strategy. Available online at: http://www.strategy.gov.uk/downloads/work_areas/digital_strategy/digital_strategy.pdf, 7 January 2005

UK Cabinet Office. (2005) e-Government Interoperability Framework version 6.1. Available online at: http://www.govtalk.gov.uk/documents/eGIF%20v6_1(1).pdf, 7 January 2005

Venkatesh, V. and Davis, F. D. (2000) 'A theoretical extension of the technology acceptance model: Four longitudinal field studies', Management Science, Vol. 46, 2, pp. 186-204

Winner, L. (1977) Autonomous Technology: Technics-out-of-Control as a Theme in Political Thought, Cambridge, Massachusetts; London: The MIT Press


Winner, L. (1985) 'Do Artefacts Have Politics?', in D. MacKenzie & J. Wajcman (eds), The Social Shaping of Technology, Milton Keynes; Philadelphia: Open University Press, pp. 26-38


1 For further information on the rise and fall of CFCs, see: http://www.cmdl.noaa.gov/noah/publictn/elkins/cfcs.html