icosilune

Category: ‘Research’

Norbert Wiener: Cybernetics

[Readings] (08.29.08, 4:40 pm)

Notes

Introduction

Wiener begins Cybernetics by posing some of the problems encountered by the growing field of modern science. Specifically, and this echoes Vannevar Bush, he is concerned about the massive specialization in science. He argues, though, that scientists need to be versed in each other’s disciplines. He too is interested in developing some sort of calculating machine, but proposes an electronic model that more closely resembles what we use today. What is interesting about Wiener’s model is that it is inspired by the human nervous system.

The essential problem Wiener sets out to solve is anti-aircraft artillery. This is the essence of the idea, and it segues cleanly into the notion of feedback loops that will be explored in detail later. The problem involves a certain forecasting of the future, and relates closely to how everyday human action works. Human actions, such as picking up a pencil, involve certain kinaesthetic or proprioceptive senses. This correlates in some fashion to the intentionality described by phenomenologists.

Furthermore, the kinds of problems Wiener describes are generally solved by pattern recognition: distinguishing signal from noise, guidance, switching, control, etc. It is interesting to note that the type of discipline proposed by Wiener more closely resembles the analytic patterns suggested by Dreyfus.

Some of Wiener’s application seems grounded in Gestalt psychology, the psychology of the coordination of the senses; its central idea is that the whole amounts to more than the sum of its parts. Generally it is a psychology of perception. One of the ideas Wiener approaches with this, toward the end of the introduction, is the development of a fully functional prosthetic limb. The limb would not only need to fill the space and function of the lost limb, but also register the immediate senses, and furthermore the proprioceptive senses. The combination of these seems to unite the goals of cybernetics. Also notably, the idea here is the replacement/extension of a limb, not the mind.

A further concern with this prosthetic power of computation is its complicated moral significance. One moral dilemma posed is the notion of machine slave labor, which has the potential to reduce all labor to the status of slave labor. While robots have not replaced human labor, this concern is insightful in terms of the economic changes due to computers (divisions of companies being replaced by silicon chips, etc.).

Chapter 5: Computing Machines and the Nervous System

Early on, Wiener gives a somewhat hand-waving proof that, when each storage element has a constant cost, the best way to encode information is to use binary for storage. The logic of some operators is described, as well as ways of implementing binary logic in several engineering approaches. After that, he mentions their potential grounding in the neurological system.
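The usual way to reconstruct this counting argument runs as follows (my sketch of the standard derivation, not Wiener’s exact text; his argument also leans on the engineering reliability of two-state relays):

```latex
% Storing M distinguishable values in base b takes N = \log_b M digits.
% If each digit element costs in proportion to b (one stable state per
% level), the total cost is
\[
  C(b) \;=\; b \cdot \log_b M \;=\; \frac{b}{\ln b}\,\ln M .
\]
% Minimizing over real-valued b:
\[
  \frac{d}{db}\!\left(\frac{b}{\ln b}\right)
  \;=\; \frac{\ln b - 1}{(\ln b)^2} \;=\; 0
  \quad\Longrightarrow\quad b = e \approx 2.718,
\]
% so b = 3 is the cheapest integer base by this measure, with b = 2 a
% close second; binary wins in practice because two-state elements are
% by far the easiest to build reliably.
```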

Wiener next attempts to address some of the tricky details of mathematical logic (such as the paradoxes of naive set theory) with corresponding analogues that could apply to a computational system modeled after the nervous system.

Reading Info:
Author/Editor: Wiener, Norbert
Title: Cybernetics
Type: book
Context:
Tags: dms, ai, digital media
Lookup: Google Scholar, Google Books, Amazon

Katherine Hayles: Writing Machines

[Readings] (08.29.08, 4:38 pm)

Notes

Introduction

Institutions are not run by specified rules, but rather by complex networks of individuals, among whom the real causes and reasons for things become apparent. It is individuals and networks of them that cause things to happen. Hayles wishes to look at the digital age and writing environments, but to do that must focus on the networks of forces and individuals that surround the discipline and culture.

To do this, she starts from a somewhat autobiographical perspective: Hayles began pursuing a track in science (specifically chemistry), but later found the cutting-edge research to be tedious and unengaging. She took some courses in literature and, on this new track, found herself puzzling out the ways it ran contrary to her scientific discipline (ambiguity over clarity, investigating rather than solving problems, etc.).

Electronic Literature:

Hayles opens this chapter by noting how the Turing Machine (and by extension the computer) was originally theorized as a calculating machine, but had a hitherto unexpected power for simulation. Hayles poses simulation as applying to environments, and this makes it seem a much more tactile and somatic experience than a conceptual one. She connects simulation to literature and the result is this sort of electronic literature.

Hayles specifically is looking at Talan Memmott’s hypertext work, “Lexia to Perplexia”, which is a jumble of jargon and wordplay, intended to confuse the idea of subjectivity. Hayles describes this language as a creole of English and computer code.

The admitted illegibility is an indication of electronic processes that the reader does not understand, or cannot grasp. “Illegibility is not simply a lack of meaning, then, but a signifier of distributed cognitive processes that construct reading as an active production of a cybernetic circuit and not merely an internal activity of the human mind.” I think this is supposed to mean that interpretation is transcendent of human thought.

The goal in this transformation is to raise awareness and weave together the human body with electronic materiality. This idea seems to look toward Donna Haraway, but moves more in the direction of a semiotic system. The goal is not to challenge human nature, but to challenge subjectivity and language.

Prevalent in the work are allusions to Narcissus and Echo, and these mythological references are intended to highlight the collapse of the original into simulation. Following Baudrillard, there is no longer an ontological distinction between the real and the simulated.

The work is intended to be a “later generation” multimedia or hypertext work, very active and confusing with respect to user interaction. The work goes beyond general hypertext: instead of moving from lexia to lexia, it acts nervously, seemingly of its own accord.

Reading Info:
Author/Editor: Hayles, Katherine
Title: Writing Machines
Type: book
Context:
Tags: dms, cybertext, digital media
Lookup: Google Scholar, Google Books, Amazon

Sherry Turkle: Life on the Screen

[Readings] (08.29.08, 4:37 pm)

Overview

Turkle presents some methods of looking at computation and culture from a psychological perspective. Her work is grounded extensively in ethnography, and follows individuals for whom computers are a part of their lives. Computers, online communication, and simulation each open new means for interaction and expression for individuals, and at the time of Turkle’s writing (1995), the influence of the internet on culture was still very fresh. It remains fresh today, probably partly due to its continually evolving and changing nature. There are clearly mixed feelings about many aspects of computation, but ultimately Turkle seems to make it out to be a force for good.

Notes

Turkle starts by looking at writing, in an anecdote about her learning French. The style of writing Turkle used in her example was bottom-up, rather than top-down, an approach that was contrary to accepted models. This approach uses dialogue and tinkering instead of formal or abstract modeling. (p. 50-51) She divides approaches to material into two categories: hard mastery and soft mastery, which are the practices of engaging with things in a top-down or bottom-up manner, respectively. This distinction becomes a major thread for part of the book, and was an important factor in her earlier work. The appearance of simulation in computer culture encourages soft mastery, bricolage, and tinkering, which make use of the ability to test and experiment, getting into a model as opposed to looking objectively at it. Piaget and Lévi-Strauss discuss bricolage as a stage of development in infants, but they present it as a stage to be passed through, rather than a whole method of thinking. (p. 56)

In software, change has been made to account for bricolage and other styles of learning and interaction: “Instead of rules to learn, they want to create environments to explore. These new interfaces project the message, ‘Play with me, experiment with me, there is no one correct path.'” [they as software designers] (p. 60) However, this positive reception is far from unanimous: Turkle also looks at the reception of computers and simulation in academic settings. There is a lot of hostility, much of which derives from the opacity of the computer; to some domains, especially science, that opacity threatens fundamental principles. (p. 63-66)

On “The Games People Play: Simulation and Its Discontents”: Turkle looks at simulation from the vantage point of games. Children learn the tools and concepts of a game early on, by getting a feel for them via practice. As rules became more complex, they lent credibility to the microworld: “A literature professor, commenting on a colleague’s children with their Christmas Nintendo gifts, likened their disclosure to that of Dante scholars, ‘a closed world of references, cross-references, and code.'” (p. 67)

On Sim games (SimCity, SimLife, etc): Simulation encourages players to develop understanding of rules and relationships, leading to estimation and intuition. Some relationships are very complex and are not understood, but this does not obstruct the interaction experience. (p. 69)

“It is easy to criticize the Sim games for their hidden assumptions, but it is also important to keep in mind that this may simply be an example of art imitating life. In this sense, they confront us with the dependency on opaque simulations that we accept in the real world. Social policy deals with complex systems that we seek to understand through computer models. These models are then used as the basis for action. And those who determine the assumptions of the model determine policy. Simulation games are not just objects for thinking about the real world but also cause us to reflect on how the real world itself has become a simulation game.” (p. 71)

Turkle defines two responses to simulation: rejection and resignation. A third response is to develop a cultural criticism: “This new criticism would not lump all simulations together, but would discriminate amongst them. It would take as its goal the development of simulations that actually help players challenge the model’s built-in assumptions.”

“Understanding the assumptions that underlie simulation is a key element of political power. People who understand the distortions imposed by simulations are in a position to call for more direct economic and political feedback, new kinds of representation, more channels of information. They may demand greater transparency in their simulations; they may demand that the games we play (particularly the ones we use to make real life decisions) make their underlying models more accessible.” There is nonetheless fear that the complexity and opacity imposed by simulations may never be cut through. A further concern is how to understand the relationship between reality and simulation.

Reconstructing ourselves in virtual communities: there is a complex interaction and relationship between ourselves and machines, and a “complex dance of acceptance and rejection to analogies of ‘the machine.'” There is an appeal to thinking of ourselves as cyborg and machine-like. (p. 177) Turkle describes IRC and chat, which, while textual and one of the least “dynamic” communication methods, are extremely personal. The relationship between self and textual identity is very close. (p. 179) Turkle describes anonymity and MUDs, which can be addictive. This seems to relate, in psychological terms, to the presentation of self. The concept of a persona is much more literal and explicit here. (p. 184)

In an interesting diversion, Turkle discusses tabletop games, citing a specific example wherein the games were used as vehicles for self-definition and epiphany. These are grounds for personal exploration and discovery. Later, MUDs serve a similar purpose, allowing expression of emotions difficult to express well in real life. They allow a certain degree of self-experimentation, and an ability to work through real issues. Because of their playful nature, they are not deeply binding. They allow experimentation with being a better self. (p. 186) “You are who you pretend to be”: identity is constructed by fantasy (this pulls back to role experimentation in sociology), and MUDs enable this as was never possible before. (p. 192)

Alternately, MUDs may be a place to reenact the problems of real life, and may become an addiction. (p. 199) The interesting comparison here is that Turkle’s examples so far seem generally positive: interaction with technology is a means to self-discovery, communication, and enlightenment. She does not delve deeply into the addictive “holding power” that games exert. While they can very much be devices for progression, that seems to be less the case nowadays. Is this something that has changed, or is it just a matter of changing perspective? Or is it a matter of the people who play games? It would seem that as a medium matures, its capacity for enlightenment would grow; if that is not the case, why not? Is it because of the capitalization of online entertainment, that games cannot be enlightening if they are to be sold for profit?

On to some negative qualities of online communication: MUDs enable “easy intimacy” where things can move along too quickly. Commitment is easy in the virtual world, but would be too much for real life. Furthermore, due to the lack of closeness and *embodied* intimacy, it is difficult to understand the degree to which the actual relationship exists. It may exist only in the interactors’ minds. There is a lack of (what sociologists might call) role support. The result leads to projection onto others: “In MUDs, the lack of information about the real person to whom one is talking, the silence into which one types, the absence of visual cues, all these encourage projection. This situation leads to exaggerated likes and dislikes, to idealization and demonization.” The situation is not all that different in modern multi-user environments, MMOs, and Second Life. (p. 206)

Gender play and MUDs: this is more literal in the sense that people *play* a gender. It draws attention to cross-gender portrayal, and to gender relationships and roles. It is also much more acceptable there to play at other genders. This gender play is also often about self-understanding and experimentation. (p. 214)

“For some men and women, gender-bending can be an attempt to understand better or to experiment safely with sexual orientation. But for everyone who tries it, there is the chance to discover, as Rosalind and Orlando did in the Forest of Arden, that for both sexes, gender is constructed.” (p. 223) These examples understand the male gender as constructed too, but modern games fail to account for this: they hand players pre-established gender models. This is generally not done with explicit interactions, but with subtle things like the determination of dress. Male is usually the assumed term, while female is external. Is this a new development that arrived with the game industry?

Netsex enables confusion over the question of trust and identity. (p. 225) Also deception: the textual nature implies an intimacy that is betrayed by deception. More recent environments seem much more jaded? (p. 228) Being digital raises new questions of being. It changes our perceptions of community, identity, relationships, gender, and other things. (p. 232)

On the erosion of the real:

Baudrillard ref: Disneyland and shopping malls are part of culture of simulation. They tend towards a culture of isolation and retreat. The loss of the real encourages this. (p. 234) Discussing the compelling nature of false Disney animatronic crocodiles, versus the imperceptibly slow real ones: “Another effect of simulation, which I’ll call the artificial crocodile effect, makes the fake seem more compelling than the real.” (p. 237)

Janice Radway ref, cross with Henry Jenkins: Engagement with media (romance, TV) offers resistance to “stultifying categories of everyday life.” This engagement is somewhat empowering, but also has other more disempowering, limiting effects (see Radway). (p. 241)

A consistent danger is that MUDs encourage people to solve non-real problems, by living in unreal places. Digital worlds enable exploration, but their appeal may be such that one may not wish to return from them. (p. 244)

Rape in MUDs: submission to the digital realm also involves the sacrifice of some autonomy. When one controls a rule-based avatar, the player’s engagement is confined by the rules; if those rules are compromised, or even if they are not (the rules are simply outside of player control, and possibly outside of player consent), it is possible for the player’s avatar to be compromised horribly. Self-identification and experimentation with avatars can lead to exploration and understanding, but can also lead to new forms of disempowerment and victimization. (p. 251)

A problem ultimately lies in the depth of the “emote”. Authenticity is irrelevant in a culture of simulation. Emotion displayed in a simulation is necessarily inauthentic. How does one understand feeling or emotion? Emotion or action may be easily displayed in a virtual world, but real emotion is not so easily displayed or understood. Does the emote simply stand for a reflection of Fredric Jameson’s “flattening of affect in postmodern life”? (p. 254)

Reading Info:
Author/Editor: Turkle, Sherry
Title: Life on the Screen: Identity in the Age of the Internet
Type: book
Context:
Tags: digital media, dms, cyberculture
Lookup: Google Scholar, Google Books, Amazon

Sherry Turkle: The Second Self

[Readings] (08.29.08, 4:35 pm)

Overview

This is one of Sherry Turkle’s earlier books, and the crux of this is the understanding of the computer as an evocative object. It calls into question various preconceptions and understandings that we have of ourselves: what it means to be human, what it means to think. By being a fuzzy object that expresses some humanlike characteristics but not others, it leads us to examine and return to the nature and definition of those characteristics. In this book, Turkle looks specifically at cognition, learning, programming, and the various cultures that have emerged around computation. While a bit dated, her findings certainly inform an understanding of the development of culture and computers to the modern day.

Notes

Turkle opens with an analysis of the “Wild Child” who was discovered in France in the year 1800. This occurred shortly after the French Revolution, while theories of human nature and culture were wildly fluctuating, and the Wild Child became a “test case” for understanding many of those theories. He was an evocative object, and our understanding of him challenged and provoked new understandings of what it means to be essentially human, and of our relationship with nature. (p. 12)

The computer is similarly an evocative object because of its holding power, which “creates the conditions for other things to happen.” (p. 15) It provokes self-reflection. “The computer is a ‘metaphysical machine’, a ‘psychological machine’, not just because it might be said to have a psychology, but because it influences how we think about our own.” (p. 16)

At work here is a symbolic embedding of human concepts within devices. This difference is cognitive, not embodied, but human ideals and values are put into the machine. As a product of this interaction, the machine’s concepts and values [as in all communication] return to the user. The computer does not have a mind, but reflects the minds of its creators and programmers. (p. 17)

Turkle on technological determinism vs. attribution: determinism claims that technology has a long-term impact on people, while attribution claims that technology only has the meanings given to it, and can only be understood in terms of those meanings. Both are wrong: the computer evokes rather than determines, and the opacity of the computer prevents attribution. (p. 21)

The evocative computer, through engagement with ourselves, encourages us to think introspectively and become philosophers. It asks: what does it mean to be human? Behind the anxiety in the popular reception of AI is a concern over what it means to think. This is similar to the 1950s anxiety over nuclear holocaust (as evidenced in film and elsewhere), and the subtle sexual anxiety that underlies Freudian psychology. (p. 24)

Evocative objects are on the borderline, and inspire us to break them apart in order to understand them. They are marginal: too far from humans for real empathy, but autonomous enough to be ambiguous. Turkle draws on Piaget’s developmental studies of metaphysics, which inspire transcendent questions. (p. 32)

There is a conflict between things and people. Contrast things with The Sims: the border here is especially fuzzy. Software is an extension of the computer; Turkle is discussing physical devices, but software adds an extra layer to this understanding. There is a meta-consideration of the concept of intelligence or humanity: rational explanation undermines it, for computers are rational but not human. (p. 61)

On the seduction of games: “Those who fear the games often compare them to television. Game players almost never make this analogy. When they try to describe games in terms of other things, the comparison is more likely to be with sports, sex, or meditation. Television is something you watch. Video games are something you do, something you do to your head, a world that you enter, and to a certain extent, they are something you ‘become’. The widespread analogy with television is understandable. But analogies between the two screens ignore the most important elements behind the games’ seduction: video games are interactive computer microworlds.” (p. 66-67)

Games lend themselves to immersion, and to the feeling of being cut off from everything outside the magic circle. For some (notably Jarish, one of Turkle’s primary informants in this chapter), games provide an immersive environment that is a comfortable alternative to the real world, whose complexities cannot be fully understood. The world of games is whole and knowable; it can be a pleasing alternative to reality. (p. 72)

On Woody Allen and the interactive novel: this echoes adaptation and fanfiction. The key is immersion, being in a world. This aspect counters the classical understanding of phenomenology, which wants to look only at the real physical world. A difference is in identity: computers enable the projection of other identities, and allow for self-insertion, role-playing, and controlling characters. (p. 77) A reference to Gone with the Wind: also a matter of world construction. Clearly the present-day game industry is quite different. (p. 78)

On the culture of simulation: “Video games offer a chance to live in simulated, rule-governed worlds. They bring this kind of experience into the child’s culture and serve as a bridge to the larger computer culture beyond. They are not the only agent to do so. Reinforcements come from surprising quarters. Children come to the video games from a culture increasingly marked by the logic of simulation.” (p. 79) Later Turkle discusses Dungeons and Dragons, which operates as a rule-governed fantasy simulation.

On “playing house” style games: Turkle places rules in opposition to empathy, but rules [especially social ones] underlie even abstract fantasies. These enable experimentation and play with real domains. Sociologists describe the understanding of rules and structures that occurs within society, and see “playing house” games as role-learning. There is no simulation proper, but there is a reflectivity that is also present in simulation. (p. 83)

Papert and LOGO: learn to speak French the way French children do, by speaking French to French-speaking people; learn mathematical logic by speaking math to a mathematical entity. This is Piagetian learning; it happens automatically in the right circumstances. From translation studies, we know that language implies culture. (p. 97)
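A minimal concrete illustration of this idea, using Python’s built-in turtle module (a direct descendant of LOGO’s turtle graphics); the square-drawing program is my example, not Papert’s:

```python
# "Speaking math" to a mathematical entity: a square emerges from the
# geometry of four equal sides and four 90-degree turns, not from a
# memorized formula. Python's turtle module is a direct LOGO descendant.
import turtle

t = turtle.Turtle()
for _ in range(4):
    t.forward(100)  # one side of the square
    t.left(90)      # one right-angle turn

turtle.done()       # keep the drawing window open
```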

Learning and mastery: various styles, both hard and soft, are valid. This leads to wondering: what of the middle ground between them? The top-down style can be disastrous if the initial plan is faulty. The bottom-up style may never get anywhere due to its lack of structure. What of a middle-out strategy? (p. 105)

Programming is an ambiguous field through which people may explore methods for understanding reality. Naturally, this leads to an exploration of gender. This is enabled by the objectivity of the computer. “Approximate models” could be simulated and addressed reflectively. (p. 109)

Adolescence is characterized by self-discovery and self-definition. The example here is a girl for whom power was threatening, but for whom constraint enabled control. The computer enables a “world apart” for building a new self-image. (p. 145)

Reflection is an externalization of the self. Computation and conceptual metaphors offer a new means for looking at the self, in relation to the machine and to the world. (p. 155) Children eventually turn to using computational metaphors to describe themselves, and this can extend to culture at large. Has this taken place?

The computer is a catalyst for culture formation. This is even more true with widespread internet adoption, and is discussed in “Life on the Screen”. A new computational aesthetic enables a new cultural understanding. It enables previously inaccessible understanding: not just information, but knowledge. Consider Wikipedia and the like. (p. 166)

A computer is an object to think with and more, it is a building block around which may emerge new cultures and values. “The men and women I am writing about here also used the computer as an ‘object-to-think-with.’ But here the computer experience was used to think about more than oneself. It was used to think about society, politics, and education. A particular experience of the machine–only one of the experiences that the machine offers–became a building block for a culture whose values centered around clarity, transparency, and involvement with the whole. Images of computational transparency were used to suggest political worlds where relations of power would not be veiled, where people might control their destinies, where work would facilitate a rich intellectual life. Relationships with a computer became the depository of longings for a better, simpler, and more coherent life.” (p. 173-174) Here, again, the computer is a vehicle for utopian thinking. However, while the utopia may not be realized (and indeed, it hasn’t been), these computational values have definitely been adopted within computer culture.

Games feel more knowable than the real world, with its depth and our incomplete engagement with it. This is the opposite of the phenomenological expectation. It relates back to mechanization and mechanical reproduction: in a mechanized world, the person is a cog and may only see a part. In a game world, a person may see the wholeness of the system and attain mastery over it. (p. 186)

Relationships with computers: Children: keep it mysterious. Adults: make it transparent, want total understanding. (p. 195) This is not to say that children do not try to understand them, but they use computers as a vehicle for deeper philosophical understanding of things. Adults will use computers to escape the overbearing gravity and complexity of the world. (p. 192)

On the controversy of “The Hacker”: The concern is over the relationship to engineering tools, but this concern does not apply to artistic tools. Culture accepts an artist’s relation to tools as being intimate, but this seems over the line when extended to engineering tools. The danger of the hacker is the rejection of physical embodied life for the purity of the machine. (p. 205) “They are like the virtuoso painter or pianist or the sculptor possessed by his or her materials. Hackers too are ‘inhabited’ by their medium. They give themselves over to it and see it as the most complex, the most plastic, the most elusive and challenging of all. To win over computation is to win. Period.” (p. 207)

Science fiction, literature, and hacker culture evolve around the desire to control and master, imposing “hacker values” of individualism, mastery, and nonsensuality onto literary worlds. The game industry has been built around hacker culture. This may go some way toward explaining the games we have now. (p. 222)

Hacker culture is built around an intimate identification with the object. Baudrillard definitely carries through here. The purity of the object is pure seduction. (p. 238)

Turkle discusses Newell and Simon’s General Problem Solver as a big step in AI. AI models make predictions about how people think, but they address only certain types of problems, and model only certain types of thinkers. One of the projects is a computational model of Freudian slips. (p. 244)

The Freudian slip program was evidently made by Don Norman. There is a difference between thinking and feeling behind the Freud model here. The machine implies intention; the human implies mistake. How could a simulation “make a mistake”? Searle criticizes AI for its lack of intentionality, but the problem here seems to be a lack of involuntary behavior. “Freud saw meaning behind every slip. Sometimes the meaning was obvious, sometimes it had to be traced through complex chains of association and linguistic transformations. Norman’s emphasis on computational representation draws attention not to meaning but to mechanism.” (p. 248)

Again on computational anxiety: “Behind the popular acceptance of the Freudian theory was a nervous, often guilty preoccupation with the self as sexual; behind the widespread interest in computational interpretations is an equally nervous preoccupation with the self as a machine.” (p. 299) A thought on why some models are so powerful and compelling.

Paradox is present in machines and has a power (per Gödel’s incompleteness theorem). It makes machines, and the mathematical logic underlying them, complex enough to reflect the potential for paradox, and gives them a further, human-like depth. (p. 305)

“Ours has been called a culture of narcissism. The label is apt but can be misleading. It reads colloquially as selfishness and self-absorption. But these images do not capture the anxiety behind our search for mirrors. We are insecure in our understanding of ourselves, and this insecurity breeds a new preoccupation with the question of who we are. We search for ways to see ourselves. The computer is a new mirror, the first psychological machine. Beyond its nature as an analytical engine lies its second nature as an evocative object.” Computations provide a mirror, but reflect the self as a machine. (p. 306)

Reading Info:
Author/Editor: Turkle, Sherry
Title: The Second Self: Computers and the Human Spirit
Type: book
Context:
Tags: specials, digital media, cyberculture
Lookup: Google Scholar, Google Books, Amazon

Don Norman: The Psychology of Everyday Things

[Readings] (08.29.08, 4:33 pm)

Overview

In this classic and seminal work, Norman canonically covers topics of the design of objects and devices, using copious examples of bad design to illustrate how good design may be achieved. Norman’s perspective is that of a usability expert and an engineer. He is interested in users’ cognitive models of devices, and how easily they may execute their goals using those objects. In this sense, he uses a classic cognitive model of problem solving, very reminiscent of Newell and Simon’s General Problem Solver.

Norman’s key points are on the cognitive models of objects that users form, the transparency with which objects enable users to discover that model, and how the affordances of objects map onto these models. Objects should use feedback to make their states transparent, so that they may be readily understood.

In studying human engagement with artifacts, Norman uses a cycle (familiar to cognitive science) of goal formation, execution, and evaluation. Norman outlines seven stages of action (a toy code sketch of the cycle follows the list):

  1. Forming the goal
  2. Forming the intention
  3. Specifying the action
  4. Executing the action
  5. Perceiving the state of the world
  6. Interpreting the state of the world
  7. Evaluating the outcome
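As a toy illustration (my construction, not Norman’s), the seven stages can be written as an explicit loop. Everything here is a hypothetical sketch: the callbacks, names, and light-switch example are assumptions, not anything from the book.

```python
# A toy sketch of Norman's seven-stage action cycle as an explicit loop.
# Stages 1-4 roughly span the "gulf of execution"; stages 5-7 the "gulf
# of evaluation". All names here are illustrative assumptions.
from typing import Callable

def action_cycle(goal: str,                       # 1. forming the goal
                 act: Callable[[str], str],
                 satisfied: Callable[[str], bool],
                 max_tries: int = 5) -> bool:
    for _ in range(max_tries):
        intention = f"intend to: {goal}"          # 2. forming the intention
        action = f"do: {intention}"               # 3. specifying the action
        world_state = act(action)                 # 4. executing the action
        percept = world_state                     # 5. perceiving the state of the world
        meaning = percept.strip().lower()         # 6. interpreting the state of the world
        if satisfied(meaning):                    # 7. evaluating the outcome
            return True
        # Expectation and feedback conflicted; iterate, since the model
        # treats error identification as a normal part of engagement.
    return False

# Usage: a light switch whose only feedback is the room's state.
room = {"lit": False}
def flip_switch(_action: str) -> str:
    room["lit"] = not room["lit"]
    return "room is lit" if room["lit"] else "room is dark"

print(action_cycle("light the room", flip_switch, lambda s: "lit" in s))  # True
```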

This clearly operates in opposition to a more phenomenological reading of interaction. It stands by the notion of man as a reasoning animal, so this cycle and its iterations (identifying error and the like) are a normal part of engagement with the world. Frustration occurs when expectations conflict with feedback during execution and evaluation.

Where Norman’s theory fails is in his treatment of aesthetics (which are always subservient to the functional capacity of an object), and in cases where user intention is totally unaccounted for in the original design. This makes Norman’s approach conflict strongly with creative tasks. The best example of this is the Lego motorcycle (p. 83), which Norman seems to believe has a “correct” assembled shape.

Reading Info:
Author/Editor: Norman, Donald
Title: The Psychology of Everyday Things
Type: book
Context:
Tags: digital media, dms, hci
Lookup: Google Scholar, Google Books, Amazon

Brenda Laurel: Computers as Theatre

[Readings] (08.29.08, 4:30 pm)

Notes

Chapter 1: The Nature of the Beast

Laurel opens with Spacewar, the first computer game, developed at MIT in 1962. The game was the “natural” thing to do with the computer when it arose. Laurel claims that this is because games have the capacity to represent action, and the key ingredient is the human element. That fact makes the interface a matter of significant importance. Agency is shared: both the user and the computer act, and they have a common context for action. This idea is required for the standard dialogue and communication model of UIs. The idea of agency and action is derived from and anchored in Aristotle.

Laurel looks at psychology, which has been a longstanding influence in HCI (especially through Don Norman), and finds that psychology and theatre are closely related. Psychology “describes what goes on”, and theatre “represents what might go on”. The approach to interfaces defined by Norman and in psychology in general are found to be the same as the definition of theatre: “representing whole actions with multiple agents.” (p. 7)

Laurel finds fault with traditional definitions of interfaces. An interface may be a simple layer between the user and the machine, but cognitive science introduces the notion of mental models, so the user and computer have models of each other. However, this can easily turn into a “horrible recursion” of self-reference and abstraction. Another classical approach is the view of the interface as a prosthetic, mapping nearly one to one between the user’s actions and the machine’s. To Laurel, none of these approaches is satisfactory, so she turns to a theatrical model of interfaces. The idea is that the program performs for the user, and that the UI should be modeled after the technical support for a theatrical production, which, when it works, is totally invisible and does not intrude on the user’s experience. This idea seems to reverberate in contemporary “rich client” design, which uses display effects and lots of theatrics and feedback to demonstrate interaction. However, as in many theatre productions, technical support does not always conceal the nature of the play (for instance, the moving mechanical set in Sweeney Todd). Not all models of theatre aspire to transparency; some are highly reflective, and this makes quite a bit of sense, especially when connected to rich UIs.

The challenge with the theatrical model of the interface is the user’s role within the production. Clearly, a play cannot easily welcome audience participation. Instead, Laurel proposes that the interface treat the user as a performer (or character), and this again echoes others, notably Goffman. This tradition is generally known as scripting the interactor, and is commonly used in tutorials and games. Defining interactivity leads to another conceptual problem, one that also percolates into Chris Crawford’s work on interactive storytelling. Laurel’s definition is this: “It enables you to act within a representation that is important.” (p. 21) This definition complicates standard assumptions about interfaces, but is nonetheless quite evocative. It depends, however, on ideas of action and representation that need to be fleshed out further.

Imagination plays a significant role in models: it is used by humans for planning (in the traditional AI sense), but it is also generally used for much more. Art is an external representation of imagination. The things represented in art often come with whole worlds of meaning, even when the things are wholly imaginary. The idea of an interface metaphor convolutes the relation between representation and reality, whereas the language of theatre better establishes that relation as a knowable boundary.

Chapter 2: Dramatic Foundations

The focus of drama here is of the Aristotelian variety, focusing squarely on the Poetics. The question “Why Aristotle?” is addressed; the answer seems to be that Aristotle is complete, and one might also say that Aristotle is easy to model. The Greek cultural frame of drama, specifically the idea of divine possession, inspires a more modern sentiment of immersion, which links to immediacy and transparency where interfaces are concerned. Divine inspiration precludes the boundary of a medium.

Aristotle outlines four causes of representation, and Laurel expands on these and applies them to HCI. Programs themselves are difficult to reduce to this model because of their complex layers and patterns. Laurel’s investigation of programs is persistently grounded in the user experience of the software. So, while a program may be very good at a task, if it is nearly impossible for users to perform it, the task is not effectively performed by the software. Laurel finds two things specifically: “functionality consists of the actions that are performed by people and computers working in concert, and programs are the means for creating the potential for those actions.” And: “The most important way in which applications, like plays, are individuated from one another is by the particular actions they represent.” (p. 45)

  • Formal cause: This is the idea of applying to the essence of something, the abstract idea that the representation aims to represent. The formal cause of a play is the action and plot that is performed. The formal cause for a human/computer action is a representation of action (the functionality?)
  • Material cause: This is the physical construction of an object, its substance and composition. The material cause of a play is its enactment. The material cause of an interface (not a program!) is its enactment as well, the presentation and feedback given to the user.
  • Efficient cause: This is how the object is made, involving the maker and the tools. The efficient cause of a play is its technical construction, and with interfaces it is the same. However, interfaces are also very notable in their use of models and concepts, especially in reference to the underlying application itself.
  • End cause: This is the purpose of the thing in question in its existence and application. The end cause for a play would be its catharsis, for the Aristotelian model. The end cause for an interface is the successful use of the functionality by the user.

Laurel also presents Aristotle’s six elements of qualitative structure in drama: action, character, thought, language, melody/pattern, and spectacle/enactment. Each of these is connected to UI aspects as well as classical dramatic ones. This is notably the same scale used by Mateas and Stern in describing Facade. Material causes propagate backwards, originating in the enactment and affecting the other levels towards the top. Formal causes originate at the level of action, and then move downward, affecting each other aspect of the production. The conflict framed by this confluence of factors resembles the problems found in design of all sorts. Even still, this stratification of elements is found in many other sources (pertaining to architecture and design among other things), not all of which are Aristotelian.

In terms of the behavior of programs, Laurel emphasizes that it is important to connect the idea of character. For Aristotle, a good (virtuous, even) character is one who successfully transforms thought into action. A virtuous object is one which fulfills its purpose; this idea also implies that a good character is one who fulfills expectations. With this in mind, the theory aspires to mythologically emphatic plots and closely prescribed experience, as well as to transparent user interfaces. What complicates the situation is that characters and interfaces must be appropriate for the function or action at hand.

Reading Info:
Author/Editor: Laurel, Brenda
Title: Computers as Theatre
Type: book
Context:
Tags: digital media, dms, cybertext
Lookup: Google Scholar, Google Books, Amazon

Brown and Duguid: The Social Life of Information

[Readings] (08.29.08, 4:29 pm)

Overview

Brown and Duguid in this book look at the growing futurism of the information age and present a variety of critiques of common claims that digital technology will change the face of the world. The book was first published in 2000, as many of the fantastic endeavors of the dot-com boom were beginning to collapse. Given sufficient retrospect, Brown and Duguid’s warnings that the internet would not totally change the face of business (among other things) seem obvious and unsurprising, and the futurist claims now seem facile. They ultimately do not pin down an answer to the question of where technology is taking society, but they do assert that tunnel vision will not yield the correct answers; rather, they advise, it is important to look around. Specifically, they emphasize that the entities threatened by new technology will adapt in self-preservation, that the existing status quo is a product of evolution and has many benefits that technology cannot replace, and that technology will fall on its face if it is unable to address social needs.

The book is published by the Harvard Business School Press, and this betrays some of the interest underlying the exploration in the text. Brown and Duguid are interested in how technology shapes the economic and business world, not necessarily its inherent or expressive capabilities.

Notes

In the preface (added in 2002), Duguid and Brown say “We are not trying to ignore or denigrate information technologies (the term Luddite nonetheless appeared intermittently in conversations). Rather, we are arguing that by engaging the social context in which these technologies are inevitably embedded, better designs and uses will emerge.” This places a usability/design spin on the interpretation. (p. x) They also compare technology as a replacement for humans (or human processes) versus as an augmentation. This echoes the early division between Engelbart and common AI modes of thinking. (p. xii)

The discussion at this early point is about design: coming from the perspective of information versus social consideration in design objectives. (p. 4) A recurring theme in the book is the “6-D’s” or “6-D vision”, a set of theories which emerged around digital technologies: “demassification”, “decentralization”, “denationalization”, “despatialization”, “disintermediation”, “disaggregation”. These visions predicted fundamental change at every level of our social structure, involving the demise of many of the most stable institutions, notably offices and work spaces, firms and corporations, universities, national governments, etc. Evidence and hindsight indicate that these have, indeed, not been realized. The problem is a matter of perspective: if information is seen as the cause of everything, then a change in the transmission of information will create dramatic results. The authors seem to imply that the abundance of information technology has led to the vision that everything is mere information. (Sounds kinda like Raymond Williams.) (p. 22)

Predictions miss due to a lack of anticipation of other factors arriving from “around”, not just in the direction of the technology (as in 1950s futurism). A question to consider: what external forces shape the development of information technology today? What are the next revolutions that will occur (or have occurred) in parallel, unanticipated by information futurism? (p. 32)

On agents: the authors use a robust understanding of agents and their applications, but tend to veer down the course of doomsaying to a significant degree. They share Weizenbaum’s assessment that agents are made to seem more like people, and vice versa. (p. 39) There is a great deal of fear over the ambiguity and opacity of agents: when we do not know what goes on under the surface, they may not be innocent. Do people expect agents to be innocent? People may be more jaded now, but we still use Amazon recommendations and various Google technologies without complaint. Are they any more different or less scary now? (p. 45) There is further discussion of agent brokering, and worrying remarks over whether people can tell the difference between an inept and a corrupt agent. Compare with Jesse James Garrett, on how agents could automate many dimensions of problem-solving and search processes. (p. 48) On agents and the social/cultural issues of goal-oriented behavior: the authors discuss “human foibles” as evidenced by shopping at a supermarket, and how decision making changes very rapidly. Also, human rules (in terms of how to perceive and select, etc.) change rapidly and are more volatile than agents could hope to be. (p. 51) Doomsaying is also like tunnel vision: it neglects the many states and measures in place to track agent behavior, and autonomy is extremely limited. So the fear that agents would replace human behavior seems slightly misinformed. (p. 55) This begins to follow an argument towards embodiment, but stops short of making it explicit. The authors do not discuss other forms of agents (for instance in games, or specifically The Sims) whose goals are to simulate and intentionally represent human foibles to some degree. One could argue that human understanding of the world (and our decision changes) is already the result of manipulation and preference abstraction as represented by advertising stereotypes. So agents could *potentially* enlighten this matter were they transparent. The authors do not go so far as to suggest that agents could have a place in the world that would not undermine its moral foundation. This fear seems as flawed as the idyllic holy land predicted by futurists.

Brown and Duguid move on to some traditional environments which information technology “threatened” to devastate, where the replacement version of the process or environment turned out to be less capable of satisfying social needs than the traditional one. The problems produced sound very similar to Norman’s common frustrations with everyday objects. (p. 72) Part of this, again in hindsight, indicates a fledgling technology/medium, where new experiments and changes are made at a rapid pace and the successful survive. Part of this critique sheds light on an interesting issue, though: existing institutions and processes tend to emerge via evolutionary means; that which creates success will spread by natural selection. But the problems are being critiqued in a manner that sounds like a problem of design. Could Norman’s principles of transparency and affordances, and his model of process-oriented interaction, be applied to constructs like an office? A worthy question to consider.

Some embodiment is discussed here: it sounds like digital media does not enable or is not equipped to handle embodiment? “Putting this all on the desktop, while supporting the individual in some ways, ignores the support and knowledge latent in systems that distribute work.” This argument sounds a lot like Norman, but instead of frustration from execution/evaluation, it is from the designer’s lack of accounting for human social structures. (p. 80) An example of good social design is how Apple distributed computers in schools very liberally, which led to a wide consumer base. Additionally, IBM and Microsoft benefited from widespread distribution of IBM machines in a corporate environment.

Discussing knowledge and information: the classical concern is to find the difference. The flaw of digital futurism is to equate information with knowledge. The difference seems to involve a few details: knowledge involves a knower (***), who actually knows the material in question. It also reflects a matter of knowing how, versus knowing that. Knowledge seems to involve a “gradual assimilation”. Furthermore, knowing how is also a matter of knowing to be. (p. 119) Knowing how requires practice. (p. 128)

Knowledge relates to learning, while information relates to search. Learning seems to require a community? The discussion here is focusing on business practices rather than more abstracted knowledge. Learning requires practice and peer support. (p. 125)

Knowledge is leaky in some areas and resistant in others; these differences seem to be community-oriented. Duguid and Brown bring up the example of Steve Jobs visiting Xerox PARC and coming away with UI ideas, where the rest of the Xerox community was ill-receptive to them. (p. 151) The ability of information to disseminate like that and find footing in diverse areas is one of the greatest strengths of the internet, but it goes oddly unmentioned. Instead, they argue that the firm will persevere in its role of nurturing new businesses, and that regional clustering will be a significant factor in the development of business. This is true, but it is foolish to ignore the many affordances of digital media for disseminating knowledge in other forms.

Document design is discussed next, and how paper has grown despite many heralding its demise. Much of this revolves around the embodied nature of documents and the context that surrounds them. They suggest that digital technology can learn a lot from traditional document design, and I think that history has shown that it has. Metadata and community-based aggregation services/technologies/portals have applied copious metadata to form context in many electronic documents. This change is a strong feature of modern web design. History (at all levels) has shown that, yes, metadata is important. (p. 205)

The authors leave us basically without answers, other than to avoid tunnel vision. They reiterate that technology will not succeed without accounting for social context, but never once imagine what capability technology might have were it to do so. They assert that institutions (the firm, the paper document, the university) evolved and will continue to operate, as they are evolving structures. Technology will not eliminate them. BUT technology will change them, and the question of how is left unaddressed. Some institutions, e.g. copyright, have many forces supporting their continued existence, but clearly the digital age and the ease of replication of data cannot leave copyright unchanged. There are many directions and possibilities for these changes, but Brown and Duguid are uninterested in identifying the problem or stressed areas, or in taking a position on how or what changes may or should occur. (p. 252)

Reading Info:
Author/Editor: Brown, John Seely and Duguid, Paul
Title: The Social Life of Information
Type: book
Context:
Tags: digital media, dms
Lookup: Google Scholar, Google Books, Amazon

Espen Aarseth: Cybertext

[Readings] (08.29.08, 4:27 pm)

Overview

Aarseth attempts in this work to catalogue and develop a theory of nonlinear (also multilinear and otherwise) texts, both of the electronic and paper variety. He addresses the complexity of understanding the nature of a text: its access, interaction, and phenomenological presence. The variety of these devices leads to what he calls cybertext, which differs in underlying ways from narrative.

Notes

The act of reading can be interpreted as a kind of power play. This involves a sense of safety, which can be compared between things such as tabletop and computer simulation. Power grants added control, but with extra immersion and investment. Reading itself is not devoid of power, but it is a more subtle kind (Intro, 5). Oral storytelling traditions involve significantly more interactive and participatory characteristics. Some of this is phenomenological and rooted in sense of place. Conversation can be seen as heavily interactive, whereas dialogue is less so (Intro, 15).

Aarseth discusses semiotics in the first chapter, exploring means of interpreting signs in cybertexts (specifically games) in a literal manner, i.e., each visual element in the game is some sort of sign unit. Signs need not be human-interpretable, though, and a sign system need not be inherently linear, either. Could some cybertexts exist without inherent logical/symbolic interpretation (Paradigms, 31)? Aarseth emphasizes the duality of code versus the execution of code, which heavily distorts the symbolic interpretation (Paradigms, 40).

Aarseth explores some types of interactivity. He cites Peter Bøgh Andersen (1990: “A theory of computer semiotics: Semiotic approaches to construction and assessment of computer systems”): “An interactive work is a work where the reader can physically change the discourse in a way that is interpretable and produces meaning within the discourse itself. An interactive work is a work where the reader’s interaction is an integrated part of the sign production of the work, in which the interaction is an object-sign indicating the same theme as the other signs, not a meta-sign that indicates the signs of the discourse.” Compare with Crawford and others.

One of the most interesting features of Aarseth’s work is his statistical typology of cybertexts. The statistical approach is less absolute or conceptual, and more empirical and analytic. He categorizes works along several variables: Dynamics, Determinability, Transiency, Perspective, Access, and Linking (Textonomy, 62).
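A sketch of the typology as a data structure may help make the “statistical” flavor concrete. The six variable names come from the notes above; the value sets are my approximation of Aarseth’s categories, and the two sample classifications are illustrative assumptions, not Aarseth’s own tables.

```python
# Aarseth's typology variables encoded as a dataclass. Value sets are an
# approximation of the categories in Cybertext (Textonomy chapter) and
# should be checked against the book.
from dataclasses import dataclass
from typing import Literal

@dataclass
class CybertextProfile:
    title: str
    dynamics: Literal["static", "intratextonic", "textonic"]
    determinability: Literal["determinate", "indeterminate"]
    transiency: Literal["transient", "intransient"]
    perspective: Literal["personal", "impersonal"]
    access: Literal["random", "controlled"]
    linking: Literal["explicit", "conditional", "none"]

# Illustrative classifications (my own, for contrast):
codex_novel = CybertextProfile("printed novel", "static", "determinate",
                               "intransient", "impersonal", "random", "none")
text_adventure = CybertextProfile("text adventure", "intratextonic", "determinate",
                                  "intransient", "personal", "controlled", "conditional")
```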

Aarseth discusses Mary Ann Buckles (1985: “Interactive Fiction: The storygame ‘Adventure'”), which sounds like a thing to look up. Aarseth also discusses the differences between plot (sjuzet) and story (fabula), as well as drama and intrigue. These are in the context of the adventure game, exploring how they are dynamically intertwined and related (Adventure Game, 112). Aarseth also describes many IF works as functionally autistic (115). “Personal relations and habits in an adventure game like Deadline might best be described as autistic. The Encyclopedia Britannica defines autism as ‘a neurobiological disorder that affects physical, social, and language skills.’ Further, ‘it may be characterized by meaningless, noncontextual echolalia (constant repetition of what is said by others) or the replacement of speech by strange mechanical sounds. Inappropriate attachments to objects may occur. There may be underemphasized reaction to sound, no reaction to pain, or no recognition of genuine danger, yet autistic children are extremely sensitive’ (Britannica Online, ‘Autism’)”.

In Cyborg Author, Aarseth criticizes dramatic theory. Finally, he critiques literature as an ideal for cybertext. “To achieve interesting and worthwhile computer-generated literature, it is necessary to dispose of the poetics of narrative literature and to use the computer’s potential for combination and world simulation in order to develop new genres that can be valued and used on their own terms.” (The Cyborg Author, 141).

Aarseth explores the MUD, describing it as a symbolic exchange environment. Not necessarily utopian, but unmoderated, open. Consider application to game idea (!). Compare to Second Life, others.

Ruling the Reader describes various kinds of general communication: reading, listening to reading, listening to a lecture, conversation. Each has significant differences, and each is a different form of communication and participation.

Reading Info:
Author/EditorAarseth, Espen
TitleCybertext: Perspectives on Ergodic Literature
Typebook
Context
Tagsdigital media, dms, cybertext
LookupGoogle Scholar, Google Books, Amazon

Umberto Eco: The Open Work

[Readings] (08.29.08, 4:26 pm)

Overview

The Open Work (Opera Aperta in its original Italian) is Umberto Eco’s first book on the subject of semiotics, although it was not considered such at the time. Eco is concerned with the evolution and values of open works, where openness means freedom of interpretation and meaning making: openness depends on the freedom of an observer to interpret or explore meaning within a work.

The Open Work is a reaction against Croce, Eco’s predecessor in Italian aesthetics, whose idealism dominated Italian criticism of the period and who strongly emphasized the idea of pure meaning and authorial intent.

Semiotics as encyclopedic: sense/meaning derives from rules applied to sign systems. Infinite semiosis implies that language cannot touch the world (Wittgenstein?); in AI this connects to the problem of infinite regress.

“Meaning is an infinite regress within a closed sphere, a sort of parallel universe related in various ways to the ‘real’ world but not directly connected to it; there is no immediate contact between the world of signs and the world of the things they refer to.” (p. xxii)

The modern work is a representation of the knowledge of the contemporary world, and of the contemporary crisis. Eco argued that via formal ambiguity, art makes a political stand: breaking down form is a political act, an idea that still reverberates in the avant garde (p. xx), though Eco toned this down during his semiotic period.

Chapter 1: The poetics of the open work

Eco lists a number of composers and their works that incorporate degrees of freedom for the performers. It is interesting to connect this to the composer Sylvano Bussotti, part of whose piano sheet music was included as the header for Deleuze and Guattari’s chapter “Rhizome” in A Thousand Plateaus.

This can connect with fan/participatory culture in terms of remixing movements.

“Every reception of a work of art is both an interpretation and performance of it, because in every reception the work takes on a fresh perspective for itself.” Openness is an interpretive freedom.

The next question: why does the artist need to include this openness?

Platonic form argues for closure: there is only one aesthetically right way to do something.

This changes in medieval interpretation, where scripture was read according to moral, allegorical, and anagogical dimensions (alongside the literal), and interpreted and applied to new meaning. This is not indefinite or open, but rather constrains interpretation along four channels. This reflects the order of a society that is imperial and theocratic.

Eco moves on to the Baroque, which has an interesting style: its richness and complexity (between extremes of solid and void, light and darkness, curvature, etc.) and a certain plasticity all demand that an observer witness the work not just from one perspective, but move around to better absorb the movement and dynamism of the Baroque form.

Baroque culture emphasizes a certain creativity in man: since the Renaissance, man has become a puzzle solver, a creator, etc. However, the Baroque still retains a significant degree of codification and rigidity in its structure.

Eco considers Romanticism next, and the emergence of pure poetry, which is by nature abstract, blurry, and interpretive. The pleasure of poetry is in guessing. This leads to the idea of suggestiveness, which attempts to create openness for the reader.

On to the death of authorial intent?

Contemporary openness (Kafka, Joyce) leads to the construction of worlds, which are self-contained microcosms. These worlds reflect the incarnation of ideas, certain *senses*, which are arguably the real meanings of an open work.

The open work requires the reader to *make* the composition with the composer.

Artistic forms and their aesthetics/poetics reflect the way that science or culture views reality. An example of this is how the emergence of the “field” in physics influences the manifestation of cause and effect in artistic works. Similarly, Eco discusses the logical problem of binary logic. This idea relates to the “Law of the Excluded Middle,” a foundation of classical mathematics that has been questioned by some logicians (the intuitionists, notably). Further examples are mathematical incompleteness (Gödel, notably), as well as Einsteinian relativity, Heisenberg’s uncertainty, etc.
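As a minimal formal gloss on the principle at stake (my illustration, not Eco’s):

```latex
% The law of the excluded middle: for any proposition P, classical
% logic takes P \lor \neg P as an axiom. Intuitionistic logic
% (Brouwer, Heyting) rejects it as a general principle: asserting
% "P or not P" demands a constructive proof of one of the disjuncts,
% and double negation elimination (\neg\neg P \rightarrow P) is
% likewise not generally valid.
\[
  P \lor \neg P
  \qquad \text{(classical axiom; rejected intuitionistically)}
\]
```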

Openness is a fundamental part of perception. We can observe and interpret a work, but essentially never exhaust it.

For artistic creation, the artist’s role is to start a work, and the role of the viewer (addressee) is to finish it. The “completed work,” which arises from the interpretation of the observer, still belongs to the artist in a sense, but must also belong to the viewer. Open works are never quite the same, but they are also never gratuitously different. The open work is still constrained in its outcomes and limited in that it remains grounded within an ideology.

Eco concludes with some bullet points:
1) Open works are in movement and are characterized by an invitation for the observer to make the work with the author.
2) Of all works in movement, some are open for interpretation and uncovering on the part of the observer.
3) Every work is open to degrees of interpretation.

Eco stops short of connecting the worlds defined by open works: each open work is a field of possible interpretations, which together define a world of connected meanings that is consistent and can be navigated by observers. At the same time, this requires certain degrees of accessibility. Eco stops short of allowing these worlds of works to connect to each other. This is what Deleuze and Guattari would argue: that the worlds of meaning set up by each work connect to each other, and to the areas from which they borrow references and ideas.

Dimensions of openness, openness and games: games may be very prescriptive, even despite interactivity, leaving no ambiguity in terms of meaning making. Contrasting this, we can explore meaning making in games that have essential ambiguity (like Metal Gear, which breaks the fourth wall at points). Other games allow a great deal of freedom in understanding meaning. This openness seems to derive from inherent ambiguity in some works.

Reading Info:
Author/EditorEco, Umberto
TitleThe Open Work
Typebook
Context
Tagsdms, philosophy, narrative, media traditions
LookupGoogle Scholar, Google Books, Amazon

David Bordwell: Film Studies

[Readings] (08.29.08, 4:25 pm)

Notes

The book is about the end of Theory. Namely, Bordwell refers to Grand Theory, which sought a general theory of film and aimed to apply a sort of universal approach toward interpreting films. Theory is to be replaced with the process of theorizing, substituting classical Grand Theory with a wide range of new approaches to cinema.

Bordwell’s essay examines the ways in which film theory has developed and splintered over time. The dominant schools of thought here are subject-position theory and culturalism, both of which try to describe and explain properties of “society, history, language, and psyche.” Bordwell contrasts these with a third approach, “middle-level” research, which aims to answer smaller problems.

The predominant theory of film originating in the 1920s was auteurism, which posed the idea of a film as authored by a single creator. This theory was challenged by others, notably structuralism in the 1960s, which looked at films within structures and genres (such as gangster films, etc).

It is odd to note that auteurism was not attacked on the grounds that films are such massive and complex projects that a single individual cannot possibly be credited with the film in its entirety. That idea, while it may not be widely accepted as a theory, is still massively popular.

Both auteurism and structuralism are strongly prevalent in the popular understanding of games. Games are always categorized by genre, and many notable ones have names attached: Miyamoto, Wright, Molyneux, etc.

Structuralism, by way of semiotics and mythologies, serves to relate films to a ritual structure, reflecting and enacting social dilemmas. “For example, Thomas Schatz argues that like myth, Hollywood genres are social rituals replaying key cultural contradictions. The emphasis which Hollywood filmmakers place upon the resolution of the narrative indicates the importance of key thematic oppositions, such as man/woman, individual/community, work/play, order/anarchy.”

Structuralism was gradually superseded by subject-position theory via the percolating influence of post-structuralist thinkers such as Derrida, Lacan, and Foucault. The changes originated in the challenge of finding the social and psychic function of cinema, which in turn led to questions about the role of the “subject.” This shift in focus led to the perspectives of scopophilia and narcissism, where films serve to satisfy various voyeuristic desires. This approach defined ideology through representations, which led to an inescapable situation where representation determined subjectivity.

A theoretical school that emerged beyond subject-position theory is culturalism, which held that “pervasive cultural mechanisms govern the social and psychic function of cinema.” This domain too seeks to define a foundation of knowing and acting, but allows subjectivity some freedom from representation. The center of cultural studies is the understanding of the history of cultures that use texts. Cultural studies offers a more general, lighter perspective than subject-position theory.

Having reviewed the grand theories, Bordwell examines several points of doctrine and practice:

  1. Human practices and institutions are in all significant respects socially constructed.
  2. Understanding how viewers interact with films requires a theory of subjectivity.
  3. The spectator’s response to cinema depends upon identification.
  4. Verbal language supplies an appropriate and adequate analogue for film.

Looking at the practice of theory (now there’s a phrase!), there are several methods that are employed:

  1. Top-down inquiry
  2. Argument as Bricolage
  3. Associational Reasoning
  4. The Hermeneutic Impulse

Middle-level theory emerged as a result of an increased awareness of the history and practice of actual filmmaking. The goal of such theories is to employ both empirical and theoretical means of understanding film, approaching theory from the perspective of specific problems rather than sweeping doctrine. Bordwell’s defining point is the line “… you do not need a Big Theory of Everything to do enlightening work in a field of study” (p. 29).

It might serve well to note that some of these aspects of film theory were used to reason about games, and wound up falling flat because of their preoccupation with visual imagery and their total failure to account for interactivity and issues of gameplay.

Reading Info:
Author/EditorBordwell, David
TitleFilm Studies and Grand Theory
Typebook
Context
Tagsdms, film, media traditions
LookupGoogle Scholar, Google Books, Amazon