The Ends of the Parabola

Kevin Pratt

WIDELY RECOGNIZED in academic circles as an architectural polymath, Sanford Kwinter is famous among students for beginning each semester by first asking his classes what they would like him to teach, and then, regardless of subject, assigning reading material and speaking extempore with the kind of accessible erudition most scholars and theorists have long abandoned. This remarkable ability helps account for the popularity of his lectures, which are given throughout the United States and Europe and have made him one of the most influential architectural educators of our time. It might also explain why Kwinter, a contrarian and polemicist by nature, has long favored the essay form as his preferred means of written communication.

His new book, Far from Equilibrium: Essays on Technology and Design Culture, is a collection of such pieces. The volume assembles writings, many of them first published during the 1990s in the now-defunct journal ANY, that often deal with the complex feedback loops that have developed between the user and the used in architecture and design. Stated differently, Kwinter is evidently as fascinated by what technology does to us as by what we do with it. To some extent, the complexion of this interest is familiar, even historical, in that it draws on arguments made by Marshall McLuhan, although Kwinter proposes that the key is not semiotics but rather “the relationship between the mathematical and the physical,” which presents “the framework against which any modern understanding, aesthetic as well as scientific, must invariably be placed.” Heightening the stakes for us today is the practical instantiation of this relationship in the algorithm, which, its proliferation accelerated by the microchip, has become the metatechnology of our age, and especially of architecture. In design culture, the mathematicization of relationships that the algorithm made possible has led to an outpouring of new forms by making the physical and energetic processes that govern the creation of material culture predictable, at least in theory. Computer-driven design and engineering in all their diversity have reorganized matter and energy at increasingly microscopic and macroscopic scales, a development that, while resisted in many quarters, has nevertheless created a condition wherein a large fraction of the planet now exists in a state neither natural nor constructed but some amalgam of the two.

At the center of Kwinter’s argument, and smack in the middle of the book, is an essay titled “The Cruelty of Numbers,” which tackles what underlies many theorizations of this hybridized condition: the presumed “eclipse of a material or mechanical world by an increasingly electronic one.” Kwinter argues that the mechanical and the electronic are not oppositional, as is widely assumed, but rather sympathetic modalities that both interoperate with the biotic and generate complexity. That the two offer comparable means of encoding operations is evident to any student of computational history: the first computers, like Charles Babbage’s Difference Engine, used gears and a hand crank instead of semiconductor technology. The main difference between the mechanical and the electronic lies, in fact, in the dimension of time. The microchip allows for temporal compression to such a degree that computational space expands exponentially, allowing so many billions of calculations to be performed in a heartbeat that the interpolation of the real through both simulation and analysis reaches an unprecedented degree of sophistication. Computational space thus becomes, if not analogous to the material world, at least a seductive tool for apprehending, conceiving, controlling, and inhabiting it. The problem with all of this, according to Kwinter, is that computers and other electronic tools tend to “routinize interactions with people and processes in increasingly engineered, confined, and deterministic spaces”; in other words, to seamlessly extend a neo-Fordist system that actively seeks to impose upon us a “new subjectivity (a new type of ‘Man,’ to use Nietzsche’s expression) whose matter/intelligence variables are being reengineered and finely calibrated to fit those of a new machinic workplace-society into which s/he is to be seamlessly integrated.”

Some clues to a conceivable process that would create the opposite of such regulated uniformity are apparent in the title of Kwinter’s book: The phrase “far from equilibrium” comes from the language of thermodynamics and refers to a system state in which an imbalance of forces creates dynamic instability that is, in Kwinter’s words, “most likely to produce radical, productive and unforeseeable behaviors.” (Its opposite, “close to equilibrium,” describes a state in which “the disturbances, anomalies and events passing through a system are easily absorbed and damped out.”) The metaphoric implications of this, and the resulting conclusion that “instability … is the precondition of creativity,” percolate through Kwinter’s book. In taking his title from the lexicon of science, he makes clear that he seeks to engage the techno-capitalist world machine using its own language, its own tools, and its own methods of operation, yet without succumbing to the narcotizing fetishism of technophilia.

In fact, Kwinter goes on to argue in “The Cruelty of Numbers,” more or less the only positive potential of the accelerated new infraspace created by computational power resides in its ability to model what he calls “natural or ‘wild’ intelligence.” This is what underlies the creativity evident in biological morphogenesis, as well as in other natural systems: “No computer on earth,” he writes, “can match the processing power of even the simplest natural system, be it of water molecules on a warm rock, a rudimentary enzyme system, or the movement of leaves in the wind.” By modeling such systems, however, computers could, like microscopes and telescopes in the past, extend “the exploratory, evolutionary process of differentiation and refinement by inventing new levels of order and shape.” Although the idea of natural intelligence as indeterminate and generative harks back to a Romantic view of natural fecundity, it is worth noting that the concept has been probed more deeply in both architecture and the natural sciences in the ten-plus years that have passed since Kwinter’s essay was first published. Some recognition of these developments is implied in the short (and previously unpublished) critiques of specific contemporary buildings and practices that are interspersed among the longer essays in Kwinter’s book. These tend to explore design processes that incorporate the mimesis of the natural algorithm: In a piece titled “The New Immanence,” for example, structural engineer Cecil Balmond is described as someone “who has seen deeply into the reality of numbers as living agents.” Indeed, Balmond is well known for using parametric technologies to develop structural systems for buildings, such as the new CCTV Headquarters in Beijing (designed by Rem Koolhaas and Ole Scheeren), that are based in a kind of natural complexity that denies the relationship between structural logic and obvious visual order.

Kwinter’s notion of a radical link between material and idea (of design, in its most expansive sense) is especially evident in one of the densest essays in the book, “The ‘Avant-Garde’ in America (or The Fallacy of Misplaced Concreteness),” which resurrects the metaphysics of Alfred North Whitehead, coauthor, with Bertrand Russell, of Principia Mathematica (1910–13). Whitehead’s proposition was (according to Kwinter) that traditional scientific thinking from the seventeenth century on had mistakenly assigned the property of concreteness to matter, principally by adopting an abstract notion of “simple location” that saw space and time as “simple, inert, and serv[ing] as a detachable substrate to the events which take place on or within them.” In fact, concreteness (read, perhaps: “reality”) lies in the process of becoming itself, or as Kwinter puts it, “nature’s concreteness may legitimately be found nowhere else than in its activity. Nature is first and foremost process,” and it is impossible to speak “strictly about a point in space-time . . . without reference to its embeddedness, that is, its relations with other points both distant and near.” According to Kwinter, the failure to realize this is one with the failure of “traditional metaphysics to account for real physical existence in its dynamic heterogeneity, organicity and multiplicity.” This insight opens into Kwinter’s case that the fallacy of misplaced concreteness led to the reductivist structures of the modern bureaucratic state, a form of government based on “the two principles of simple location, that is, on formalisms of separation and abstraction.” With the aim of maintaining social control, such a state “does not merely assume that both objects and humans behave in equally predictable, quantifiable ways, it firmly cultivates and induces such types of predictable behavior.”

Although Kwinter leaves unanswered the tantalizing question of what might happen were this basic metaphysical error to be rectified, it is illuminating to compare it with a different, more recent paradigm among practitioners of science. In a shift that has passed almost unnoticed among theorists of design, scientific inquiry has increasingly turned to stochastic models to describe complexity. While linear models work well for problems like, How do I drop bomb X on target Y, they are not so useful in many other cases-for instance, in predicting the mean ocean surface temperature rise caused by the increase of carbon dioxide in the atmosphere. The stochastic models used to predict not one inevitable future but many possible futures and the likelihood that they will occur are probabilistic: They rely, by definition, on the inclusion of random factors in their algorithms. They are also dependent on history, in that historical data is needed to calibrate and initialize them. Natural processes that can be modeled in this way thus require the twin infrastructures of both time and algorithm, which together create relationships.
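The mechanics described above, a model that is seeded with historical data, perturbed by random factors, and run many times to yield a distribution of possible futures rather than a single prediction, can be sketched in a few lines of Python. The "record" below is fabricated purely for illustration, and the random walk is the crudest conceivable stand-in for the vastly more elaborate models climate scientists actually use:

```python
import random
import statistics

def simulate_paths(history, n_paths=1000, horizon=10, seed=42):
    """Toy stochastic projection: calibrate drift and noise from the
    historical increments, then roll many random futures forward."""
    rng = random.Random(seed)
    increments = [b - a for a, b in zip(history, history[1:])]
    drift = statistics.mean(increments)       # history-dependent calibration
    noise = statistics.stdev(increments)      # magnitude of the random factor
    finals = []
    for _ in range(n_paths):
        x = history[-1]                       # initialize from the record
        for _ in range(horizon):
            x += drift + rng.gauss(0, noise)  # deterministic + stochastic step
        finals.append(x)
    return finals

# A fabricated "historical record" of, say, temperature anomalies.
record = [0.0, 0.1, 0.15, 0.3, 0.35, 0.5, 0.55, 0.7]
futures = simulate_paths(record)
mean = statistics.mean(futures)    # the central tendency of many futures
spread = statistics.stdev(futures) # the uncertainty around it
```

The point of the sketch is structural: the output is not one number but a thousand, and the honest answer to "what will happen?" is the pair (mean, spread), a likelihood rather than an inevitability.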

At the level of complexity of interest to the designer, the paramount algorithm of natural intelligence is evolution. In some sense, the ghost of Darwin hangs over the whole enterprise of the generation of complex form: Ever since Darwin invented a technology (and his theory is a technology in the fullest sense of the word, since we use it as a tool to create subjective meaning) that explains the organization of matter without recourse to an a priori theory of form, normative means of architectural design, whether based on formal typologies, historical precedent, or the interpolation of Continental philosophy (the literalization of Gilles Deleuze’s “fold” being perhaps the most obvious example), have had to contend with the possibility of their own irrelevance. Evolution offers an entirely different model, one that requires a thermodynamic gradient, i.e., distance from equilibrium, because without environmental dynamism to inject new information into the process natural selection grinds to a halt and, in the absence of meteor strikes, stagnation ensues. Moreover, evolution is both stochastic (the random factor is mutation of the genetic code) and history-dependent, since the expression of mutations relies on both the presence of the differentiated gene in the first place and a higher-order morphological structure: the organism in which it can be expressed.
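The triad just described, stochastic mutation, selection against an environment, and strict inheritance from the previous generation, is exactly the skeleton of a textbook evolutionary algorithm. The sketch below is a deliberately minimal toy (the bitstring "environment" and all parameters are invented for illustration), not a claim about how generative designers actually deploy such methods:

```python
import random

def evolve(target, generations=200, pop_size=30, mut_rate=0.02, seed=1):
    """Minimal evolutionary loop: random mutation supplies novelty,
    selection against an environment (the target string) filters it,
    and each generation inherits, history-dependently, from the last."""
    rng = random.Random(seed)
    length = len(target)
    fitness = lambda s: sum(a == b for a, b in zip(s, target))
    # An initial population of random "genomes."
    pop = ["".join(rng.choice("01") for _ in range(length))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]        # selection: the environment decides
        children = []
        for p in parents:
            child = "".join(                  # stochastic mutation, bit by bit
                c if rng.random() > mut_rate else rng.choice("01")
                for c in p)
            children.append(child)
        pop = parents + children              # inheritance: history constrains novelty
    return max(pop, key=fitness)

best = evolve("1111000011110000")
```

Note what happens if the target never changes: once the population matches it, further mutation is only filtered out, which is the "close to equilibrium" stagnation the essay describes. A shifting target (a dynamic environment) is what keeps the process generative.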

It is interesting to couple this with what Kwinter terms “radical anamnesis”: the creative (if necessarily selective) use of the past as a way of constructing an argument about the future. Beyond allowing us “better to see and to judge the unprecedented mediocrity of our present aspirations, and indeed as well, the possibly dire importance of what we currently seem all too willing to give up,” this is another of Kwinter’s recommended methodologies for breaking the determinism and sterility of gravity’s rainbow, the inevitability of the ballistic missile’s return to earth. Kwinter is himself an enthusiastic practitioner of such “controlled remembering,” and one of the delights of his book is the way the essays jump effortlessly from Goethe to J. B. S. Haldane and around Thomas Pynchon, never didactically and always in service of a larger argument. Kwinter’s treatment of Buckminster Fuller, for example, focuses not on the obvious implications of the geodesic so evident in his work but rather on his processual innovations, which lead to a contemporary understanding of ecology as “the patterned interplay and mutual determination of organism and environment.”

It is worth noting here that Far from Equilibrium is one of those rare works, an insider’s critique. Kwinter is well enough embedded in the milieu of design practice to have a good sense of the current architectural gestalt, and whether one sees the analogy in terms of physics or biology, it is difficult at this point not to note a contrast between the dynamic conditions that induce change in the natural world and the feeling of aimlessness that generally infects the design professions. Kwinter refers directly to this anomie in the essay titled “Mach 1 (and Other Mystic Visitations),” in which he recounts his trip to the Mojave Desert on the occasion of the fiftieth-anniversary reenactment of Chuck Yeager’s first supersonic flight. This event took place at the same time as the dedication of the Guggenheim Bilbao, which attracted the usual suspects from the art and design worlds, and was orchestrated as a supposed preview of the direction architecture was to take in the century to come. That Kwinter was half a world away, mining the obscure (to architects, at least) history of extreme avionics, indicates that he sees things differently: “Out there somewhere we knew was the zero-degree and the future, and that Bilbao was the past.”

In that desert, I think, Kwinter was searching, quite literally, for clues as to how another discipline, perhaps a particular branch of the techno-historical process itself, had found a way to achieve escape velocity and break the deterministic paradigm that compels a return to stasis through the seeming intractability of its parabolic arc. To Kwinter, Bilbao remains “a lyrical son et lumière show” that, despite its tectonic and technological ingenuity, fails to be truly radical, since it answers the question, What next? merely with, More of the same, only different. Kwinter’s answer to the same query, ferreted out of pathological moments such as the breaking of the sound barrier, is that design must be a practice that is “experimental in the fullest and most dangerous sense.” By this he means that to overcome the anxiety inherent in our not knowing how or where a particular trajectory might find expression, we must confront its indeterminacy. In the current moment, perhaps, we can engage such indeterminacy directly in the information-rich environment of the algorithm and the constrained infinity of the biotechnical process, where the natural world has built a garden of forking paths that we can apprehend only through a dynamic mimesis of its own method of becoming.
