Redefining Bioinformatics:
A Critical Analysis of Technoscientific Bodies

Eugene Thacker

Enculturation, Vol. 3, No. 1, Spring 2000

About the Author
Table of Contents

When we first started out in genomics we wanted to be a biopharmaceuticals company--the next Amgen. But then we saw that the power is in the information. And that's how we decided to become an information-based company.
Randy Scott, President & Chief Scientific Officer, Incyte Pharmaceuticals

Bioinformatics in Context

Within contemporary technoscience research, the rapidly developing field of "bioinformatics" represents a pragmatic response to genomic mapping endeavors such as those recently initiated by biotech corporations (Celera, Incyte), non-profit research organizations (TIGR), and of course the government-funded Human Genome Project (see Celera; Incyte; TIGR; HGP). Such projects are producing an incredible amount of biological data which must be organized so that it can be useful for research in such areas as medical genetics, pharmacology, immunology, internal medicine, and biotech generally. In addition, many technoscience research projects which have already incorporated computers into their programmes are also utilizing the Internet and the Web as constitutive components of research. Examples include the numerous genomic databases (such as GenBank or EMBL), which exist on the Web, are open to public access, and are archived on remote servers (see GenBank; EMBL). A researcher may access the web page of these sites and perform a search according to genetic tag, sequence information, location on chromosome, related genetic disorders and diseases, as well as the names of genes themselves. Such online databases of biological information are made possible by techniques involving computer programming and software applications for the Web, whose primary element is the digital file, or the data packet being uploaded/downloaded over the Net, and whose over-arching logic is that of a particular type of database, usually accessed through a search engine. With bioinformatics, computer science and the biological sciences intersect to produce a unique and perhaps unprecedented hybrid of networking/computer technology and genetic/technoscientific bodies, culminating in large-scale networks and databases of biological research. The investment of such research in the continual negotiations over what will constitute the (scientific-biological-natural) body cannot be overstated.

On the surface the task of bioinformatics seems fairly straightforward: to efficiently organize the overwhelming amount of biological and genetic data produced by such projects as those involved in genomic mapping and sequencing. In the case of genomic mapping projects, bioinformatics must organize genetic databases so that they can accommodate several types of activity: (1) accessing and searching the database for information relating to particular genes and/or genetic sequences, (2) the development of techniques and methods for analyzing the production of amino acids and proteins from DNA sequences, (3) the comparative analysis of gene and protein sequences across different databases, and (4) the development of techniques for working with molecular modeling and molecular structure prediction. Bioinformatics in this instance is less a new scientific or technological field than a scientific tool for facilitating research and analysis in genetics and biotechnology. The "research" done by bioinformatics--in such cases as data mining tools and intelligent software agents--does not necessarily affect the integrity of the information which is being organized; rather, bioinformatics works through a polyvalent logic of differences, relations, and linkages--what has sometimes been referred to as a high-tech filing cabinet.

However, it is exactly this "bottom-up" character of bioinformatics that is particularly interesting in light of the fact that bioinformatics, in this definition, does not propose new experimental or conceptual perspectives on molecular biology, but rather marks a technological development that is likely to have a practical effect on research within these sciences and their attendant industries. What all of these sciences and technologies have at stake is not just a disciplinary intersecting of biology and technology, but more specifically a complex hybridization process between the contingencies of bodies and the embodying potentials of information--all on a practical level which occurs daily in laboratories and research facilities. This hybridizing is certainly not a new phenomenon; depending on the flexibility one is willing to grant each of these terms of "body" and "information," it is also found throughout the history of the anatomical sciences as well as in genetics. For example, in the early modern period one finds in Andreas Vesalius' sixteenth-century texts an anatomical body doubly encoded through representational strategies (based on but differing radically from the practice of dissection) and the regular utilization of diagrams, taxonomies, and charts. And since the rediscovery of Gregor Mendel's experiments with plant hybrids at the turn of the century, and more explicitly with Francis Crick's efforts in molecular biology towards "the coding problem" during the 1950s, modern genetics has always been about the transmission of (genetic) information, and genetics research and analysis still largely assumes the primacy of the genetic code as its disciplinary foundation. 
[1] What distinguishes contemporary technoscience from earlier historical-scientific precedents is thus not only the specificity of the technologies and techniques composing a given scientific research programme, but also the social and political resonances indissociable from the activation and materialization of these scientific practices.

In The Order of Things, one of Michel Foucault's primary suggestions is that the organization of a given knowledge system at a particular moment both produces knowledge in a certain way and articulates the range of what it is possible to say and to know. It is in this sense that I would like to begin refiguring this emerging discipline of bioinformatics, by extending its scope to include technoscience research which is involved in the complex process of structuring, organization, and legitimation in determining what types of articulations may take place in the dynamic and often convoluted relationship between the (scientific-biological) body and (computer-networking) information technologies. Therefore, the term "bioinformatics" will be taken here as an instance where, within a technological frame, the body and information inscribe and materialize each other in a variety of different contexts. The primary examples here will extend from biotech and genetics research (genomic mapping; the DNA Chip) and digital anatomy (Visible Human Project), both of which have moved in significant ways onto the Internet, and both of which, as normal scientific practices, have a great deal to say about what will hegemonically constitute a "body" in a given technocultural situation.

Technoscientific Bodies

Digital Anatomy: In the early 1990s the National Library of Medicine (NLM) initiated the Visible Human Project (VHP), whose projected goal was to construct a digital archive of mostly cross-sectional images of a male and female cadaver which would be available for use over the Internet (Visible). Previously a text-only archive, the NLM intended the VHP to be medical (for use in virtual surgery trials), educational (for use in classrooms and medical schools), and informative (for a general, non-specialist public). To get an idea of the kinds of transformations which the VHP effects upon the anatomical body, one needs to consider the exhaustive process to which the first cadaver was subjected (a white male in his mid-30s, on death row, who had donated his body to science): Upon lethal injection, the body was frozen and shipped to the Medical Center at the University of Colorado, where researchers and technicians began to prepare and then methodically slice the body at approximately one-millimeter cross-sections (totalling over a thousand slices and over 15 gigabytes of information). Each slice was prepared for digital photography and scanning, and from this data three-dimensional images were generated for the body's major sections (head, torso and arms, pelvis, legs and feet) as well as internal organs and skeletal structure. Each of these digital image files of the cadaver was then organized into a database on a remote server, which is currently accessible, under a licensed agreement with the NLM, on an FTP (File Transfer Protocol) server over the Internet for downloading. Numerous medical schools and research centers have made use of the VHP Dataset, both in educational contexts (replacing both the anatomical textbook and the cadaver in the gross anatomy lab) and medical contexts (for visualization of surgical procedures). [2]

Genomic Mapping: Though there has been much written on the Human Genome Project (HGP), the government-funded endeavor to map out the complete genetic sequence of "the" human being (what geneticists not so long ago called the "Book of Life," comprising some 80,000 genes), recently it has been overshadowed by the more ambitious and technologically-savvy challenges put forth by such biotech corporations as Incyte and Celera, both of which have promised to complete the genomic map in shorter time and for less money than the HGP (see "Celera Presents" and Moukheiber). These corporate mapping projects have extended the development of genomic science in two ways: that of techniques and technologies utilized in the actual sequencing process, and that of software applications designed to archive genomic data and to make it available over the Internet. Examples of such technologies include sequencing components and software applications such as the "LifeSeq" genomic data analysis products from Incyte (databases, microarrays, and sequencing applications), designed to speed up, automate, and increase accuracy in genetic analysis (what Incyte calls "point-and-click biology"). Biotech corporations such as Incyte and Perkin-Elmer are quickly setting the trend within biotech research, providing the "pick-and-shovel" tools, financial backing, and research to handle such large-scale projects as the mapping of the human genome. Unlike the HGP, these corporations are also focusing on all aspects of genomic mapping research. An example, which we will return to below, is a type of genetic mapping called single nucleotide polymorphism (SNP) mapping, designed to mark out the extremely minute differences in genetic sequences from one individual to another, something of great potential benefit for the new field of "pharmacogenetics," where gene therapy and drug design combine to provide custom-made, individually-specific drug treatment.

Implicit in both of these instances of "bioinformatics" is the intersection of the scientific-biological body and the digital technology of the database. In a recent article on the "database as a new genre," Lev Manovich differentiates the logics of the database from those of narrative (what he refers to as the "projection of the ontology of a computer onto culture itself"):

The world is reduced to two kinds of software objects which are complementary to each other: data structures and algorithms. Any process or task is reduced to an algorithm, a final sequence of simple operations which a computer can execute to accomplish a given task. And any object in the world--be it the population of a city, or the weather over the course of a century, a chair, a human brain--is modeled as a data structure, (i.e., data organized in a particular way for efficient search and retrieval) (Manovich).

For Manovich, as I read his text, this "reduction" of the world to the database field is not so much a pejorative value judgement as it is a critical concern for the current transformations occurring within digital and networking technologies such as the Internet and Web. The application of information technologies to society and culture at large asks two primary questions: (1) How can the totality of objects within a given discourse be articulated digitally, put into digital form; that is, how can the totality of a defined field be translated into an apparently dematerialized language based on numerical on/off switches, efficient routing, automated functions, and an infinite malleability? (2) How may the totality of discursive elements defining a given field be organized and classified digitally; that is, in terms of databases, networks, and access applications? These are issues pertaining to language, but they are also issues pertaining to the materiality of objects situated in/as language. More specifically, what is at issue here is what N. Katherine Hayles calls the "materiality of informatics," where what is digitized and how it is digitized play a direct role in the information that will constitute bodies and subjects (Hayles 147-70). [3]

What is important here is not the implication that the laboratory of technoscience is becoming increasingly automated (though this is certainly an issue), nor that the measure of scientific objectivity and truth is being transferred from the subject to the machine (a more nostalgic viewpoint), but rather that in incorporating these new computer-based technologies into their research (that is, as constituent components of science research), endeavors such as genomic mapping or digital anatomy are constrained in particular ways through their context, and it is this constraint that is a marker of the ways in which they are productive endeavors. In addition, this type of constraint is not repressive, but contingent in minute ways. In other words, the use of, say, a search engine on the Web, or the use of the DNA Chip, effectively inscribes the range of mobility with which questions may be asked concerning the scientific "truth" and application of the body, as well as the types of bodies which such questions produce (be they anatomical or genetic bodies). When computer-based and networking technologies become core elements in technoscience research, what is at issue is not the disappearance of the body (a position which assumes a certain facticity in what is vanishing), but that the possibility is created for producing a range of modifications or reconfigurations in the very ontology of these sciences, in large part due to the questions which such technologies bring with them. Such reconfigurations are described by Donna Haraway as instances of what she calls "corporealization":

I am defining corporealization as the interactions of humans and nonhumans in the distributed, heterogeneous work processes of technoscience. The nonhumans are both those made by humans, for example, machines and other tools, and those occurring independently of human manufacture. The work processes result in specific material-semiotic bodies--or natural-technical objects of knowledge and practice--such as cells, molecules, genes, organisms, viruses, ecosystems, and the like. . . . The bodies are perfectly 'real,' and nothing about corporealization is 'merely' fiction. But corporealization is tropic and historically specific at every layer of its tissues (Haraway 141-42).

Corporealization, as Haraway describes it (one of her primary examples deals with the Human Genome Project, maps, and ethnicity) is a technical-political instance which does not simply determine or wholly produce the body--that is, produce it into disappearance--but rather, corporealization configures the relationship between materiality and technology in a way that complicates traditional relationships based on a mutual exclusivity of the material and the immaterial, the natural and the artificial, body and text. Elsewhere Haraway terms such hybrids "material-semiotic nodes," and the bodies produced through the work processes of digital anatomy and genomic mapping bring up troubling instances where the digital is embodied, where the visceral is constantly mediated, and where the facticity of the body is being largely displaced by its particular mode of rendering, modeling, and animating.

Bioinformatic Corporealizations

VHP (Visible Human Project) as database: As an FTP site open to public access, the VHP dataset exists as a series of archived digital files of the human body on several remote server computers. Like an anatomy textbook, much of the data is organized according to anatomical division (head, torso, pelvis, legs, organs, skeleton), but unlike anatomy texts, the dataset is also organized according to two other modes of organization: first, that of the technological cut made possible by CT and MRI scanning technologies (something becoming increasingly common in contemporary anatomical texts for medical schools), and secondly, that of the modality of the digital file (whether still images - .JPG, .TIFF, .PICT, or movies - .AVI, QuickTime, collected still-frames).

In addition, the VHP dataset is a particular type of database. Certainly it contains large amounts of digital information and files, but it also plays a particular role as an FTP (File Transfer Protocol) site, designed specifically to accommodate the transmission of digital files over the Internet from point-to-point (download/upload). In itself it is not intended to be a pedagogical database (in the way that an anatomical atlas is), though it certainly spatializes the digital body within the general database structure (folder, subfolder, files). Nor is it meant to be a reference database (as an encyclopedia or reference book), though through its modes of organizing an epistemology is implicit. As its name indicates, the dataset involves the dynamic management and regulation of its contents (the transferring of digital files from its database over the Internet). This type of database must accommodate requests for information, respond to those requests, present the requested files, and prepare them for downloading via the Internet. Its defining features are not pedagogy or knowledge-accumulation, but rather a limited set of data transfer gestures through which articulated fragments (digital files) of the anatomical body (the VHP body) are transmitted over an electronic network for a range of uses (from educational to commercial to aesthetic). This is, like genomic databases, an open database that is defined by a technological governmentality concerning its contents. Again, what defines the VHP dataset as a database is that it is an FTP site--here the actions of uploading, downloading, file compression and decompression, and finally file processing and utilization, all extend from the data transfer lines which constitute the VHP dataset. To assume, as one would from a medical or biological perspective, that the VHP simply contains the modern anatomical body does not say anything about the ways in which this anatomical body is transformed through mediating processes.

VHP as digital files: The type of hybridization occurring with the VHP is, needless to say, not singular; the multiple methods of scientifically and technologically enframing this body also mean multiple types of corporealizations. While considering the VHP as a database offers one approach, a consideration of what the dataset contains--what its contents are--provides another entry-point. As a series of digital files, the VHP body is partitioned in several ways: First, according to the cuts made in the preparation phases (using medical and diagnostic technologies most commonly found in hospitals or research centers). These preparatory cuts are as essential to the production of the VHP as the numerous software and rendering programs utilized in the post-production phases; they frame the body-to-be-scanned according to a technico-anatomical logic based on the possibilities and constraints of the technologies used, as well as on the best or most scientifically useful perspectives gained through the use of those technologies.

Secondly, the VHP body is framed by the methods of digitizing this cut-up body. Though this procedure is, technically, fairly direct (researchers used a digital camera along with other scanning technologies to visually capture the cross-sections of the cadaver), it is perhaps the most important methodological gesture in the production of the VHP virtual body. The scanner here provides the interface between two types of bodies, neither of which is the "real" or "natural" body. It is an interface between, on the one hand, a body physically cut, sliced, and exhaustively opened by anatomical science--this is a body thoroughly marked by the complex history of dissection in early modern anatomical science. Its marks privilege a strategic visibility that is the producer of scientific knowledge concerning a certain "truth" about the human body. This body, in all its grotesqueness and abject formlessness, is intimately discursive, we might even say viscerally discursive; as the histories of the anatomical theaters in early modern Europe illustrate, the mere fact of dissection--of opening to visibility that which is not normally visible--this mere technical fact is not coextensive with the truth claims which diagram so many Renaissance anatomy texts. Yet it is not simply separate from it either. What is required as part and parcel of the opening of the body by the hand of anatomical science is a set of discursive optics which present, see, and produce certain types of bodies--the anatomical body (and here as well, there is no one anatomical body during the Renaissance, see Cunningham).

But on the ambiguously delineated "other side" of the interface is not simply the virtual body, the simulacral, image-saturated body of so many SF and Cyberpunk narratives. Rather than highlighting the issue as one of materiality/dematerialization, we might consider the digitizing of the VHP corpse as the production of a particular type of zone of morphology. With the VHP, the digital file can exist in a variety of image-file formats. In producing the animations and movies of the VHP virtual body (as in the Body Voyage CD-ROM), attention will therefore be paid to the handling of the body as if it were a physical, corporeal entity, while enabling procedures which could not exist in a physical, corporeal context (e.g., "peeling" away layers to reveal different systems, "fly-through" animations). The VHP in its different potential contexts (educational or in medical applications) will thus constantly hover between recognizability and the uncanny; as a digital file, the VHP necessarily engages with a zone of formlessness which is specific to the technological medium as well as to anatomical and medical science. New zones on the margins of recognition (that is, of legitimacy) are thus produced through this virtual body--as a body situated by technoscience, it is no less "material" than the anatomical body; however, its historical and technological contexts will differ, and so therefore will the boundaries and issues brought forth when the technoscientific body is a digital file.

These multiple bodies are not, of course, simply an open-ended field of disseminated free-play, but are very specifically constrained and articulated by a variety of factors (including technologies, institutional and governmental support, utilization of the Internet, range of practical applications, epistemological assumptions, complex historical precedent within the anatomical sciences).

HGP (Human Genome Project) as database: The HGP presents an example of information-management on a different level. Our primary example will be the significant number of online genomic databases related to the Human Genome Project. The primary interface for such databases is that of a Search Engine on the Web. Like the numerous search engines available on the Web (InfoSeek, Excite, Lycos, Netscape, etc.), the search engine for a genomic database is designed to facilitate access to certain types of information--in this case, genetic sequence information concerning the human genome. This not only means that the database will be constructed in a certain way, but that the ways in which the search engine will access (genetic) information will also be specific to the types of searches accommodated in this database. The most basic search in these genomic databases is by some analogue to the keyword or title--a known gene marker, for instance. On one level this is a simple matter of cross-referencing and accessing matching information, but on another such performative organizations of information are never merely neutral with respect to content. Like the VHP FTP site, the genomic database operates through a distribution of information. The three main types of genomic maps--sequence maps, linkage maps, location maps--must be made accessible using the search engine, which must do two things simultaneously: make these maps available over the Internet and Web, and do so without altering the logics and epistemologies of genetic science that went into the production of these maps. In addition, because the HGP is an ongoing endeavor, it cannot be a fixed or closed database--accommodations must be made for updated information, new discoveries, information sharing, and other modifications, most of which can be automated through basic input and request commands to the server computer where the genetic database is stored.

The primary issue here is that of a redoubled encoding procedure which largely defines the genomic database. But this happens in a very different way from the VHP's own reference to the direct presentation of the "real human body." Though the historical trajectory of genetics and biotech is less extensive than anatomy in the establishment of a scientific-cultural tradition, it is no less dense in the complexities of its ongoing issues. The history of molecular biology has been, in many ways, a history of the different translations between the tropes of information and biomolecules. It is no accident that during the post-war period--the same era in which molecular biologists such as Francis Crick, Francois Jacob, Jacques Monod and others were working on "cracking the genetic code"--researchers such as Norbert Wiener and Claude Shannon were discussing cybernetics and information theory as applied to fields as diverse as physiology, military technology, anthropology, and communications. So, while one type of suggestion may be made for the history of anatomy as an information technology, this occurs in a different way when considering genetics and biotechnology. The notion of a genetic code and of sequencing information is so infused in the discourse and research of genetics and biotech that it has become its defining feature as a scientific discipline, often culminating in what Haraway calls "gene fetishism," and what Richard Lewontin refers to as the "ideology of biology" (see Haraway and Lewontin).

Some basic distinctions have also been made by Evelyn Fox Keller in the different meanings given to the term "information" by genetics and information theory during the same period. [4] While Watson and Crick's usage depended to a great extent on the content or the "text" of the genetic sequence, Shannon and Warren Weaver's formulations (in research done for Bell Telephone Labs during the late 1940s) defined information as a quantitative entity independent of actual content. In their respective uses, the goal of Watson and Crick was to decipher this hermetic text, in line with modern science's rhetoric of discovery and progressive accumulation of knowledge, while for Shannon and Weaver the goal was predominantly pragmatic--how to ensure that a signal sent at point A would arrive at its destination B with the least amount of "noise" or interference.

These distinctions are important theoretically, for they reveal the assumptions and intentions behind the modes of textualizing in a science such as genetics; and, as Haraway and other historians of science have reiterated, the discourses, tropes, and conceptual tools which inform a science have a very real effect on the production of scientific practice and the situating of scientific research within the social. This is happening on another level with the genomic database; here the distinctions between genetic information (where what exactly the code or sequence is matters a great deal) and telecommunications/networking information (where what is important is the efficiency of the reproduction of a signal between two points), intersect in their differences. While recent research in genetics and especially biotech has provided challenges to the "doctrine of DNA" proposed by geneticists during the post-war period (e.g., perspectives placing greater emphasis on a decentered, highly processual model of inter- and intra-cellular activity; perspectives assigning a much more nuanced role to DNA in the production of proteins), the notion of the importance of the genetic code is still very much a part of genetics research, as evidenced by the institutional, economic, intellectual, and political support granted to genomic mapping projects. In this sense what the genomic database will provide is a variety of different approaches to a model of a human genome. In contrast to the VHP, which presents the digital encoding of a human cadaver (framed by the sciences of medical diagnostics and anatomy), the genomic databases associated with the HGP are, first and foremost, models and maps. 
However, this is not to say that they are not objects and not territories; indeed it is difficult to make the clear distinction between textual and organic, map and territory, when genetics' primary mode of approaching the human body has been through modes of textualizing, codes, and serial arrangement.

This is complicated further in considering the genomic database. Here the "raw material" to be organized as information in the database is itself thoroughly infused with a discourse of information distinct from but closely tied to information technology. When genetic information--itself organized in particular ways--is gathered and ordered in a computer and networking database, concepts of what constitutes "information" intersect to produce a complex, information-saturated genetic body. In this redoubled informatics, the databasing of information treats genetic information as an empty unit--its primary concern is how the external form of this unit is recognized by the database and search engine, and how it is to be transmitted (in the case of modifications to the database) with the greatest amount of certainty possible.

Analyzing Bioinformatics

Bioinformatics is both a suggestive trope and a material practice which provides an example of the ways in which the scientific body is currently being reconfigured and reorganized, largely through an intersection of developments in biotechnology and the Web. Its primary object is a hybrid of the biological body and the computer/network database, a set of interlocking components which frustrate attempts to simply regard such activities as combinations of nature and culture, or the body and technology. As biological science and computer technology become increasingly indissociable from each other, the assumptions concerning the transparency of technology (technology-as-tool) and the assumptions concerning the facticity of the (natural, organic) body become highlighted in a complex institutional and technological space of refiguration and corporealization. Despite the rhetoric of incremental progress in research, increasing technical sophistication, and unmediated, universal medical potential, fields such as biotechnology, genomics, pharmacology, and medical science are continually positioned in this site of negotiation concerning what will come to be scientifically-biologically accepted as "the body."

What is needed are analyses of the kinds of bodies produced through postmodern, posthuman technoscience practices, enframed by medical, commercial, governmental, and educational contexts, and intimately enmeshed within a range of research technologies. The very existence of genetic databases constitutes the formation of a significant type of body, doubly encoded by genetic sequences and computer bits, constantly at some distance from "the body itself." Similarly, the creation and utilization of the VHP Dataset effectively rearranges the anatomical body so that it may be accommodated by electronic networks and computer visualization software. Instead of assuming the transparency of technology (e.g., the way that the VHP presents its product as a "real human body"), and instead of generating anxiety over transgressive impingements upon the "natural" body (e.g., alarmist media reports on patenting or cloning; essentialist critiques of the sciences), these complex and troubling technoscience projects seem to call for approaches which do not necessarily privilege the anthropomorphic, organic, natural body as their point of reference. [5] Part of this involves the dual move of regarding the relationship of the body and technology as uncomfortably intimate and enmeshed, and of considering the body in such projects as the VHP from non-science perspectives (e.g., the VHP as a database-body). When organizational epistemologies intersect (e.g., the anatomical atlas and the FTP database), a relationship is produced that is increasingly indicative of the status of the biological body in late twentieth-century culture. 
Corporealized fusions such as the VHP Dataset or the DNA Chip are examples of rematerializations of and within the digital, and rather than simply producing a singular, homogeneous normative scientific body, these technoscience projects and objects are engaged in the production of a certain type of multiplicity which in no way relinquishes the efficacy of legitimizing discourses and (scientific) models of normativity.


1. For early modern anatomy, see Jonathan Sawday and Andrew Cunningham. For modern genetics, see Evelyn Fox Keller and Lily Kay. (back)

2. A list of medical, educational, and research institutions making use of the VHP Dataset can be found at the Visible Human Project website. (back)

3. N. Katherine Hayles's example is the intersection of information theory and the discovery of the DNA double helix, and the discrepancies in the term "information." (back)

4. For classical texts on information theory and cybernetics see Claude Shannon and Warren Weaver, Norbert Wiener, and Steve Heims. (back)

5. For example, see Margaret Lock, Vandana Shiva, and Sandra Harding. (back)

Works Cited

Affymetrix. Affymetrix. 25 Aug 2000.

Celera. Celera: A PE Corporation Business. 25 Aug 2000.

Cunningham, Andrew. The Anatomical Renaissance: The Resurrection of the Anatomical Projects of the Ancients. Brookfield: Scolar, 1997.

DeBare, Ilana. "Incyte Says It Will Map DNA Within a Year." San Francisco Chronicle. 18 Aug 1998. LINK. 25 Aug 2000.

EMBL. EMBL: European Molecular Biology Laboratory. 25 Aug 2000.

Foucault, Michel. The Order of Things. New York: Vintage, 1973.

GenBank. National Center for Biotechnology Information. 25 Aug 2000.

Haraway, Donna. Modest_Witness@Second_Millennium. FemaleMan(c)_Meets_OncoMouse(tm): Feminism and Technoscience. New York: Routledge, 1997.

Harding, Sandra. Is Science Multicultural?: Postcolonialisms, Feminisms, and Epistemologies. Bloomington: Indiana Univ., 1998.

Hayles, N. Katherine. "The Materiality of Informatics." Configurations 1.1 (1993).

Heims, Steve. Constructing a Social Science for Postwar America: The Cybernetics Group 1946-1953. Cambridge: MIT, 1993.

HGP. Human Genome Project Information. 25 Aug 2000.

Incyte. Incyte Genomics. 25 Aug 2000.

The Institute for Genomic Research. TIGR. 25 Aug 2000.

Kay, Lily. "Who Wrote the Book of Life? Information and the Transformation of Molecular Biology, 1945-55." Science in Context 8.4 (1995): 609-34.

Keller, Evelyn Fox. Refiguring Life: Metaphors of Twentieth-Century Biology. New York: Columbia, 1995.

Lock, Margaret. "Decentering the Natural Body: Making Difference Matter." Configurations 5.2 (1997): 267-292.

Lewontin, Richard. Biology as Ideology: The Doctrine of DNA. New York: Harper-Collins, 1992.

Manovich, Lev. "Database as a Symbolic Form 1/3." Online posting. 14 Dec 1998. Nettime Archive. 25 Aug 2000.

- - - . "Database as a Symbolic Form 2/3." Online posting. 14 Dec 1998. Nettime Archive. 25 Aug 2000.

- - - . "Database as a Symbolic Form 3/3." Online posting. 14 Dec 1998. Nettime Archive. 25 Aug 2000.

Moukheiber, Zina. "Chip + DNA = Hype." Forbes Online. 15 June 1998. LINK. 25 Aug 2000.

NHGRI. The National Human Genome Research Institute. 25 Aug 2000.

Sawday, Jonathan. The Body Emblazoned: Dissection and the Human Body in Renaissance Culture. New York: Routledge, 1995.

Shannon, Claude and Warren Weaver. The Mathematical Theory of Communication. Urbana: Univ. of Illinois, 1968.

Shiva, Vandana. Biopiracy: The Plunder of Nature and Knowledge. Toronto: Between the Lines, 1997.

SNP Database. Single Nucleotide Polymorphisms in the Human Genome: SNP Database. 25 Aug 2000.

The Visible Human Project. The National Library of Medicine's Visible Human Project. 25 Aug 2000.

Wiener, Norbert. The Human Use of Human Beings: Cybernetics and Society. New York: Da Capo, 1954.

Copyright Enculturation 2000
