Defending Cyberspace and Other Metaphors
NOTES
1.On the theory that once a trend hits Time magazine it has already peaked, the cover article of the 21 August 1995 issue was "Cyber War," by Mark Thompson and Doug Waller (38-46).
2.Martin C. Libicki, What Is Information Warfare? (Washington, D.C.: National Defense University Press, 1995). It divides information warfare into (1) command-and-control warfare, (2) intelligence-based warfare, (3) electronic warfare, (4) psychological warfare, (5) hacker warfare, (6) economic information warfare, and (7) cyberwarfare.
3.See, for instance, Gustavus J. Simmons, "Cryptanalysis and Protocol Failures," Comm. ACM 37, 11 (November 1994), 56-65; and in the same issue, Ross J. Anderson, "Why Cryptosystems Fail," 32-40.
4.Whether breaking into one part of a complex system means access to the whole depends on such factors as internal firewalls, the ways some parts of the system grant privileges to other parts, and the ways multilevel security is implemented. Thorough security systems are redundant and compartmented.
5.The physical properties of the electromagnetic spectrum (e.g., the permeability of the elements to various wavelengths) constitute an exception.
6.Mark Stefik's Internet Dreams: Archetypes, Myths, and Metaphors (Cambridge, MA: MIT Press, 1996) argues that the metaphors used to describe the Internet may be hazardous to its development.
7.The reference to Alvin and Heidi Toffler's War and Anti-War: Survival at the Dawn of the 21st Century (Boston: Little, Brown, 1993) is obligatory.
8.From the Report of the Defense Science Board Task Force on Information Warfare - Defense (Washington DC: Office of the Under Secretary of Defense for Acquisition and Technology, November 1996), 2-1.
9.An earlier version of this essay appears as "Protecting the United States in Cyberspace," in Alan D. Campen, Douglas H. Dearth, and R. Thomas Goodden, Cyberwar: Security, Strategy, and Conflict in the Information Age (Fairfax VA: AFCEA International Press, 1996), 91-105.
10.An earlier version of this essay can be found in Winn Schwartau, Information Warfare, 2nd edition (NY: Thunder's Mouth Press, 1996), 592-600.
11.On 25 June 1996, John M. Deutch, Director of the Central Intelligence Agency (CIA), testified before the Senate Permanent Subcommittee on Investigations and maintained that among the most worrisome threats to U.S. national security, hacker attacks ranked second -- just below weapons of mass destruction. In response to the threat of hacker attack, he had drawn up plans for a roughly thousand-person office located at the National Security Agency (NSA) which would focus on the risks foreign hackers posed to U.S. computers. Deutch also supported plans for a "real-time response center" in the Justice Department to work against widespread hacker attacks. He noted that the intelligence community has assessed the risks to the United States of such an attack, but the results were classified. Jamie Gorelick, the Deputy Attorney General, who also testified that day, opined that information warfare was the nation's premier security threat and called for a 1990s Manhattan Project to deal with it.
12.Yet, the NII has yet to suffer a major attack or anything close to it, despite numerous smaller attacks. The opportunity for a major attack already exists: the United States has been an automated society for years. Nor does the United States have a single large enemy holding its fire until the time is right. A statistical relationship between an incident's size and its rank (Zipf's Law) suggests that for one very large incident there should be two not-so-large incidents, four lesser incidents, and so on -- each category a geometric ratio of the one above it. A distribution composed of a few cataclysms and a panoply of annoyances would be anomalous. Anomalies prove nothing, but they ought to be addressed by proponents of any NII catastrophe theory.
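A minimal sketch of the arithmetic behind this note (illustrative only; the thresholds and counts are assumptions, not data from the text): under a rank-size law in which the incident at rank r has size proportional to 1/r, each halving of the severity threshold roughly doubles the number of incidents that clear it.

```python
# Under a Zipf-like rank-size law, the incident at rank r has size
# proportional to 1/r, so halving the severity threshold should roughly
# double the number of incidents that exceed it.
def zipf_sizes(n_incidents, largest=1000.0):
    return [largest / rank for rank in range(1, n_incidents + 1)]

def count_at_least(sizes, threshold):
    return sum(1 for size in sizes if size >= threshold)

sizes = zipf_sizes(10_000)
for threshold in (1000, 500, 250, 125):
    print(threshold, count_at_least(sizes, threshold))
# Prints 1, 2, 4, 8: one very large incident should be accompanied by a
# geometric cascade of smaller ones, so a record with one cataclysm but few
# mid-sized incidents would be the anomaly the note describes.
```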
13.Yet, three-quarters of all real-time transactions are still on mainframe-based networks, according to Salvatore Salamone, in "How to Put Mainframes on the Web," Byte 21, 6 (June 1996), 53.
14.Although the creators of Java paid careful attention to security issues when designing the language (essentially disabling some dangerous features of C++), Java remains problematic for systems with hard outer shells (which keep intruders from posing as users) but soft innards (which allow users to wreak havoc on the system). A Java-based program picked up from the Net can do almost anything a user can. Thus, an unpatched bug (e.g., sendmail), which lets users access system administration files, can also allow Java-based agents to do so.
15.Systems can also be attacked physically (e.g., jamming, microwaves, shot and shell), but only for the purpose of physical denial and associated blackmail. Physical attacks require a nearby presence and thus are akin to well-understood acts of terrorism. They carry far greater risks to the attacker than do cyberspace attacks, which can be launched from anywhere on Earth.
16.Queering source code may, ironically, be easier if Ada (DOD's official computer language) is being used for coding. Ada is virtually self-documenting; this allows a hacker to look at the code, identify the purpose of its modules, and edit accordingly. Ada also supports information hiding so that a rogue module can be added without its code being open for inspection.
17.Not every bad egg will harm society: during the Gulf War, sensitive war plans stolen from a car were promptly returned along with a message that the perpetrator, while a thief, was by no means a traitor.
18.This rule admits several exceptions. First, a flooder can curtail communications between the United States and a foreign nation if it can get on the links that connect the two countries. Second, a flooding attack can be aimed at large known reflector sites. Third, under some circumstances, a flow of incorrectly addressed mail, even if smaller than a link's capacity, can clog a router's memory buffers. Fourth, a flooder can try to propagate a virus among networked computers that, upon activation, floods the rest of the network from multiple and unsuspecting sources. The first cannot take down the NII; the second can be turned off; and the third requires a software fix. The fourth may be a real problem but the mathematics for the attacker (how long it takes a virus to infect enough computers to have any effect without being discovered prematurely) are daunting.
19.The problem of insider sabotage is a difficult one, approached through traditional security checks and compartmentalization, authentication methods that link effects to their authors, and, for sensitive areas, the equivalent of dual-key authorization.
20.Operational software must be complemented by data files, which change constantly. Data files, used properly, cannot host viruses; they can be corrupted, but a corrupted data file can do only so much damage.
21.Digital signatures can also inhibit some insider crime. For instance, ensuring that a database can be changed only by a digitally signed entry makes it tamper-resistant; corruption is easier to trace back to a specific individual.
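A minimal sketch of the mechanism this note describes (illustrative only; the author names, keys, and records are hypothetical, and an HMAC over a per-author secret stands in for a true public-key digital signature): a log that accepts only signed entries lets an auditor both detect tampering and attribute each entry to its signer.

```python
import hmac, hashlib

# Hypothetical per-author signing secrets; a real system would use
# public-key signatures so verification does not require the secret.
AUTHOR_KEYS = {"alice": b"alice-secret", "bob": b"bob-secret"}

def sign(author, record):
    return hmac.new(AUTHOR_KEYS[author], record, hashlib.sha256).digest()

def append_entry(log, author, record):
    # Entries are stored with the author's signature over the record.
    log.append((author, record, sign(author, record)))

def audit(log):
    # Flag entries whose contents no longer match their signatures;
    # corruption is thereby traced back to a specific signed entry.
    return [author for author, record, sig in log
            if not hmac.compare_digest(sign(author, record), sig)]

log = []
append_entry(log, "alice", b"balance=100")
append_entry(log, "bob", b"balance=250")
log[1] = ("bob", b"balance=999999", log[1][2])  # simulated tampering
print(audit(log))  # ['bob'] -- the altered entry is flagged and attributable
```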
22.Unfortunately, to be secure, a digital signature needs to be 512 to 1,024 bits long -- thus difficult to memorize. Human use may require hardware-encoded schemes coupled with a personal identification number (PIN) so that theft of the hardware will not reveal the full password.
23.See Lee Bruno, "Internet Security: How Much Is Enough?" Data Communications 25, 5 (April 1996), 60-72.
24.In 1994 and 1996, hackers broke into computers at the Rome Laboratory and at Los Alamos, respectively, using a bug in the Unix sendmail program that had been used in, and thus known since, the Internet Worm incident in 1988.
25.Security research is focusing not so much on how to make systems secure as on proving that they are secure. Detecting failure modes and developing tools, metrics, simulations, and formal models are all being emphasized. It would be nice if systems could be developed that could prove software secure, but considerable effort is needed to verify even a small program. A meta-model of a software system written to highlight a system's security features may be useful, but such effort will compete with all the other meta-models a designer may be asked to create (e.g., to state rigorous architectural assumptions for later integration into other systems). Fortunately, only a small part of most programs deals with access privileges, and this part (if compact and well-identified) is more easily checked than the whole. Another approach is to hire in-house hackers, give them the source code (thus giving them a great advantage over outside hackers -- except for Internet systems, whose source code is public and available), and see how far they get. A third approach is to offer a reward for breaking in (as Netscape has done for its security software) while the product is in beta testing.
26.On 2 June 1996, the London Times reported that banks in London, New York, and Tokyo had paid roughly a half billion dollars in blackmail to "cyber terrorists," who had demonstrated to them that they could bring computer operations to a halt; over three years, they had made more than forty attacks. The Times report has proved unusually difficult to verify, because neither the banks nor the alleged perpetrators (nor anyone quoted in the report) was identified by name.
27.By comparison, the total cost of all credit-card fraud is $5 billion.
28.Stolen intellectual property does not disappear; it is duplicated. Suppose one carmaker performed a billion dollars' worth of product development. Another company hacks into its computers and copies this information. The first company has lost no information; has the second company gained anything that might reduce the former's competitive position? It is probable that all but a small fraction of the research was specific to a particular product and thus offered little of value for the company that illicitly copied the data. SAIC surveyed 40 major corporations that confidentially reported having lost $800 million in 1995 through computer break-ins both in lost intellectual property and stolen money (Steve Lohr, "Ready, Aim, Zap," The New York Times, CXLVI, 50566, [30 Sept. 1996], D4.)
29.An analysis of CERT reports, by John Howard of Carnegie Mellon University, suggested that, after growing apace with the Internet, the number of incidents peaked late in 1993 and has since remained relatively constant.
30.U.S. General Accounting Office, Computer Attacks at Department of Defense Pose Increasing Risks (Washington, D.C.: GAO/AIMD-96-84, May 1996).
31.Most people are probably still loath to entrust their credit cards to the Internet. In the 1950s, only 20 percent of the Americans polled were willing to fly. Aircraft manufacturers quickly realized that their prospects were tied directly to safety concerns. Boeing developed and implemented its "single-failure" philosophy, with the goal of preventing any single aircraft failure from resulting in a crash. Aircraft accidents declined over the next forty years despite more than a tenfold increase in takeoffs and landings. In a similar fashion, the newest version of the Internet Protocol (IPv6) can sharply reduce many threats such as source-address spoofing, source-related routing attacks, password sniffing, and connection hijacking.
32.N.Y.: G.P. Putnam, 1994.
33.Fortunately, the fog of war affects hackers as well. An all-points assault has to work almost everywhere at once; second chances may not arise. Yet, any attack so complicated is difficult to test; an attacker is forced to bet everything on one shot. True, some systems do experience nonlinear failure from relatively small outages (e.g., the electric power grid in the western states); yet finding and exploiting potential cascades before they are found by systems administrators is no mean trick.
34.By shutting down the northeast for half a week, the January 1996 snowstorm cost the economy $10-15 billion. Hurricane Andrew (1992) cost roughly $25 billion. Damage from the 1994 earthquake in Northridge, California, cost roughly $10 billion.
35.Attacking the NII may have a psychological impact disproportionate to its real one. That being so, is the public better served by a Government that magnifies the possibility and the consequences of such an attack; or one which concedes the possibility but puts it in the same category with accidental and natural disasters -- a fact of life whose costs one can minimize but never eliminate?
36.Fast deployments tend to move assets under DOD control (and are easier to protect); slow deployments tend to require public facilities and reserve assets. The latter are harder to protect, but their time-sensitivity is less.
37.Discussed in Roger Molander, Andrew Riddile, and Peter A. Wilson, Strategic Information Warfare: A New Face of War (Santa Monica, CA: Rand Corp., RAND MMR-661-OSD, 1996).
38.In practice, insurance may pay, but insurance rates would come to reflect insurers' judgements about their clients' information security programs. The net effect would be the same.
39.Technically, the NSA does not oversee the use of encryption in the United States, and the export of moderately hard (56-bit) encryption will be permissible if provisions are made for recovering the key pursuant to a legitimate search requirement. However, the NSA's historic secrecy, its role in earlier digital signature and encryption controversies, the impact of restricting for-export software on the capabilities of domestic-only software, and the possibility that key-recovery requirements may obviate particular encryption methods all feed the public perception that the NSA is opposed to encryption. Negative public perception could complicate the DOD's encouragement of private efforts to protect systems.
40.On 15 July 1996, the President's Commission on Critical Infrastructure Protection was formed under Department of Justice leadership to do this. If researchers are diligent, skeptical, and well funded, they should make progress.
41.Nothing prevents system owners from suing their software providers to recover the costs of a hacker attack that can be traced to deficiencies in the software. Shrink-wrapped software, however, is provided "as is" for good reasons; provably secure software scarcely exists.
42.See, for instance, the Computer Science and Telecommunications Board of the National Research Council, Cryptography's Role in Securing the Information Society (Washington, D.C.: National Academy Press, 1996).
43.If deterrence against a state is problematic, deterrence against a stateless organization is even more so (for instance, one must find something worth striking in return). For this reason, the discussion to follow considers only actions against sponsoring states as the clearest case for a viable deterrence policy.
44.Some residents of cyberspace take issue with punishing someone who breaks into computer systems, reads information (and perhaps leaves a calling card), but otherwise does no harm. Most enforcement officials nevertheless favor prosecution. Consider an analogy to the problem of graffiti. Graffiti is minor vandalism; yet, as James Q. Wilson has theorized and the New York City Police Department has concluded, graffiti marks a neighborhood as one whose standards of conduct can be violated with impunity. Citizens feel unsafe and anxious, and the neighborhood is often marked for subsequent, more serious crime. Hacker attacks, even those that cause no damage, can mark cyberspace as a lawless environment.
45.Hackers as motivated as suicide bombers may not be deterred by detection, but the balance of risk and reward felt by their sponsors may be righted by deterrence.
46.Gary Wheatley and Richard Hayes, Information Warfare and Deterrence (Washington, D.C.: NDU Press, 1996), 24. The Defense Science Board (op.cit., ES-3) has also argued:
In the information age as in the nuclear age, deterrence is the first line of defense. This deterrence must include an expression of national will as expressed in law and conduct, [and] a declaratory policy relative to consequences of an information warfare attack on the United States . . .
47.Strategic nuclear deterrence is not the only form of deterrence. So-called tactical nuclear weapons were designed both to deny battlefield objectives and to raise the level of destruction so high as to deter battle in the first place. John Mearsheimer argued, in Conventional Deterrence (Ithaca: Cornell University Press, 1983), that an aggressor can be deterred by the prospect that victory, although likely, will be expensive. A nation that adheres to this theory might invest resources not to increase the odds of defeating aggression but to increase the odds that the aggression would be costly.
48.See Keith Payne, Deterrence in the Second Nuclear Age (Lexington, KY: The University Press of Kentucky, 1996), especially the first four chapters, for an elucidation of this point.
49.Wheatley and Hayes, 12.
50.Such control helps keep the punishment proportional to the incident. To mete out great punishment for a modest incident might make the punishment in and of itself seem an aggressive act; it would also remove the flexibility in responding to an adversary that sees little to be lost in moving from a modest to a major incident.
51.As differentiated from the legal argument that in certain circumstances reprisals in kind may be legitimate under commonly accepted (e.g., Hague) laws of war.
52.Consider the relevance of the notion that a distributed system -- which is what the global information infrastructure is becoming -- is "one in which the failure of a computer you didn't even know existed can render your own computer unusable" (cited from Ivars Peterson, Fatal Defects [N.Y.: Random House, 1995], 121).
53.Physical terrorism lends itself to a simple threshold: did people die (imagine terrorist incidents that would cause great damage without human casualties)? Physical terrorism also seems to be easier to link to specific perpetrators because it leaves physical evidence.
54.What about alternative criteria -- the presence of a hostile rationale, or the attack's systematic nature? Rationales are often unknown even to historians with access to all the documents. "Systematic" is almost as hard to define; mere breadth of attack is inadequate.
55.See Craig Whitney, "Five Americans Called Spies by France, Asked to Leave," New York Times, CXLIV, 49981 (23 February 1995), A12.
56.Paul Blustein and Mary Jordan, "U.S. Eavesdropped on Talks, Sources Say," Washington Post, 118, 316 (17 October 1995), B1.
57.The requirement that a nation's warriors identify themselves as such (e.g., by uniforms or other official gear) reflects laws of war that entitle captured warriors to be treated as prisoners of war rather than as criminals or spies.
58.Compare the Mafiya-connected hackers in Vignette 4 to the "students" in Teheran who held the U.S. embassy hostage in 1979-81.
59.Consider, for instance, the CIA's alleged use of the Mafia to kill Castro in the early 1960s (Alleged Assassination Plots Involving Foreign Leaders: An Interim Report of the Select Committee to Study Governmental Operations with Respect to Intelligence Activities, 94 Cong., 1 Sess., Report No. 94-465 [Nov. 20, 1975], 72-81, 92-97, 109).
60.The downside is that an organization for sale to the "red side" may also be for sale to blue. Because battle damage assessment (BDA) is difficult for information warfare, blue might induce red's contractor to report back to red that blue's systems were successfully attacked when in reality little damage had occurred.
61.Whether this attitude can be sustained in a hypersensitive democracy in which personal or corporate problems can turn into claims on public resources -- even military resources -- is a different question.
62.A term invented by Richard Dawkins (by analogy with "gene") to suggest that memes are ideas that parasitize people into propagating them much as viruses do.
63.The most obvious connection is that propaganda spreads more slowly through a population whose information infrastructure has been crippled or destroyed.
64.See Ethan Watters, "Virtual War and Peace," in Wired, 4.03, 49.
65.See David Parnas, "Software Aspects of Strategic Defense Systems," Comm. ACM 28, 12 (December 1985), 1326-1335.
66.Computer and network scientists worry about whether stringing together enough separate systems may give rise to behaviors neither predictable nor understandable simply on the basis of knowing each piece. Emergent behavior (a term from complexity theory) suggests that the DOD's system of systems, however good it looks in theory, could go haywire even without attack, a possibility that deserves examination.
67.Eliot A. Cohen, "A Revolution in Warfare," Foreign Affairs 75, 2 (March-April 1996), 53.
68.Consider how the Israelis used electronic warfare to achieve an 82-to-0 exchange with Syrian jets in their 1982 confrontation in the Bekaa Valley of Lebanon.
69.Theodore Ropp, War in the Modern World (Durham: Duke University Press, 1959), 183.
70.In the mid-1950s, the Soviet Union used this trick to induce the CIA to overestimate how many Bison bombers it had produced (Dino A. Brugioni, Eyeball to Eyeball: The Inside Story of the Cuban Missile Crisis [N.Y.: Random House, 1991], 9).
71.North Korea's invasion of South Korea was preceded by raids. By October 1973, Egypt had lulled Israel into a false sense of security: during the previous spring, Egypt staged feints that forced Israel to mobilize partially; the feints were revealed as such, leading Israel to conclude that Egypt's pre-invasion covering moves also were feints.
72."Minimum" implies minimum for some purpose. What purpose would define an minimum infrastructure? conducting a nuclear war, protecting the ability to conduct two conventional wars, preserving the public's faith in its institutions?
73.Information security may be tightened in line with rising DEFCONs, but the policy would be correct only accidentally (physical and information attacks are likely to follow different timelines), and the carryover into civilian systems would be, at best, hit or miss.
74.Even the term RAND used for cyberwargaming, "Day After," was adapted from nuclear wargaming.
75.Exceptions include an individual's loss of privacy, the public's loss of confidence in an institution, physical damage (e.g., an unanticipated power outage can freeze aluminum in its smelter pots), and permanent injury or death.
76.Information warfare is likely to have varying impacts on how information is classified. Since anything having to do with intelligence is more highly classified (e.g., top secret, codeword) than matters related to operations, the growing role of intelligence in operations raises security levels across the board. Similarly, because the unclassified portion of the defense information infrastructure -- logistics, deployment -- is most vulnerable to a hacker attack, if their vulnerabilities must be hidden, then systems management data for these systems may have to be classified -- and with it, perhaps the systems themselves. Conversely, the greater the impact of media-based information warfare, the more frequently warriors must justify themselves to the media and thus the greater the pressure to declassify so as to reveal information that would indicate why, for instance, a target selected for destruction (e.g., the ostensible schoolhouse) was believed to be military.
77.Joseph C. Wylie, Military Strategy: A General Theory of Power Control (New Brunswick, N.J.: Rutgers University Press, 1967).
78.The NSA manufactures computer chips at its own on-base silicon foundry.
79.The DOD does operate information systems that cut across Service lines: the Global Command and Control System, various CINC command systems, and intelligence systems. The systems of tomorrow will probably be built from those of today, and many of these systems, particularly those tied to Service-specific warfighting communities, as well as weapons systems, will probably be administered by Service detachments. Even in a truly joint world, ordinary bureaucratic mistrust among different communities -- intelligence, operations -- will persist.
80.Based on the author's conversations with representatives of the New York Times, the BBC (British Broadcasting Corporation), and CNN (Cable News Network).
81.Eli Benjamini, Geoffrey Sunshine, Sidney Leskowitz, Immunology: A Short Course (N.Y.: Wiley-Liss, 1996), 19. Note that replacing "organism" with "nation" and omitting "cell" make the quotation speak precisely to defense.
82.See, for instance, Robert G. Evans, "Health Care as a Threat to Health: Defense, Opulence, and the Social Environment," Daedalus 123, 4 (Fall 1994), 21-42. As Susan Sontag argued in AIDS and Its Metaphors (N.Y.: Farrar, Straus, and Giroux, 1989), such explanations are not always benign:
Military metaphors have more and more come to infuse all aspects of the description of the medical situation. Disease is seen as an invasion of alien organisms, to which the body responds by its own military operations, such as mobilizing of the immunological "defenses," and medicine is "aggressive" [9]. . . . Military metaphors contribute to the stigmatizing of certain illnesses and, by extension, of those who are ill [11] . . . [so that] the effect of military imagery on thinking about sickness and health is far from inconsequential. It overmobilizes, it overdescribes, and it powerfully contributes to the excommunicating and stigmatizing of the ill [94].
83.See, for instance, The Wall Street Journal, 15 Jan. 1996, A1, or "Cyber Wars," The Economist, 338, 7948, (13 Jan. 1996), 77-78.
84.Benjamini et al. provides an excellent introductory text. Also recommended are the September 1993 issue of Scientific American (269, 3), in particular: Sir Gustav Nossal, "Life, Death, and the Immune System"; Irving Weissman and Max Cooper, "How the Immune System Develops"; Charles Janeway, "How the Immune System Recognizes Invaders"; Philippa Marrack and John Kappler, "How the Immune System Recognizes the Body"; and William Paul, "Infectious Diseases and the Immune System." Subsequent articles in Scientific American include: Howard Johnson et al., "How Interferons Fight Disease" (270, 5 [May 1994], 68-75); Victor Engelhard, "How Cells Process Antigens" (271, 3 [August 1994], 54-61); and Martin Nowak and Andrew McMichael, "How HIV Defeats the Immune System" (273, 3 [August 1995], 58-65). The author also gratefully acknowledges the assistance of Dr. Amy Rosenberg (Food and Drug Administration) on this chapter.
85.No simple sketch can adequately convey a system's complexity, and much remains to be discovered about how the immune system works. The immunologist, physician, writer, and philosopher Lewis Thomas, who died of cancer in 1993, once remarked that because the immune system is so complex, he would not know which of his cells to root for to fight his cancer (from Jimmie Holland, "Cancer's Psychological Challenges," Scientific American 275, 3 [September 1996], 160).
86.An antigen is any foreign material that is specifically bound by specific lymphocytes; generally speaking, it is the marker for a pathogen. Some antigens are recognized not by their own epitopes but by the enterotoxins their activity produces (e.g., a rotavirus is detected when its enterotoxin, NSP4, reacts with a T-cell). Some B-cells also recognize polysaccharides (long chains of sugar molecules).
87.By contrast, a tight match with an epitope that occurs after clonal selection in the thymus will energize the lymphocyte. Exactly why is poorly understood. A T-cell has a CD28 receptor which needs to be in contact with a B7 protein to be activated. Such proteins are generally absent during clonal selection, but abundant later in life. However, other mechanisms may play a larger role.
88.The difference between 10,000 and a few million suggests that the average antigen can be recognized by each of a few hundred receptors. This is probably true because (a) antigens contain many different proteins, which themselves contain many different epitopes, and (b) many receptors are sufficient matches for any one epitope.
89.Research, notably by Polly Matzinger of the National Institutes of Health, suggests that to induce antigen presentation dendritic cells need to be activated by signals from other cells dying or otherwise under attack.
90.MHC Class I molecules tend to interact with cytotoxic T-cells and their CD8 receptors. MHC Class II molecules (which have a larger groove for acquiring antigens) are more closely associated with helper T-cells and their CD4 receptors.
91.How the innate immune system reacts to the antigen determines whether T-cells mature into Th1 or Th2 cells. Viral or bacterial stimulation of NK cells promotes Th1 cells, which emit interleukin-2 (IL-2, a T-cell growth factor) and gamma-interferon (IFN-γ). The latter activates cytotoxic T-cells, NK cells, and macrophages, and it stimulates the proliferation, circulation, and presentation of antigens by the Class II MHC proteins they host; this activates yet more helper T-cells. IFN-γ has generic antiviral properties and helps modulate immune reactions by turning strongly stimulated B-cells on and turning weakly stimulated ones off. Mast cell and eosinophil stimulation (probably by parasites) promotes Th2 cells and induces emission of IL-4, IL-5, and IL-6 (all of which stimulate B-cells and immunoglobulin secretion), IL-9 (which activates mast cells), and IL-10. Path selection also affects which kind of immunoglobulin (Ig) molecule is produced when B-cells are stimulated. IFN-γ (the Th1 path) favors IgG subtypes; IL-4 (the Th2 path), IgE. Once a path is selected, it tends to reinforce itself: IL-10 inhibits Th1 formation, while IFN-γ inhibits Th2 formation (the Epstein-Barr virus protects itself by making a protein similar to IL-10 and thereby inhibiting the body's Th1-based defense). In Western societies, the general absence of whooping cough and tuberculosis, which otherwise stimulate Th1 formation, is correlated with a greater incidence of asthma, an allergy-based condition exacerbated by the IgE stimulated along the Th2 path (see William Cookson and Miriam Moffatt, "Asthma -- An Epidemic in the Absence of Infection?" Science 275, 5296 [3 January 1997], 41-42).
92.Partial matches are important in the immune response, because they cause helper T-cells to emit IL-4, which stimulates APCs. Yet, partial recognition tends to put T-cells to sleep rather than stimulate their replication. See Gilbert Kersh and Paul Allen, "Essential Flexibility in the T-Cell Recognition of Antigen," Nature 380 (11 April 1996), 495-498.
93.Most memory cells (helper T-cells and B-cells: see Rafi Ahmed and David Gray, "Immunological Memory and Protective Immunity: Understanding Their Relation," Science 272 [5 April 1996], 54-60) are thought to need constant regeneration, a process facilitated by dendritic cells, which retain just enough antigen to induce new memory cells to replace ones that have deteriorated.
94.This principle underlies vaccination. A dead or weakened virus is injected into the body, where its antigens stimulate an immune reaction rather than full-blown disease, so that a subsequent encounter with the live virus can be rapidly defeated. Occasionally, the second encounter overstimulates the immune system and precipitates an autoimmune response.
95.The basic Ig molecule looks like one or more "Y"s connected at the stem. There are five classes of Ig molecules (with multiple subclasses). IgG is the basic workhorse; it can penetrate all body cavities. IgM (5 Y groups), the second most common, is the largest and the first to be produced when B-cells are turned on; the presence of cytokines determines which class the IgM is converted to. IgA (2 Y groups) is bound in mucous membranes (and thus in tears and saliva). IgE is associated with histamine response, and the functions of IgD are largely unknown. During an immune response, B-cells tend to favor those Ig classes which exhibit the highest affinity for the antigen.
96.A cascade usually begins with a reaction between one antigen and either two IgG molecules or an IgM molecule.
97.Tumors, as opposed to antigens, tend to have less access to B7 proteins and therefore do not elicit so strong an immune response.
98.Carl von Clausewitz, On War (Princeton: Princeton University Press, 1976 [originally published in 1832]), 119.
99.By contrast, computers can differentiate long strings of characters by a difference in one bit; yet imagine the human lifespan were the immune system to crash as often as Windows does.
100.The passing of genes to the next generation rather than the survival of the organism that carries them is what makes traits survive evolution. Traits that ensure survival are important only insofar as they keep an organism alive long enough for it to have and rear children. After that, additional years offer little gene-passing advantage. The immune system has correspondingly selected for traits that fight diseases of childhood and early adulthood, rather than against geriatric afflictions.
101.Once T-cells are activated, they have CTLA-4 rather than CD28 receptors; a reaction which pairs the B7 molecule with the CTLA-4 receptor turns off the synthesis of IL-2 and induces the production of memory cells.
102.IL-16 and IL-12 (which may retard tumor growth by inhibiting the development of blood vessels) are being investigated for use in the treatment of AIDS.
103.See Howard M. Johnson, Jeffrey K. Russell, and Carol H. Pontzer, "Superantigens in Human Disease," Scientific American 266, 4 (April 1992), 92-101.
104.Several features of the immune system tend to be Stalinist: attacking everything it does not recognize, inducing infected cells to commit suicide, "retiring" warriors who have had close contact with the enemy, and eliminating all stray substances that the attacker may feed on.
105.Designers of the Army's All-Source Assessment System have recognized that a phenomenon (e.g., a platoon preparing for a hostile operation) need not be recognized in its entirety in order to trigger a conclusion; a sufficiently good template can be constructed by considering only a fraction of all the relevant attributes.
106.See Jeffrey Kephart, "A Biologically Inspired Immune System for Computers," in Artificial Life IV, edited by R. Brooks and P. Maes (Cambridge, Mass.: Massachusetts Institute of Technology Press, 1994).
107.See S. Forrest, A. S. Perelson, L. Allen, and R. Cherukuri, "Self-Nonself Discrimination in a Computer," in Proc. IEEE Symp. Res. Security and Privacy (Los Alamitos, Cal.: IEEE Comp. Soc. Press, 1994), 202-212.
108.Wolfgang J. Streit and Carol A. Kincaid-Colton, "The Brain's Immune System," Scientific American, 273, 5 (November 1995), 54-61.
109.Fred Iklé and Albert Wohlstetter, Discriminate Deterrence: Report of the Commission on Integrated Long-Term Strategy (Washington, D.C.: U.S. Gov't Printing Office, 1988).
110.Most surveillance satellites fly a polar orbit in a band 200 to 400 kilometers high. A population of several hundred million pellets scattered across the equator in that altitude band would hit such a satellite with a cross section of four square meters once every five years -- halving its normal ten-year lifespan.
111.Perhaps the most useful work on the subject of how to secure computer systems (even though computer security itself gets scant mention) is Nancy Leveson's Safeware: System Safety and Computers (Reading, MA: Addison-Wesley, 1995). The habits and practices required to secure systems against accident and error carry over very nicely to securing them against deliberate attack.
112.From Chapter Five of Essays in Persuasion (New York, W. W. Norton, 1963 [originally published in 1926]).