McNair Paper 29, Notes
1. Alvin Toffler and Heidi Toffler, War and Anti-War (Boston: Little Brown, 1993).
2. Late in 1994, the two directorates negotiated a formal division of labor. How well that division holds up as the roles and missions of information warfare come up for decision remains to be seen.
3. Readers are also pointed to Julie Ryan, "Offensive Information War," a paper presented to the Naval Studies Board of the National Research Council (Washington, DC), 8 Sept. 1993.
4. Seminal works on information warfare include George Stein, "Information War -- Cyberwar -- Netwar," Air War College, 1993, and John Arquilla and David Ronfeldt, "Cyberwar Is Coming!" Comparative Strategy, 12 (1993), 141-165. The definitions used there differ from those used here. Netwar for those authors is akin to psychological warfare against both national will and national culture; cyberwar is command-and-control warfare, which broadly includes psychological operations against opposing commanders. A tenth of the latter work is devoted to Genghis Khan's use of such techniques.
5. Anne Wells Branscomb, Who Owns Information? From Privacy to Public Access (N.Y.: Basic Books, 1994), 1.
6. MOP-30, which is currently being revised, slices and dices information warfare out to operational units. Limited to military operations, it covers "the integrated use of operations security, military deception, psychological operations, electronic warfare, and physical destruction, mutually supported by intelligence, to deny information to, influence, degrade or destroy adversary C2 capabilities while protecting friendly C2 capabilities against such actions" (2).
Mapped onto the schema used here, MOP-30 covers C2W, the anticommunications aspect of EW, defensive IBW, and unit-level psychological operations: "these [PSYOP] forces used in C2W are, in most cases, the same forces used to conduct other aspects of warfare, and unless they represent some unique capability, will move in the same flow as the units to which they are organic" (20).
7. An extended period of material deprivation coupled with continuous carpet bombing prior to the ground offensive has also been cited.
8. Physical (weak) and virtual (strong) decentralization are different: physical decentralization retains a centralized information architecture but protects the system by dispersing and replicating memory and processing; virtual decentralization makes subunits capable of operating on their own but uses coordination with the center to strengthen the quality of their decisions.
9. Whiteboarding is a network application that permits what is put on one person's screen to come up on another's.
10. Part of the Southern strategy in the Civil War was to conduct raids against railroads and telegraph lines used by Union forces; by 1864, nearly half the Union strength was devoted to occupation duties and protecting its lines of communication.
11. U.S. officers in Vietnam must often have wished fervently (though silently) that communications lines between them and Saigon were severed, or at least those from Saigon to the field.
12. It is also possible that the supposed disagreement between Belgrade and the Bosnian Serbs may have been disinformation designed to reduce the West's pressure on the Serbian economy.
13. See, for instance, Paul Bracken, The Command and Control of Nuclear Forces (New Haven: Yale Univ. Press, 1983); Bruce Blair, Strategic Command and Control (Washington, D.C.: Brookings Institution, 1985); Ashton Carter et al., Managing Nuclear Operations (Washington, D.C.: Brookings Institution, 1987); or earlier classics such as Thomas Schelling, Nuclear Weapons and Limited War (Santa Monica, Calif.: Rand, 1959) and Herman Kahn, Escalation: Metaphors and Scenarios (N.Y.: Praeger, 1965).
14. Observe, Orient, Decide, Act.
15. Sending a query to a system that is answered automatically.
16. See Eliot Cohen and John Gooch, Military Misfortune (New York: Free Press, 1990), which argues, among other views, that the U.S. Navy's slowness during World War II to institute a centralized learning process retarded the development of proficiency in submarine and antisubmarine operations on the Atlantic front.
17. See, for instance, Jeff Cooper, "Toward a Theory of Coherent Operations," SRS Technologies, 30 June 1994.
18. See Charles A. Robert, "Digital Intelligence Extends Army Force Projection Power," in Signal, 48, 12 (August 1994), 33-35.
19. Giving every soldier the commander's view of the battlefield can create a major vulnerability. Capturing a soldier and his equipment can give the enemy the same view. This could nullify, with one stroke, whatever prior advantage the other side had at information-based warfare. It also would reveal how such a view was obtained, and thus the capabilities -- or even better, the blind spots -- of the other side. This creates a major problem. How does one explain to troops at risk that information on the enemy that, as they see it, may affect their survival must nevertheless be withheld from them, even though its transmission is physically possible, and, indeed, easy? Efforts to control such information are more likely to be frustrated from within than from without.
20. Although most modern platforms will probably evolve to reduce observability, cost considerations are likely to relegate stealth to specialized uses (e.g., deep attack, support to special operations forces), and traditional forms (e.g., submarines).
21. In lower density realms -- plains, deserts, blue water -- a man-made object, particularly a military one, will stand out as not belonging there, so, to avoid becoming a target, an object should, instead, resemble the background, rather than ambient man-made clutter.
22. As an example, a reasonably detailed (.1 meter resolution) multispectral (8 bits x 8 bands) image of a typical theater of operations (400 kilometers on each side) generates an image that, uncompressed, takes up a million billion bits of information. Even with compression and selective intelligent updating, the bandwidth required to send the same information over the air to a location behind the lines does not exist in the electromagnetic spectrum.
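The arithmetic behind note 22's "million billion bits" can be checked directly; a minimal sketch, using only the figures stated in the note (0.1-meter resolution, 8 bits x 8 bands, a theater 400 kilometers on a side):

```python
# Back-of-the-envelope check of the image-size claim in note 22.
# All input figures are taken from the note itself.
resolution_m = 0.1          # 0.1 meter resolution
side_m = 400_000            # 400 km per side
bands = 8                   # multispectral bands
bits_per_band = 8           # bits per band per pixel

pixels_per_side = side_m / resolution_m       # 4 million pixels per side
total_pixels = pixels_per_side ** 2           # 1.6e13 pixels
total_bits = total_pixels * bands * bits_per_band

print(f"{total_bits:.1e} bits")               # ~1.0e15: a million billion bits
```

The result, roughly 1.0e15 bits, bears out the note's figure.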
23. As a perhaps apocryphal illustration, although Winston Churchill was said to know through Britain's Enigma system of codebreaking that the Germans would bomb Coventry, he decided to abjure countermeasures that might reveal that the British had broken the German codes.
24. Consider the close alignment between the former Joint Electronic Warfare Center in San Antonio, Texas (now the C2 Warfare Center), and the co-located USAF Information Warfare Command.
25. Antiradar techniques can be generalized to antisensor techniques (e.g., the use of flares to confuse IR-guided missiles). The important characteristic of radar is that it receives reflected, as opposed to passive, electromagnetic radiation; radar signals can be attacked coming or going.
26. Voice calls have certain patterns in terms of who talks when and what percentage of the time is filled with blank time (e.g., listening). Encryption techniques can mask blank time patterns. False emitters can generate false conversations from random locations.
27. In a trivial comparison, an F-18, with its pilot, FLIR sensors, and attached weapons, can link all three with wires or in the pilot's mind and is therefore far more resistant to jamming.
28. For single DES, with its 56-bit key, x = 56. Triple DES is comparable to an 80-bit key, or x = 80. The formula works differently on PKE, because the challenge to the code breaker is to factor a product of two prime numbers rather than guess a correct key. Although today PKE software can support key lengths of 1,024 bits (and thus unbreakable in the foreseeable future), PKE is roughly a thousand times more computationally inefficient than DES and is best used to pass DES keys back and forth.
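The work factors implied in note 28 can be illustrated with a short sketch (an assumption of this illustration, consistent with the note, is that a brute-force attack on a symmetric cipher must try on the order of 2^x keys):

```python
# Work factors for brute-force key search, per note 28: an x-bit
# symmetric key implies on the order of 2**x trial decryptions.
def trial_keys(x_bits: int) -> int:
    return 2 ** x_bits

single_des = trial_keys(56)   # single DES: 56-bit key
triple_des = trial_keys(80)   # triple DES: ~80-bit effective key

# Each added bit doubles the search, so the 80-bit search is
# 2**24 (~16.8 million) times larger than the 56-bit one.
print(triple_des // single_des)   # 16777216
```

This is why the move from single to triple DES, a difference of only 24 effective bits, changes the attacker's burden by seven orders of magnitude.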
29. If the timing of the message is part of its content (e.g., Global Positioning System [GPS] timing signals), a system could be fooled when its original message would be blocked and retransmitted at a slightly different time -- provided the recipient lacked an accurate clock.
30. Otherwise, every aspect of war might be included, because breaking the enemy's will is generally the fundamental aim of military operations (e.g., the Coalition's use of carpet bombing against Iraqi positions prior to ground operations had an immense and perhaps decisive psychological impact).
31. Aideed's ingenious use of tom-toms, satellite terminals, and radio transmissions that bounced off city walls and so were difficult to geolocate has been cited as an indication that he understood other aspects of information warfare.
32. Masterpieces of the military art, according to Winston Churchill, contain "an element of legerdemain, an original and sinister touch, which leaves the enemy puzzled as well as beaten" (The World Crisis, 1915 [London, 1923], 21).
33. Although unexpected success can be disorienting, it is easier on the ego and unlikely to induce a disturbing reevaluation of one's competence (few people dwell on an inability to forecast their triumphs). The unexpectedly successful are also unlikely to be forced into subsequent decisions.
34. See defense intelligence-based warfare for mention of deception at the tactical level.
35. Thanks to George Kraus (Science Applications International Corporation) and Allen Carley (CIA) for suggesting this line of argument.
36. When cryptography was weak, one method of deception was for one side to let a message fall into the other side's hands as though it were an accident. Now that cryptography is strong, such serendipity is likely to be met with more suspicion.
37. Samuel Huntington, "The Clash of Civilizations?" Foreign Affairs, 72, 3 (Summer 1993), 22-49.
38. Winn Schwartau, Information Warfare (N.Y.: Thunder's Mouth Press, 1994; not necessarily recommended, but indicative). The literature on computer crime and security is extensive. Much of what sells is of the bogeyman variety, but serious works exist, for instance, one from the Computer Science and Technology Board of the National Research Council, Computers at Risk (Washington, D.C., National Academy Press, 1991).
39. Many holes persist because they are concomitants of desirable features. Passwords, for instance, chosen by users are easier to remember but they are also easier for hackers to guess.
40. A logic bomb is a program that some time after it is inserted destroys a computer's programs and data. A Trojan horse is a program taken in by a host computer which is then subject to attack from it. A sniffer sits on a host network and collects passwords and other similarly revealing information.
41. According to U.S. News and World Report's Triumph Without Victory: The Unreported History of the Persian Gulf War (N.Y.: Times Books, 1992), the U.S. was able to hack Iraq's air defense computers by slipping several cooked electronic microchips into a French-made computer printer smuggled into Iraq during Desert Shield (224-225). The chips contained a virus that disabled computer systems by making it difficult to open a "window" on the computer screen without losing data. Because a peripheral device is rarely the site of a virus, it was a good entry point for insertion. Unfortunately for an otherwise amusing tall tale, a printer is rarely checked for a virus because it is designed to send control codes (e.g., "printer out of paper"), not operational codes, back to the computer; an attempt to send bits running code would be treated by the printer as erroneous or irrelevant to printer control codes. Had this incident been a good hack as the tale suggests, the United States might have wanted to do it again, so why would anyone talk about it?
42. Schwartau discusses "bit flipping," the use of microwave beams to attack computers so that they generate random errors, but in a work characterized by anecdotes, none is offered on this subject.
43. Eliot Cohen, "What to Do about National Defense," Commentary, 98, 5 (November 1994), 21-32. Cohen argues that "the networking of military organizations by electronic communications will ... create new opportunities for warfare by ... computer worms and viruses ..." (23). He adds that "Future tools of information warfare will include satellite television broadcasts, the disruption of financial systems, the forging of all kinds of electronic messages, and the corruption of databases" (31), and goes on in the next paragraph: "Elite command units worry about their members -- trained in the black arts of breaking and entering, not to mention other, far nastier, criminal skills -- going bad. It has rarely happened, in part thanks to successful screening and training; but as the military breeds more information warriors, one wonders if such screening will continue to be effective. The temptations of computer hacking are far wider and stronger (among other things, it is much less violent and can be far more lucrative) than, say, assassination for pay." See also Peter Schwartz's interview with Andrew Marshall in Wired, 3, 4 (March 1995), 138.
44. One of the more outrageous fallacies to have garnered serious research dollars was the concept that U.S. forces could somehow broadcast viruses into enemy computers. It might be possible if the computer systems of the opposition were designed to accept over-the-air submissions of executable code, but who would design a system to do that?
45. In theory, a communications system can be jammed -- rendered inoperable by having its nodes flooded with meaningless messages that prevent meaningful traffic from getting through. In practice, jamming requires knowing the precise architecture and capacity of the system nodes and links. Straightforward jamming is difficult to accomplish without leaving a fat bit-stream trail pointing back to the perpetrator. Some networks can be rendered inoperative by "alert storms," in which a glitch in the system causes the network nodes to send out alert messages that slow the system and thus make it generate yet more alert messages. Attacking a system this way, though, requires perhaps a better knowledge of the system than its designers had.
46. Computer security experts whisper darkly that banks have buried large losses due to computer fraud but have, of course, kept silent on the subject.
47. In an important exception to that generalization, the Internet has become a conduit for a large chunk of the DOD's nonsensitive but, in bulk form, essential logistics traffic.
48. The January 1991 incident, during which phone service in the northeast United States was crippled, was traced to a specific piece of software from one vendor.
49. Conversation with author at NIST, 9 March 1994.
50. More plausibly, a component in a military item meant for field use may, on receiving a signal at a given frequency, either die or go crazy. Similarly, equipment tied to networks may have trap doors available to the original vendor. Both situations are more plausible security faults than poisoned chips in commercial use.
51. Military systems vary, of course, in security regimes -- most are office automation systems. A recent red team test suggests that a moderately skilled hacker could assume superuser status on a surprisingly high percentage of systems used by the DOD. In almost all of these cases, penetration was not discovered. One reason is that computer systems administration (hence security) is not a career track in the military. The Defense Information Security Administration (DISA) has been given almost $1 billion to lower those percentages.
52. Thus, it might not have been in North Vietnam's interest to hire hackers to disrupt U.S. systems just when the country was trying to build support in the U.S. for disengagement of U.S. forces.
53. Another problem is that effective tools of computer security usually require encryption and digital signatures, best served by PKE, but this technology poses the greatest threat to NSA's core competence, signals intelligence.
54. If the United States were to take down Teheran's phone lines without owning up, who would notice the disruption, given the number of times those systems ordinarily malfunction?
55. Because computer security experts generally regard hacking as immoral, most of them would be reluctant to cooperate even with government hackers; and sensitive customers might want to see the source code, to assure themselves of the security of a system.
56. Some information flows (e.g., television, broadcast, telephone conversation) are also a large part entertainment. Although cutting them off might hurt morale, it would remove a major distraction from the war effort. Removing imported entertainment would leave a population with no alternative to local, hence chauvinistic, sources of culture and political influence.
57. See M.C. Libicki, "What Makes Industries Strategic" (Washington, D.C.: National Defense University, McNair Paper No. 5, November 1989), which explores this logic.
58. Important exceptions include steel and military goods.
59. Information Terrorism, a forthcoming book by Paul Strassmann, the former DoD czar for information systems, presents a broader view of hacker war than as personally directed attacks.
60. An intelligent agent might be used to book flights; it would know that its owner, for instance, preferred an aisle seat in the back and, for short connections, would rather rent a car and drive than take a puddle-jumper.
61. (N.Y.: Ace Science Fiction, 1984).
62. See, for instance, Martin C. Libicki and CDR Jim Hazlett, "Do We Need an Information Corps?" in Joint Force Quarterly, 2, 88-97.
63. Even though potential opponents of the United States are likely to try almost everything (e.g., ground-to-air systems or target stealth and hardening) but air-to-air combat to neutralize U.S. air power, the U.S. Air Force's desire to purchase the F-22 for air supremacy persists.
64. That is, it can detect and counter the frequency hopping, spread spectrum, or chirping systems of the other side quickly enough to achieve the same effect.
65. For instance, if a 1.2 kHz bandwidth signal (sufficient for an STU-III digital signal) is spread over a 120 MHz band, then a blind jammer must be roughly a hundred thousand times as powerful as the original signal generator (assuming the distances are the same) to do its job.
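The jamming margin in note 65 follows from the ratio of spread bandwidth to data bandwidth (the processing gain of a spread-spectrum link); a minimal check using the note's own figures:

```python
# Processing gain of a spread-spectrum link, per note 65: the ratio of
# the spread bandwidth to the underlying data bandwidth.
signal_bw_hz = 1.2e3    # 1.2 kHz digital signal (from the note)
spread_bw_hz = 120e6    # spread over a 120 MHz band

processing_gain = spread_bw_hz / signal_bw_hz
print(f"{processing_gain:,.0f}")   # 100,000 -- a blind jammer needs ~100,000x the power
```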
66. According to this concept, a piece of intellectual property (e.g., a video) would be altered or salted slightly with pseudo-random bits for each customer, who may then choose to copy the product illegally for a friend. If the friend's copy is found, enough bits in the original will indicate the original (and guilty) party.
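The "salting" scheme in note 66 can be sketched in a few lines. This is an invented illustration, not the scheme the note's proponents describe: the bit positions to flip, the use of the customer ID as a pseudo-random seed, and the helper names are all assumptions.

```python
# A minimal sketch of note 66's idea: flip a customer-specific,
# pseudo-randomly chosen set of bit positions (seeded by customer ID)
# so that a leaked copy identifies the customer who received it.
# All details here (seeding, bit count, helpers) are invented.
import random

def fingerprint(data: bytes, customer_id: int, n_bits: int = 16) -> bytes:
    rng = random.Random(customer_id)               # per-customer salt pattern
    positions = rng.sample(range(len(data) * 8), n_bits)
    out = bytearray(data)
    for p in positions:
        out[p // 8] ^= 1 << (p % 8)                # flip one bit
    return bytes(out)

def matches(copy: bytes, original: bytes, customer_id: int) -> bool:
    # A suspect copy implicates a customer if it carries exactly that
    # customer's salt pattern.
    return copy == fingerprint(original, customer_id)

original = bytes(1000)
leaked = fingerprint(original, customer_id=42)
print(matches(leaked, original, 42), matches(leaked, original, 7))
```

In a real product the flipped bits would have to sit in perceptually insignificant positions, and the scheme would need redundancy to survive partial copying; the sketch shows only the identification logic.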
67. Worse, the Services are cutting back on Foreign Area Officers, so that the cultural context of this wiring may be missing.