Concerns and Remedies

C4IFTW will bring about a series of changes that will profoundly affect both the nature of information available to the warfighter and how this information will be disseminated. Concerns arise regarding the impacts that these changes will have on the decision process and upon decision makers. Other concerns involve new or increased vulnerabilities associated with information age systems and processes. Finally, a set of concerns centers on our ability to design and acquire systems given the information age realities of increased reliance on COTS hardware and software and the ever-shrinking technology life cycle.

An analysis of the specific concerns identified revealed that suggested remedies fell into the following groups:

Concepts/Doctrine
Education, Training, and Exercise
Testing, System Design, and Specifications
Organization and Procedures
Tools, Models, and Decision Aids

Specific issues, grouped into the five areas identified above, were examined against the types of remedies appropriate to avoid or manage them. As these areas were reviewed and remedies considered, the same basic set of remedies arose over and over again, making it clear that a coherent program of remedies can and should be developed. The remainder of the chapter is devoted to a discussion of each of the following areas of concern and the remedies associated with them:

Nature of the available information
Dynamics of information dissemination
Impact on military decision making
Vulnerabilities arising from the information systems themselves
C2 design and acquisition issues

Available Information

Non-essential information swamping critical information becomes a major issue for C4IFTW. The sheer volume of information received could frustrate the ability to quickly identify critical information for the decision at hand. To avoid situations in which more information than can be processed is presented, decisions must be made about what information is really needed, what is nice to have, what is irrelevant, and what is potentially distracting or confusing.

The requirement for information clearly depends upon both the mission and the situation. Unless individuals are given an opportunity to think through what they really need, requirements for information will always be inflated. Furthermore, this cannot be a "paper" exercise. Individuals with appropriate military experience must be placed in realistic situations and must be allowed to experiment with different amounts and types of information. The lessons learned from these experiments can be used as inputs to requirements and design analyses.

While refining information-related requirements in a more systematic manner will certainly help, it will not be sufficient to avoid the effects of information overload. Better education and training, devoted to information processing under stress and in environments characterized by uncertainty, are needed to develop the necessary skills to handle these information-rich situations. System designs must track these "information domains."

Finally, practice is key to perfecting and maintaining the skills necessary to function in an information-intensive environment. Therefore, exercises, "on the job" training, and continuing professional education need to be added to complete the necessary set of "remedies" associated with increases in the amounts of information that will be provided.

Sophisticated presentations can also obscure vital information and/or mask poor quality or incomplete data. Designing presentations that illuminate issues and facilitate decision making involves tradeoffs and choices between "raw" or unprocessed data and information that contains a mixture of "fact" and inference. Often fusion algorithms or decision aids "fill in the blanks" and provide users with inferences from available data. In some cases, valuable information is lost in the process. The remedies to address this concern include those discussed above, as well as the development of visualization techniques that enable individuals to better understand the nature of the underlying data for a given presentation.

Uncertainty regarding the quality of the information being presented, or doubts about its integrity, could lead to a lack of confidence that inhibits use of information or intelligence systems. Decision makers clearly need confidence in the reliability, currency, and accuracy of data in order to act on it. In the information age, the integrity and authenticity of the data are important as well and should be considered as additional measures of merit (MOMs) for information. In addition to the remedies discussed above, effective defensive IW protection measures and decision aids need to be developed that permit decision makers to rely on the authenticity and integrity of the data. Presentation techniques that convey the quality of the underlying data are an important issue in their own right.
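
To make the notion of machine-checkable integrity and authenticity concrete, the sketch below is a minimal illustration in Python, assuming a hypothetical shared key distributed out of band and an invented report format. It attaches a keyed hash to each report when it is sent and verifies it on receipt, so that an altered report is flagged before a decision maker relies on it.

```python
import hashlib
import hmac
import json

# Hypothetical illustration: a shared secret (assumed to be distributed out of
# band) protects the integrity and authenticity of a report on the grid.
SHARED_KEY = b"distributed-out-of-band"

def sign_report(report: dict) -> dict:
    """Attach a keyed hash so recipients can verify integrity and origin."""
    payload = json.dumps(report, sort_keys=True).encode()
    tag = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return {"report": report, "tag": tag}

def verify_report(message: dict) -> bool:
    """Return True only if the report matches the tag computed by the sender."""
    payload = json.dumps(message["report"], sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["tag"])

signed = sign_report({"unit": "TF-1", "status": "ammunition low", "time": "0430Z"})
assert verify_report(signed)           # untampered report passes
signed["report"]["status"] = "ammunition full"
assert not verify_report(signed)       # altered report is flagged for the decision maker
```

The sketch is not a substitute for a full defensive IW architecture; it simply shows how an integrity and authenticity check could be surfaced to the user as one additional measure of merit for a piece of information.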

Dynamics of Dissemination

Not only is the amount of available information dramatically increasing as the information age unfolds, but our ability to disseminate this information widely is keeping pace. As information sources proliferate, individuals increasingly receive inputs from multiple sources in a less than coordinated manner. This asynchronous arrival of information has been found to confuse and distract decision makers. Studies have also shown that the weight individuals place upon information may be related to the order in which that information is received. This is potentially dangerous because it can lead to differences in individuals' perceptions of a situation.

The C4IFTW concept virtually assures that individuals will be receiving different information in different sequences. To avoid the potential pitfalls associated with this phenomenon, education and training are needed to heighten awareness of these issues and help individuals assimilate new data into their "information domains." Doctrine is needed to ensure that behavior is consistent across the organization. Display techniques are required to facilitate information collection and analysis. Decision aids are needed to help synthesize and fuse information on a continuing basis.

As with other concerns discussed previously, practice is a key element in ensuring that individuals develop and maintain proficiencies in dealing with this potentially confusing phenomenon.

Given the thrust of defense initiatives, particularly DISA's Global Grid and the Army's efforts to digitize the battlefield, there will be an enormous increase in the amount of information moving through communication pipes. With the C4IFTW vision of a mix of information "push" and "pull" with an emphasis on "pull," inability to anticipate or control requests for information could result in system degradation, particularly in times of great stress. In these situations, vital as well as non-vital information flow may be affected. To avoid this potentially crippling scenario, appropriate policy, doctrine, and procedures regarding the use of information retrieval mechanisms need to be developed and instituted. Again, education, training, and practice are required to raise awareness of the problem and to develop the skills needed to operate in a "degraded" information environment. Network tools are also needed to provide warnings when the limits of the distribution system are being approached and to help bring the situation under control. Finally, the design of our information distribution infrastructure needs to maximize robustness. The only certainty is that systems will not be used exactly as intended or under precisely the conditions assumed in their design, development, and testing.
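
A minimal sketch of the kind of network warning tool envisioned above follows; the link names, capacities, and thresholds are illustrative assumptions rather than features of any fielded system.

```python
from dataclasses import dataclass

# Hypothetical sketch of a network warning tool: link names, capacities, and
# thresholds are illustrative assumptions, not features of a fielded system.

@dataclass
class Link:
    name: str
    capacity_kbps: float   # engineered capacity of the communications pipe
    offered_kbps: float    # current offered load from "push" and "pull" traffic

def check_links(links, warn_at=0.80, shed_at=0.95):
    """Classify each link so operators see trouble before the grid saturates."""
    for link in links:
        utilization = link.offered_kbps / link.capacity_kbps
        if utilization >= shed_at:
            yield link.name, "SHED: defer non-vital pulls, protect vital traffic"
        elif utilization >= warn_at:
            yield link.name, "WARNING: approaching distribution limits"
        else:
            yield link.name, "normal"

links = [Link("SATCOM-1", 512.0, 505.0), Link("TACLAN-3", 2048.0, 1400.0)]
for name, status in check_links(links):
    print(name, status)
```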

Decision Making

The linkages between information quality, distribution, communications patterns, and decision making are complex and diverse. A review of organization theory, group dynamics, information theory, and past research on command and control offers key insights into these linkages and how they function.

First, when information is freely available, role overlap tends to be commonplace. Superiors tend to micromanage, particularly when the stakes are high; there are no higher stakes than combat. Subordinates, however, when provided with the larger picture historically available only to senior commanders, are also likely to second-guess decisions made at higher levels and (in richly connected systems) have the information required to undertake initiatives their superiors may find inappropriate. Avoiding this set of counterproductive behaviors and management practices requires doctrine, appropriate organizational structures, self-discipline, and training.

Second, decision making in an information rich environment increasingly means media attention. The pressures of a "fish bowl" environment affect performance in a variety of often adverse ways. Tendencies to overreact, to act quickly, to appear decisive despite limited information, or to "posture" for the media can only be overcome through realistic training and experience.

When decision making becomes a collective process, which tends to occur when several principals have easy access to one another in a situation they all consider important, decisions tend to converge on options that meet group consensus. This "collective wisdom" has been demonstrated in both theoretical and empirical analyses to tend strongly toward risk-averse options or poorly thought-out "group-think" alternatives. The "brilliant" alternative or innovative approach foreseen by one individual is unlikely to survive this deliberative process. The potential strength of this collective process, which has excelled at solving complex problems such as those at the operational and strategic levels of combat, can only be achieved by an open approach to command and control decision making and a doctrine that stresses individual innovation and leadership at all levels.

Fully connected systems also reduce the need for detailed action coordination by commanders because they make available information that would otherwise have to be requested from other elements in a classic military information structure. For example, rather than having to request information about the availability of transportation assets or ammunition needed for a combat operation, a line commander will be able to check stock levels directly through the information grid. This can lead to insufficient or ineffective coordination because subject matter experts are not consulted or because more than one command makes plans to use the same asset while none has a clear commitment of asset availability. Industry experience with richly connected systems has shown that collaborative planning and decision aids (which automatically perform coordination tasks and/or pass information between nodes in decision-making structures) are needed to avoid these problems. In addition, "red team" procedures to cross-check decisions can help ensure adequate, timely coordination.
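
As a hypothetical illustration of such a collaborative planning aid, the sketch below records explicit commitments against an asset so that a second command planning against the same asset sees a conflict rather than an apparent surplus; the asset names and quantities are invented for illustration.

```python
# Hypothetical sketch of a coordination aid: asset names, quantities, and the
# in-memory ledger are illustrative assumptions, not a fielded system.

class AssetLedger:
    """Track committed quantities so planners see real, not apparent, availability."""

    def __init__(self, stock):
        self.stock = dict(stock)      # total on hand, as visible on the grid
        self.commitments = []         # (command, asset, quantity) triples

    def available(self, asset):
        committed = sum(q for _, a, q in self.commitments if a == asset)
        return self.stock[asset] - committed

    def reserve(self, command, asset, quantity):
        """Grant a commitment only if uncommitted stock remains; otherwise report a conflict."""
        if quantity > self.available(asset):
            return False              # forces coordination instead of a silent double-booking
        self.commitments.append((command, asset, quantity))
        return True

ledger = AssetLedger({"heavy-lift sorties": 10})
assert ledger.reserve("1st Brigade", "heavy-lift sorties", 7)
assert not ledger.reserve("2nd Brigade", "heavy-lift sorties", 5)   # conflict surfaced for resolution
```

The essential design point is that availability is computed from commitments rather than from raw stock levels visible on the grid, which is precisely the information a line commander checking stock levels directly would not see.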

As generations of military commanders who have become accustomed to the availability of high-density, high-quality data about the battlefield mature and move into senior command positions, the expectation of near-perfect information and the willingness to delay decisions in anticipation of better information will grow. However, the very rapid pace of future battles, as well as the imperatives of turning inside adversary decision loops, will punish procrastination and inaction severely. The commander who waits for near-perfect information will be defeated by one who acts on "good enough" information. Doctrine and effective training for commanders must instill the judgment required to differentiate between information that is sufficient and information that is merely desirable.

Because of the increased pace of battle and the high lethality expected in future battlespace, more and more decisions will be assigned to expert systems. This will include not only "sensor-to-shooter" linkages, where the identification, assignment, and killing of targets must be so rapid that unaided human decision making cannot keep pace, but also other complex, rapidly developing domains such as logistics planning, air tasking order development, and medevac helicopter routing. However, development, testing, and training are inadequate to ensure confidence in these systems. Testing is particularly important. Technology demonstrations are a good, cost-effective way to gain user feedback and to develop positive attitudes toward these systems, but operational testing in realistic field conditions is also necessary to avoid system failure or lack of use in the field. Failure during early field experience will poison attitudes that can only be overcome slowly and at great expense; thus, care must be taken to involve users early in the design process.

Finally, by their very nature as automatons, computer systems have no inherent ability to recognize their own limitations. When applied in inappropriate circumstances, they will produce answers which may be "logical" but quite incorrect. The entire process, from concept through design, testing, and doctrine development, must include a recognition of this inherent problem. Ultimately, humans must make sound decisions about when and under what circumstances to rely on automated systems.

Vulnerabilities

As the sophistication of the military information systems support structure grows over time, the inherent vulnerabilities will become more important. Planning and doctrine can minimize these vulnerabilities, but they cannot be safely ignored.

First, all military equipment is in danger of capture. Even rear areas are raided to capture or destroy vital elements of important systems. Hence, steps must be taken to prevent equipment loss, to ensure that losses are known, and to frustrate enemy exploitation of captured systems. Unique keys that identify and authorize users on particular systems, devices that report the current locations of key hardware items via satellite, authentication procedures, and security codes will be important defensive measures. Doctrine and training to ensure their proper use will also be necessary.

Moreover, DoD's increasing reliance on COTS hardware and software increases vulnerabilities by making military systems familiar to sophisticated adversaries and by exposing them to software developers and technicians who are not subject to security regulations. Hence, design and acquisition procedures need to consider security and minimize exposure. Indeed, some systems may be too sensitive to rely on COTS designs or procurements.

As the information "grid" becomes readily available in the battlespace, the system's vulnerabilities will increase because: (a) the number of valid users with access to the system rises, magnifying the "insider" threat; (b) the number of nodes and connection points grows, providing adversaries with more opportunities to penetrate the system from the outside; and (c) if a compromise does occur, the perpetrator will have access to more information than would have been available in the past.

Indeed, as this system grows and becomes more fully interconnected, the mere task of noticing a penetration or penetration attempt becomes extremely difficult. Often system problems cannot be readily diagnosed as "natural" or the product of information warfare attacks. Even a single penetration can be extremely damaging, particularly in a richly connected information system. Obviously, some data (such as concepts of operations, planning documents, and orders) are extremely sensitive. A well-crafted "worm" or computer virus can spread literally with the speed of light once inside a complex system. Moreover, knowing that databases have been penetrated and may be corrupted can be expected to greatly inhibit decisive and effective decision making. New types of defensive decision aids will be needed to detect, assess, and counter such attacks.
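
One narrow element of such a defensive decision aid might resemble the sketch below, which baselines cryptographic fingerprints of critical files and flags any later deviation; the file names, stand-in contents, and alerting mechanism are illustrative assumptions only.

```python
import hashlib
from pathlib import Path

# Hypothetical sketch of one narrow defensive aid: baseline critical files and
# flag later changes. File names and contents are stand-ins for illustration.

def fingerprint(path: Path) -> str:
    """Cryptographic hash of a file's current contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def take_baseline(paths):
    """Record known-good fingerprints for later comparison."""
    return {str(p): fingerprint(p) for p in paths}

def audit(baseline):
    """Yield files whose contents no longer match the baseline."""
    for name, expected in baseline.items():
        p = Path(name)
        if not p.exists() or fingerprint(p) != expected:
            yield name

critical = [Path("oplan_current.txt"), Path("ato_cycle.txt")]   # invented file names
for p in critical:
    p.write_text("initial contents")          # stand-in data so the sketch runs end to end
baseline = take_baseline(critical)

Path("ato_cycle.txt").write_text("altered contents")   # simulate corruption or tampering
for changed in audit(baseline):
    print(f"ALERT: {changed} differs from baseline; assess for corruption or attack")
```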

C2 Design and Acquisition

Because the inventory of information systems will inevitably continue to undergo rapid development and replacement, the design and acquisition arenas become crucial in the defense against many vulnerabilities and represent an opportunity for proactive postures to prevent or limit exposure.

Because they focus on definitive, exhaustive testing against technical, often arcane, specifications, traditional test and evaluation procedures have developed a bad reputation in the operational community, where they are viewed as often preventing the adoption of a "good enough" system. Technology demonstrations have emerged as a way of exposing new systems to operators and operational conditions without having to address arcane testing standards. Reliance on demonstrations alone can be equally unhealthy, however, because it encourages adoption of systems that have not really been tested at all. A more robust, integrated, and operationally oriented process of user assessment and realistic application, including baselines and benchmarks to ensure that new systems add measurable capability, is needed.

DoD's increasing reliance on COTS is having an almost unnoticed deleterious impact on the U.S. Government's in-house capability to maintain the expertise required to adapt COTS systems and create capabilities not needed by the commercial sector. The engineering base required to meet military standards is an essential element of a COTS reliance strategy. A coherent program designed to maintain and exercise this capacity is needed. At least part of this program could be devoted to the post-deployment support of information systems. In many cases, these systems will need to be revised in order to maintain interoperability with new systems, a process that necessitates linking COTS systems with military requirements. This means not only building linkages between systems, but also having the capacity to "reengineer" the systems and the processes the systems support.

Because C2 systems are never complete and will be continuously undergoing transitions, the ability to maintain mission capability while upgrading or integrating systems remains crucial. This capability requires planning and creativity. The Army's concept of selecting one unit as a "living test bed" for new ideas and equipment and fielding only what is successful in the chosen environment represents one approach to this problem. Other approaches, such as parallel operation of new and old systems during a test period, may be attractive in some circumstances.

Finally, COTS reliance in military systems is very different from relying on commercial systems. Plans for DoD to rely on commercial satellite communications systems must recognize that other clients can make demands on these systems and may limit DoD's access to them in times of crisis. Moreover, commercial services are not always designed for graceful degradation or fully backed up in the event of system failure. Hence, basic "availability" will be an issue when relying on commercial systems, particularly in times of crisis, and needs to be addressed (a) when contractual arrangements are made and (b) when contingency planning is done for crises.


