Mech article in RISKS 16.92

RISKS-LIST: RISKS-FORUM Digest

Thursday 16 March 1995
Volume 16 : Issue 92

   FORUM ON RISKS TO THE PUBLIC IN COMPUTERS
   AND RELATED SYSTEMS (comp.risks)
   ACM Committee on Computers and Public Policy,
   Peter G. Neumann, moderator


------------------------------

Date: Wed, 15 Mar 1995 21:28:37 -0500
From: simsong@pleasant.cambridge.ma.us (Simson L. Garfinkel)
Subject: The Manchurian Printer

The Manchurian Printer, (C) 1995, Simson L. Garfinkel
[The Boston Sunday Globe, March 5, 1995, Focus Section, Page 83]

Simson L. Garfinkel

Early this month, Hewlett-Packard announced a recall of 10,000 HP OfficeJet
printer-fax-copiers. The printers' power supplies may have a manufacturing
defect that could pose an electrical shock hazard. HP says that it
discovered the problem during routine testing. HP was lucky: printers can be
very dangerous devices. A typical laser printer, for
example, can draw hundreds of watts of power, generate internal temperatures
high enough to burn a wayward human hand, and even, under the right
circumstances, start a fire.

Most manufacturers, of course, try to design their printers to minimize such
risks. Increasingly, however, there is a chance that companies might
intentionally design life-threatening flaws into their products so that the
flaws can be exploited at a later time. These fatal flaws might be
intentionally built into equipment manufactured overseas, as a kind of
"insurance policy" in the event of a war between that foreign country and
the United States. The flaws might form the basis for a new kind of
corporate warfare. Or the flaws might be hidden by disgruntled employees
contemplating extortion or revenge.

Indeed, U.S. military planners are increasingly worried about this sort of
possibility, which they place under the heading "Information Warfare." Nevertheless,
although the threat of Information Warfare is very real, an even bigger
danger is that the Department of Defense will use this threat to convince
the new Congress to repeal the Computer Security Act of 1987. This would
effectively allow the National Security Agency to declare martial law in
cyberspace, and could place the civilian computer industry into a tailspin.

To understand what the military is afraid of, imagine the Manchurian
Printer: a low-cost, high-quality laser printer, manufactured overseas, with
a built-in secret self-destruct sequence. For years these printers could lie
dormant.  But send them a special coded message---perhaps a long sequence of
words that would never normally be printed together---and the printer would
lock its motors, overheat, and quickly burst into flames. Such an attack
might be the first salvo in an out-and-out war between the two countries.
Alternatively, an enemy company might simply use printers to start selective
fires, damage economic competitors, take out key personnel, and cause
mischief.

Unlike the movie The Manchurian Candidate, the technology behind the
Manchurian Printer isn't science fiction. Last October, Adobe accidentally
shipped a "time bomb" in Photoshop version 3.0 for the Macintosh. A time
bomb is a little piece of code buried inside a computer program that makes
the software stop running after a particular date. Adobe put two time bombs
into its Photoshop 3.0 program while the application was under development.
The purpose behind the time bombs was to force anybody who got an advance,
pre-release copy of the program to upgrade to the final shipping version.
But when it came time to ship the final version of Photoshop 3.0, Adobe's
engineers made a mistake: they only took out one of the bombs.

An engineer inside Adobe learned about the problem soon after the product
was shipped, and the company quickly issued a recall and a press release.
Adobe called the time bomb a "security code time constraint" and said that
"although this is an inconvenience to users, the security constraint neither
damages the program or hard drive, nor does it destroy any files."

It only takes a touch of creativity and a bit of paranoia to think up some
truly malicious variants on this theme. Imagine that a company wants to make
a hit with its new word processor: instead of selling the program, the
company gives away free evaluation copies that are good for one month.
What's unknown to the users of this program is that while they are typing
their letters, the program is simultaneously sniffing out and booby-trapping
every copy of Microsoft Word and WordPerfect it finds on their systems. At
the end of the month, all of their word processors stop working: instead of
letting them edit their memos, they print out ransom notes.

Any device that is equipped with a microprocessor can be equipped with such
a booby-trap. Radios, cellular telephones, and computers that are connected
to networks are particularly vulnerable, since an attacker can send them
messages without the knowledge or consent of their owners. Some booby-traps
aren't even intentional. What makes them particularly insidious is that it
is almost impossible to look at a device and figure out if one is present or
not. And there is no practical way to test for them, either. Even if you
could try a million different combinations a second, it would take more than
200 years to find a sequence that was just 8 characters long.
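The arithmetic behind that figure is straightforward, assuming the trigger
is drawn from the 95 printable ASCII characters:

```python
# Brute-force search space for an 8-character trigger sequence,
# assuming each character is one of the 95 printable ASCII characters.
combinations = 95 ** 8            # 6,634,204,312,890,625 possibilities
rate = 1_000_000                  # guesses tried per second
seconds = combinations / rate
years = seconds / (365.25 * 24 * 3600)
print(round(years))               # roughly 210 years
```

And that is for an exhaustive search; a trigger phrase of whole words, as
described above, would be longer still.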

* * *

Information Warfare isn't limited just to things that break or go boom. The
Department of Defense is also worried about security holes that allow
attackers to break into commercial computers sitting on the Internet or take
over the telephone system.

"This nation is under IW attack today by a spectrum of adversaries ranging
from the teenage hacker to sophisticated, wide-ranging illegal entries into
telecommunications networks and computer systems," says a report of the
Defense Science Board Summer Study Task Force on Information Architecture
for the Battlefield, issued last October by the Office of the Secretary
of Defense.

"Information Warfare could pervade throughout the spectrum of conflict to
create unprecedented effects. Further, with the dependence of modern
commerce and the military on computer controlled telecommunication
networks, data bases, enabling software and computers, the U.S. must
protect these assets relating to their vulnerabilities," the report warns.

Information warfare changes the rules of war fighting, the report notes. A
single soldier can wreak havoc on an enemy by reprogramming the opposing
side's computers. Modern networks can spread computer viruses faster than
missiles carrying biological warfare agents, and conceivably do more damage.
Worst of all, the tools of the information warrior are readily available to
civilians, terrorists and uniformed soldiers alike, and we are all potential
targets.

Not surprisingly, the unclassified version of the Pentagon's report barely
mentions the offensive possibilities of Information Warfare---capabilities
that the Pentagon currently has under development. Nevertheless, these
capabilities are alluded to in several of the diagrams, which show a keen
interest by the military in OOTW---Operations Other Than War.

"They have things like information influence, perception management, and
PSYOPS---psychological operations," says Wayne Madsen, a lead scientist at
the Computer Sciences Corporation in northern Virginia, who has studied
the summer study report.  "Basically, I think that what they are talking about
is having the capability to censor and put out propaganda on the networks.
That includes global news networks like CNN and BBC, your information
services, like CompuServe and Prodigy," and communications satellite
networks. "When they talk about 'technology blockade,' they want to be able
to block data going into or out of a certain region of the world that they may
be attacking."

The report also hints at the possibility of lethal information warfare.
"That is screwing up navigation systems so airplanes crash and ships run
aground.  Pretty dangerous stuff. We could have a lot of Iranian Airbuses
crashing if they start screwing that up," Madsen says. Indeed, says Madsen,
the army's Signal Warfare center in Warrenton, Virginia, has already invited
companies to develop computer viruses for battlefield operations.

Our best defense against Information Warfare is designing computers and
communications systems that are fundamentally more secure. Currently, the
federal organization with the most experience in the field of computer
security is the National Security Agency, the world's foremost spy
organization. But right now, NSA's actions are restricted by the 1987
Computer Security Act, which forbids the agency from playing a role in the
design of civilian computer systems. As a result, one of the implicit
conclusions of the Pentagon's report is to repeal the 1987 law, and untie
the NSA's hands. Indeed, the Pentagon is now embarking on a high-level
campaign to convince lawmakers that such a repeal would be in the nation's
best interests.

This argument confuses security with secrecy. It also ignores the real
reasons why the Computer Security Act was passed in the first place.

In the years before the 1987 law was passed, the NSA was on a campaign to
expand its power throughout American society by using its expertise in the
field of computer security as a lever. NSA tried to create a new category of
restricted technical information called "national security related
information." They asked Mead Data Central and other literature search
systems for lists of their users with foreign-sounding names. And, says
David Banisar, a policy analyst with the Washington-based Electronic Privacy
Information Center, "they investigated the computers that were used for the
tallying of the 1984 presidential election. Just the fact that the military is
looking in on how an election is being done is a very chilling thought. After
all, that is the hallmark of a banana republic."

The Computer Security Act was designed to nip this in the bud. It said that
standards for computer systems should be set in the open by the National
Institute of Standards and Technology.

Unfortunately, the Clinton Administration has found a way to get around the
Computer Security Act. It's placed an "NSA Liaison Officer" four doors down
from the NIST director's office. The two most important civilian computer
standards to be designed in recent years---the nation's new Escrowed
Encryption Standard (the "Clipper" chip) and the Digital Signature
Standard---were both designed in secret by the NSA. The NSA has also been an unseen
hand behind the efforts on the part of the Clinton Administration to make
the nation's telephone system "wiretap friendly."

Many computer scientists have said that the NSA is designing weak standards
that it can circumvent, so that the nation's information warfare defenses do
not get in the way of the NSA's offensive capability.  Unfortunately,
there's no way to tell for sure. That's the real problem with designing
security standards in secret: there is simply no public accountability.

In this age of exploding laser printers, computer viruses, and information
warfare, we will increasingly rely on strong computer security to protect our
way of life. And just as importantly, these standards must be accountable to
the public. We simply can't take our digital locks and keys from a Pentagon
agency that's saying "trust me."

But the biggest danger of all would be for Congress to simply trust the
administration's information warriors and grant their wishes without any
public debate. That's what happened last October, when Congress passed the
FBI's "Communications Assistance for Law Enforcement Act" on an unrecorded
voice vote. The law turned the nation's telephone system into a surveillance
network for law enforcement agencies, at a cost to the U.S. taxpayer of $500
million.
