COL SAFETY 12/7/95
        Bureaucrats apparently belong to an odd subspecies of humanity that seeks to turn every problem into a document.

        Most of the time that's merely an irritation, but when the bureaucrats in question are the ones who run the Defense Nuclear Facilities Safety Board, and the problem is how to avoid accidents with plutonium, well, frankly I'd rather have a solution than a document.

        Two of the five board members were in Livermore, Calif., Wednesday for a public hearing on safety procedures at the Plutonium Facility at the Lawrence Livermore National Laboratory. The facility in Building 332 was on "standby" from April 6 until Oct. 13 while lab officials revised their safety manuals in accordance with new Department of Energy guidelines, and teams from inside and outside the lab evaluated whether they'd done it right.
        The standby was triggered when visitors from the DNFSB noticed that the Facility had missed a routine Daily Surveillance — the peculiar capitalizations and the abundant acronyms are part of the lingo, of course. The Facility's air pressure is kept lower than the pressure outside, so that if its air becomes contaminated the contamination won't leak out; the pressure differential is constantly monitored, and the monitors sound alarms if anything goes wrong. Those safety measures weren't compromised. But a new requirement had just been added, that a daily log of inspections be kept, and nobody thought to tell the people who worked the first weekend about it.

        I wouldn't exactly say
I think "plutonium is my friend" but neither do I overreact to
insignificant risks just because they happen to involve nuclear
materials. The lab could have fixed the problems without shutting
down its operations.

        But no harm was done by the temporary shutdown, and on Wednesday the safety board members came to hear the lab's "lessons learned" report.

        "The missed surveillance was an indication of other systematic problems," said Donald Alves, manager of the Plutonium Facility.

        That's the kind of
statement eagerly seized on by reporters looking for sensation, but
the kind of problems he means are yawningly far from sensational.
        There was "insufficient formality of operations," Alves said, and droned on through a thick set of Vu-graphs explaining how they were now more formal: Safety Analysis Reports, Technical Safety Requirements, Surveillance Requirements, Limiting Conditions for Operation, and even a formal Tickler System.

        "We moved from a knowledge-based to a procedural system," Alves said.

        But they haven't moved far enough for one safety board member, Dr. Herbert Kouts.

        "Where is the S/RID?" he wanted to know.
That's a Safety Requirement Identification Document, I found out later. "You don't have a single document that the department can point to to say, this is what you're going to do for safety.

        "S/RID is five years old. I'd have thought the document would exist by now."

        Am I the only one who wonders whether all
this harping on documents is an unambiguously good idea? Strict
adherence to established procedures is clearly desirable, up to a
point. The Chernobyl accident has been blamed on workers who
decided to ignore some safety rules. But when something happens
that isn't provided for in the rules, rigidity turns to paralysis.
        "We have deluded ourselves into thinking that the right
decisions will be ensured if we build enough procedural
protection," says Philip Howard in his recent book "The Death of
Common Sense." But often the opposite happens. "Decisions, if they
happen at all, happen by default."

        Howard cites, among many
examples, the billion-dollar flood when the Chicago River broke
through an old railroad tunnel into the basements of downtown
office buildings. The leak had been noticed weeks earlier, when it
was small, but instead of fixing it right away, Chicago officials
hesitated — and then put the job out for bids.

        Several hundred million gallons of river water is bad enough, but do we really want the people working in the nation's nuclear facilities trained never to do anything that isn't according to procedure?

        Evidently the Department of Energy does. Douglas Eddy, who
oversees the lab's facilities for the DOE office in Oakland, said
one of the biggest parts in the move away from knowledge-based
management was training in "verbatim compliance, and having
(workers) understand what verbatim compliance meant."

        Please
understand that I am not arguing against safety. "Safety is not
separable from operations," Kouts said in his presentation, and I
agree with that. And there have been plenty of real safety problems
in the history of the American nuclear weapons program. The safety
board was set up in 1988 precisely to provide an independent voice
to address them. Those interested in what it does can check out its
new Internet home page, at www.dnfsb.gov, which offers "expanded
public access to documents related to its public health and safety
mission."

        But before we all march headlong into a future of "verbatim compliance," I hope someone is considering at what point the outcome is not more safety, but less.
