#3 Management causes of the accident
Posted by RDN under Mind & body on 22 January 2011
In some sense all errors are human. Reactor 4’s design made it fallible, but Soviet secrecy made it impossible for its designers to explain the weaknesses of their work. Soviet bureaucracy also made it likely that the reactor might not be well built and maintained. If top people were over-confident (at least in public), junior people were over-confiding, and had to be so in private as well as in public.
The design and planning elites were secretive about their failings. The plant’s managers were both rule-bound and aware that many rules were flawed and had to be over-ridden in some circumstances, including on the fatal night.
The Soviet elite and their servants
The RBMK was designed and managed by the elite of a sprawling network of institutes, ministries, and military organisations, some of which over-lapped, and all of which thrived by their ability to deliver the goals of the most senior echelons of the Communist Party, which itself had complicated and competing channels of power.
Mostly designed within powerful institutes run by powerful Academicians of the Soviet science establishment, the RBMK reactors went on to be built and run by ministry officials, many of the most powerful of whom worked with the Ministry of Medium Machine Building, a secretive military-industrial complex with covert operations which controlled cities and plants throughout the Soviet Union. But there was also a competing energy ministry (which actually ran the Chernobyl plant). All these bodies were committed to maintaining a supply of nuclear-generated electricity.
The designers not only produced a fallible reactor; they also wrote a manual for its operation which diverted attention away from its weaknesses and helped its operators miss the warning signs about them.
This was a culture in which senior people, including designers, were not frank with the people who had to work their machinery, and in which those who ran the machinery could not rigorously interrogate the designers.
There was little regulatory tension. Though, following the Three Mile Island scare in the US, one Academician had been charged with setting up an institute with a regulatory role, it was weak. There was no-one outside the nuclear bureaucracy charged with seriously challenging it, and certainly no challenger who was feared as much as those demanding that power keep flowing.
It would be quite wrong to paint these people as careless or stupid. Far from it: there was idealism, intelligence and passion in many of them. Unit 4 blew up because it was routine to try to improve safety procedures. The same Soviet apparatus which produced the April 26 accident also managed a vast amount of nuclear generation with few major accidents (though it denied those it had). It was a similar apparatus, after all, which first put a man in space, and which has always been able to service its space stations.
Alcohol was probably as important as subservience in defining how things used to be run. And then there was the problem of over-manning: with a very large team on hand to run everything, elementary good sense required that it be clear and understood how the real power structures worked, and who the real workers were.
But by their lights, almost everyone was conscientious. In a typically Soviet manner, they were required to be obedient and politically correct, but capable of ducking and weaving as well.
Indeed, one might argue that every country produces a nuclear elite which mirrors the strengths and weaknesses of its national culture. It became fashionable to argue that what the Soviet Union most lacked was a “safety culture”, and there’s merit in the argument. This is to say that everything everyone does around a plant should be predicated on alertness and caution. There is a sense, too, that an industry cannot have a safety culture unless it is institutionally challenged by regulators. Actually, even a “safety culture” may produce its own problems: for instance, a box-ticking devotion to precaution, with imagination gradually stifled. Anyway, for intelligence, ingenuity and courage, it would be hard to beat the Soviet machinery which found itself with the Chernobyl accident on its hands. We see that, arguably, in the extraordinary response of the Soviet system to the crisis.