29 December 2007 | Paul Rogers
On 16 July 1945, an experimental plutonium-fuelled implosion device with an explosive yield of over 10,000 tons of conventional high explosive (i.e., ten "kilotons") was tested in the New Mexico desert. The nuclear age was born. Within a month, two more atomic bombs had been used, this time against Japanese cities, and the United States and its allies had already set up a production line capable of turning out two bombs a month, with the aim of destroying Japanese cities one by one until the war ended.
In the event that was not needed, for Japan surrendered shortly after being on the receiving end of the second bomb, dropped on Nagasaki on 9 August 1945. Thereafter, in the early years of the cold war with its Soviet adversary, the United States moved rapidly to become a nuclear superpower; within three years it had built a stockpile of fifty bombs. The Soviet Union tested its own first nuclear weapon in 1949; by 1953 the rival states had tested far more destructive thermonuclear weapons, and Britain had become the world's third nuclear power. There were attempts in this first phase of the nuclear age to contain its dangers - including proposals named after the US presidential adviser Bernard Baruch and the then Soviet delegate to the United Nations, Andrei Gromyko (both presented in June 1946) - but both collapsed amid a welter of east-west suspicion.
An age of fear
At the time, the destruction of Hiroshima and Nagasaki was to many strategists no more than an extension of the already intense air war against Nazi Germany and Japan. What had previously required a thousand or more bombers now required just one, but there was in this view no intrinsic difference between the destruction wreaked on Dresden and Hamburg, Tokyo and Osaka, and the devastation of Hiroshima or Nagasaki.
By the mid-1960s, things were looking very different. The US and Soviet Union (by then joined as nuclear powers by Britain, France and China) were gradually amassing what would become thousands of nuclear weapons. Many of these were grotesquely larger in destructive capacity than the earlier bombs, making the term "kiloton" redundant and requiring the measure "megaton" (a million tons) to be introduced.
Both sides developed multi-megaton thermonuclear bombs. To take one example: the Soviets put a twenty-five megaton warhead - 2,000 times the size of the Hiroshima bomb - on their SS-9 inter-continental ballistic missile (ICBM). If fired at central London, a single SS-9 warhead would have destroyed the entire city out to the M25 orbital motorway, killing at least 7 million people and starting fires across southeast England.
By the early 1980s, the United States and the Soviet Union had between them amassed well over 60,000 nuclear weapons; other states had by then joined or planned to join the nuclear club, and there was collectively enough firepower to destroy all the world's major cities many times over. A nuclear war - which seemed far from improbable as the "second cold war" unfolded, with Ronald Reagan as US president and before Mikhail Gorbachev came to power in Moscow - would have unleashed a "nuclear winter" across the whole of the northern hemisphere, rendered vast tracts of territory radioactive and set back the human community by centuries.
There is an argument that the very lethality of nuclear weapons meant that they helped keep the peace, so great was awareness and fear of their power of annihilation. This argument is most unwelcome in the many parts of the "third world" where proxy east-west wars were fought during the cold war. These wars - in Korea, the Horn of Africa, Vietnam, Afghanistan, Angola, Nicaragua and other countries - killed at least 10 million people, maimed millions more and wrecked economic life.
In any case, the argument for the nuclear danger's constraining effect involves a high degree of after-the-fact rationalisation. The unfolding of the cold war, a gradual process marked by sudden crises and tense moments, was also a perilous time in which the world came close to disaster on several occasions. As well as political stand-offs (such as the Cuban missile crisis) which took the superpowers to the brink, there were large numbers of accidents involving nuclear weapons, some of which were lost and never recovered. A more sober assessment of the cold war, then, is that no one "won" it and that humanity avoided disaster more by luck than by good judgment.
A generation later, the nuclear age is far from over. The arsenals may have shrunk but there are still many thousands of weapons in existence. Israel, India, Pakistan and North Korea have all joined the cluster of nuclear-weapon states, and there has been dispute for years over the nature and extent of Iran's nuclear plans. The spread of technological know-how and materials reinforces the current concern with proliferation rather than all-out nuclear war. But there is an element of complacency here too; the very fact that the cold war between the superpowers is over has tended to lull many people into a false sense of security. The world may no longer - at least at present - be on the edge of a nuclear abyss, but it does face clear and probable dangers (which are highlighted by proliferation) unless the ideal of a nuclear-free world can be realised.
A century of risk
The period since the high point of the cold war is one that offers a fresh historical perspective on the ongoing nuclear age. It is clear that 1945 inaugurated a period in human history when technological developments made it possible, for the first time ever, for the human community to inflict massive damage on the entire planet. Moreover, later developments indicate that the nuclear age was followed in this respect by several comparable technological innovations whose effects could also be catastrophic.
In this larger perspective, it is very probable that a specific century - 1945-2045 - marks a period when the human community faces the choice of taking decisions that lead to immense self-destruction or of acquiring the wisdom to handle its own destructive capabilities. Nearly two-thirds into that century, there are both worrying and hopeful signs.
The original danger, of nuclear war, remains; though there are the beginnings of a recognition across the political spectrum that a period is opening which offers the best chance for at least a decade to curb proliferation. The next review conference of the nuclear non-proliferation treaty (NPT, which was signed in 1968 and entered into force in 1970), due in 2010, might have some real prospect of progress.
In this respect, a new face in the White House after the November 2008 election could make a difference. An even bigger boost would come if the British abandoned their militarily ridiculous and massively expensive plan to replace the Trident nuclear system (see "Britain's nuclear-weapons fix", 28 June 2006). Some countries have already gone nuclear-free - Kazakhstan, Ukraine, Belarus and South Africa - but none were significant players on the nuclear stage; a change by the British, in contrast, would be a significant boost to the non-nuclear case. The fact that the final Trident-replacement decision is not due for several years - even though government and parliament sanctioned the replacement in March 2007 after a simulacrum of public debate - means that a revision of view is still in principle possible (see "Britain's defence: all at sea", 6 December 2007).
The current recognition of nuclear dangers is, then, a cautiously hopeful indicator. But far too little attention is given to biological weapons. This is less because they are at present capable of being used to catastrophic effect than because of the potential that biotechnological and genome-based scientific developments could lead to the production of bio-weapons that really would carry this capacity. An international agreement that bans such weapons is in place - the biological and toxin weapons convention (BTWC), which was signed in 1972 and entered into force in 1975 - but it is toothless without proper verification systems.
There was hard work by negotiators in Geneva from the late 1990s to strengthen this treaty. After it came to power in January 2001, the George W Bush administration wrecked the process - partly because of its antipathy in principle to any kind of multilateral arms treaty, and partly because of concerns over commercial confidentiality if its biotech companies were exposed to inspection. Other countries hid behind the United States' action, the net effect being to waste almost a decade in the effort to control weapons that could become unimaginably potent. Once more, a change in the White House might make a difference, but only if other states and citizen groups are alert to the urgency of the issue.
Both nuclear and biological weapons are products of human technological ingenuity that also embody a potential to create catastrophe. There may well be other such developments in the next two or three decades; they could include directed-energy weapons and the offshoots of the nanotechnology revolution. Together, however, these trends represent just one of the two existential risks of the current, 1945-2045, century.
The ozone example
The second such risk was not recognised in 1945. Indeed, it was not widely apparent for another forty years. But by the mid-1980s, environmental scientists had begun to understand that, for the first time in history, the activities of the human community could actually affect the entire global ecosystem. True, local impacts had been felt for hundreds, even thousands of years; and the effects of (for example) air pollution, water pollution, land dereliction, and desertification had all been felt in individual countries. But this was something different.
The first sign was the progressive damage to the ozone layer in the upper atmosphere caused by chlorofluorocarbon (CFC) pollutants, seen most clearly in each spring's "ozone hole" over Antarctica. The fact that the ozone layer blocks harmful ultraviolet radiation from the sun made its loss potentially catastrophic; the danger was sufficiently obvious for rapid international action to be taken to phase out the use of CFCs. The Montreal protocol (which was signed in 1987, and entered into force in 1989) was impressive for the speed at which it was agreed. The ozone layer may still take decades to recover fully, but the CFC ban remains a unique example of a worldwide agreement to respond to a planetary threat.
The challenge of responding to ozone depletion, however, was easy compared with that posed by climate change. The risks from climate change are every bit as great as the loss of the ozone layer; and to control and limit carbon emissions in the way needed will require intentional transformations in economic organisation that exceed any such change in the last several centuries. Here, the outlook for progress is in the balance.
A major problem is that although climate change may be speeding up, it is still happening slowly; the main effects may be felt only around 2030-40, yet effective action has to be taken by 2012-17. This will be difficult. Ozone depletion was obvious and immediate, and solutions were straightforward. Climate change is complex; some aspects remain disputed; some countries could actually gain in the short term; and there are powerful vested interests in the fossil-fuel industry and global politics that seek persistently to deny the trends.
The next mountain
What, then, of the balance-sheet? In the century from 1945 to 2045, the human community has been and is required to confront two existential problems. The first is its capacity to damage and possibly even destroy itself through new weapons technologies. It survived the first dangerous phase of the nuclear era less by wise decisions than by good fortune, and it now has an opportunity to render nuclear weapons obsolete. The less widely acknowledged danger of new bio-weapons may actually be more problematic, but progress on both issues could be rapid if a new period of serious multilateral engagement can be fostered by 2009-10 (see Dan Plesch, "Disarmament: the forgotten issue", 12 December 2007).
The second problem is the recently developed capacity to have an impact on the entire global environment. Here, too, the message is mixed. Ozone depletion was in a sense the "marker" of humanity's new capacity. It was specific and had a relatively straightforward solution, but the rapidity of international action in the mid-1980s was still impressive. Climate change is a problem of far greater scope, and will be considerably more difficult to handle. At the same time, awareness of the issue is growing month by month.
In this respect, it is the period through to around 2015 that is key. If genuinely major changes are made in that time - perhaps through a unique combination of citizen action and political acumen - then the prospects could be good. By 2045, the world may have learned both to control adverse technologies and to safeguard the biosphere. In the context of the entire century, it may well be the seventh decade - 2005-15 - that proves pivotal.
The world has entered interesting times, and they are about to get more so.
http://www.opendemocracy.net/article/conflicts/global_security/century_change