Writing & Lobbying

It is up to all of us to inform people about the TPNW. Please submit news or views that we might publish by email


6 – 9 August 2020. Seventy-five years after the event, as survivor Setsuko Thurlow urges us to sign the pledge to work for the Treaty, there is a time for reflection as well as a call for action, and one of the traditions around doing this is fasting. Iona Soper is one of those undertaking this challenge. Her reflections for each day of the fast follow.


Why do we choose to fast? 

Let’s talk about Operation Starvation, or, The Other Way They Won The War. 

On 1 July 1946 – less than one year after the bombings – the United States Strategic Bombing Survey was released. The report was created by a group of experts, aiming to provide an impartial review of Anglo-American strategic bombing activities during the war. With regards to the atomic bombings of Japan, the report concluded: “Based on a detailed investigation of all the facts, and supported by the testimony of the surviving Japanese leaders involved, it is the Survey’s opinion that certainly prior to 31 December 1945, and in all probability prior to 1 November 1945, Japan would have surrendered even if the atomic bombs had not been dropped, even if Russia had not entered the war, and even if no invasion had been planned or contemplated.” 

This piece of writing has been central to the campaign against nuclear weapons as evidence that the atomic bombings were not – as the historically popular account holds – necessary to bring about the end of the war in the Pacific, and that the Allied forces were well aware of this. After all, US Secretary of War Henry Stimson was later to comment: “No effort was made, and none was seriously considered, to achieve surrender merely in order not to have to use the bomb.” But I don’t want to talk about that right now. I want to talk about the circumstances that made this statement so accurate. I want to talk about the other way they won the war. 

‘Operation Starvation’ refers to a campaign of strategic mining by the US Army Air Forces and the US Navy, intended to blockade, disrupt and destroy Japanese ports, coastal waters and shipping lanes. From 27 March 1945, US aircraft laid 12,135 mines, sinking or damaging 670 ships. At more than 1,250,000 tons lost, Operation Starvation sank more shipping by tonnage in the final months of the war than had been lost by all other nations combined. 

Japan had already been subjected to a rigorous naval blockade throughout the war, with Allied submarine forces sinking its merchant fleet, intercepting transports of goods and troops, and cutting off nearly all the oil imports essential to weapons production and military operations. Japan had already suffered some of the worst hunger of any nation during the war. Of 1.74 million Japanese military deaths from 1941 to 1945, as many as 1 million were due to starvation. Operation Starvation, much like the atomic bomb, upped the stakes of war by deliberately targeting the civilian population of Japan. 

Operation Starvation all but halted Japan’s import of critical raw materials and food. The combination of naval blockades, coastal mining and incendiary raids by the US Air Force over urban and industrial areas meant that Japan’s overall rate of production in 1945 sat at just one third of the figure for the year before. By 1945 the average Japanese citizen was living in starvation: one survey found that the average caloric intake for a Japanese citizen in 1945 was less than 80% of the minimum required for basic health and physical performance. 

It was predicted by experts of the time that by the end of 1946 the number of deaths by starvation would exceed seven million. 

Japan had been stripped of all access to food and essential resources, and its population was weak and dying. Long before August of 1945, it was accepted that the prospect of continuing the war into 1946 was a virtual impossibility. Surrender was inevitable. And then came the bombs. 

The bombs destroyed many of the cities’ supplies and irradiated what they did not. Unbeknownst to the surviving populations, already living in profound hunger, crops, grains and even streams and rivers had become toxic and deadly. The damage caused by Operation Starvation continued to obstruct Japanese supply lines, hindering efforts to provide relief and distribute what little food remained. By the end of 1946, the average caloric intake for the populations of Hiroshima and Nagasaki was at just 40% of the minimum required for basic health. In the months following the bombings, groups of evacuated children, now orphans, returned to the remains of the cities and lived there in makeshift camps, starving. Survivors recall that some were so hungry, they died with stones in their mouths. Although an official count of Japanese deaths by starvation following the surrender in September 1945 has never been taken, a respected Japanese historian, Daikichi Irokawa, has written that “immediately after the 1945 defeat, some estimated that 10 million people were likely to starve to death”. 

Reflecting, 75 years on, I have to ask: how many more innocents had to suffer and die – and are still suffering, and dying – all because the Allies felt the need to win the war twice? This notion that the atomic bombings were necessary to bring about the end of the war is utterly prevalent in the collective mindset. I hear it most frequently in classrooms, when I go to speak about the experiences of A-bomb survivors, and it breaks my heart, because that’s where I first heard it too, from my teachers; were it not for my circumstances launching me headfirst into the anti-nuclear campaign four years ago, I’d probably still believe it now. 

Not only do we need to evolve the way we educate the next generations about nuclear weapons, but we need to evolve the way we have conversations about security, both in public and behind closed doors. Public discourse has favoured the atomic bombings on the shaky grounds that they prevented even more widespread violence and starvation – as if, despite Japanese feelers for an amicable surrender, these were the only two options available. In April 1945, when Field Marshal Henry Maitland Wilson wrote from Washington that the US was eager to know British views on the approaching use of the atomic bombs against Japan, the discussions that followed in Westminster focused only on the phrasing of Britain’s assent. 

When the suffering of millions is on the table, and peace has no place in the conversation, we need to start a better conversation. 

FUNDING HUNGER

Why do we fast? Let’s talk about the funding of fear over food.

In 1983, the International Fast For Life was nominated for the Nobel Peace Prize, in recognition of the widespread global attention it generated for the issue of nuclear disarmament. From 6 August 1983, over 150 fasts took place in 24 countries. For many organisers in Europe and North America, the fast was open-ended and subject to demands, akin to a hunger strike. For thousands more, fasting in solidarity across the world, this was an unprecedented piece of international direct action, a revolutionary act of total nonviolence. And the movement flourished. In Italy, over 40 new peace groups were formed. Over 300 Parisians were arrested when a Fast For Life banner was hung over the iconic Arc de Triomphe. In Scotland, an open letter signed by the heads of all Scotland’s major churches was delivered to the Queen and Prime Minister Thatcher. By September, an open letter to fasters was published by the World Council of Churches, reading, “Your fasting has fed the solidarity of all who hunger for disarmament. In your weakness you have made us strong”. 

Fasting in this context takes on a dual purpose. Fast For Life was born in the United States at the height of the Cold War nuclear arms race, with the threat of an ‘all-out’ nuclear war seemingly imminent. Its founders had concluded that “the nuclear crisis of that time was so grave that people of peace may have to offer up their lives in an effort to prevent the continuation of the silent holocaust of world hunger and the impending holocaust of nuclear fire”. Fasting was thus used as a nonviolent form of self-imposed suffering, a means of drawing attention to a much graver evil. But the decision to fast was also born from a desire to emphasise the cruelty of governments’ financial prioritisation of weapons of mass destruction while so many populations, both foreign and domestic, lived in poverty and starvation. Given that 1983 saw 15% of US citizens living below the poverty line, as well as the beginning of the Ethiopian famine that claimed an estimated million lives, it is little wonder that funding hunger was selected as the proposed alternative to funding death. And it’s an argument that remains crucially relevant today. 

In the 37 years since the first International Fast, food inequality and human hunger have remained a widespread and urgent threat, particularly in regions affected by conflict or poverty. The UN Sustainable Development Goals, adopted in 2015 and signed by 193 countries including the UK and the USA, pledge a commitment to “ending hunger, achieving food security and improving nutrition and promoting sustainable agriculture”. The 2020 edition of the UN Global Report on Food Crises reported 135 million people worldwide living in crisis – before coronavirus. Now, the pandemic risks almost doubling that number by the end of the year. Vulnerable citizens living on the poverty line in ‘food-crisis’ countries may be tipped into malnutrition by the resulting scarcity of food coupled with rises in prices. And for those already malnourished, as little as a 5-10% decrease in caloric intake can prove lethal. 

Lockdown regulations and safety measures designed to save citizens from the spread of the virus, in particular restrictions of movement, are restricting labour practices and obstructing the production, processing and transporting of food, delaying the whole process and reducing the availability of many essential foodstuffs. This shortage will hit hardest among those already most vulnerable to starvation: the unemployed, the impoverished, and the displaced. António Guterres, Secretary-General of the United Nations, writes that “At this time of immense global challenges, we must redouble our efforts to defeat hunger and malnutrition. We have the tools and the know-how. What we need is political will and sustained commitment by leaders and nations. This report should be seen as a call to action”. 

Multiple agencies have attempted, since 1949, to calculate the financial cost of ‘ending world hunger’. In truth, a settled figure is surely unreliable given the many inconsistent factors that create and maintain poverty and food insecurity – war, economic collapse, unemployment, climate change, pests, and, as we have seen all too cruelly this year, outbreaks of disease. Typically using 2030 as the target year for success, the best recent estimates place the figure anywhere from 7 billion USD per year to 265 billion USD per year. By comparison, the United States administration’s current plans for the U.S. nuclear weapons programme are projected to cost 494 billion USD from 2019 to 2028. In the United Kingdom, the replacement nuclear weapons programme is expected to cost 205 billion GBP (268 billion USD). Global spending on nuclear weapons is projected to reach a trillion USD over the next ten years. In 2019, it amounted to almost 140,000 USD every minute. 
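The per-minute figure can be sanity-checked from the annual totals. A minimal Python sketch – assuming ICAN’s estimate of roughly 72.9 billion USD of global nuclear weapons spending in 2019, an external figure not stated in this piece:

```python
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes in a (non-leap) year

# Assumed 2019 global nuclear weapons spend (ICAN estimate, USD) --
# an illustrative figure, not taken from this article.
annual_spend_2019 = 72.9e9

per_minute = annual_spend_2019 / MINUTES_PER_YEAR
print(f"2019 spending per minute: ~{per_minute:,.0f} USD")  # ~138,700 USD

# A decade of projected global spending vs. the annual cost estimates
# for ending world hunger cited above (7-265 billion USD per year).
decade_spend = 1e12
hunger_high = 265e9
print(f"A trillion USD covers ~{decade_spend / hunger_high:.1f} years "
      "of even the highest hunger-eradication estimate")
```

Even at the highest published estimate, a single decade of projected nuclear spending could fund hunger eradication for nearly four years; at the lowest estimate, for well over a century.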

Those who favour nuclear weapons will argue that redirecting these funds would mean defunding defence and reinvesting the money in international humanitarian aid – to them, a dangerous and unthinkable choice. To this, I would ask: what is food security, if not defence against starvation? It’s 2020 and we no longer have a Ministry of War. Once we pause to consider defence policy through the lens of humanitarian security, the ending of world hunger (along with the other Sustainable Development Goals) is revealed as a far more appropriate use of ‘defence’ funds than weapons of mass destruction. But the choice to prioritise military spending doesn’t just harm those living in non-nuclear states far across the world. Food insecurity and malnourishment exist in the countries that choose to have nuclear weapons too. 

In China, over 150 million people are malnourished. In Russia, 21 million live in poverty. In Pakistan, 36% of the population lives in food insecurity; India’s rate is higher still, at up to 40%. Although accurate data is hard to gather, the North Korean regime has long stood accused by humanitarian groups of ‘starving its own people for a nuke’. In the UK, 16% live in food insecurity, and in England the rate of death by starvation has almost doubled since 2001. Since the beginning of lockdown, almost 4 million more people have sought help from food charities and food banks. In the United States, the number of households living in food insecurity has risen from 11% in 2018 to 22-38% as of the beginning of America’s lockdown in April 2020, with these numbers set to continue rising as economic and employment fragility reduce or remove incomes for vast swathes of the population. 

The decision to spend vast chunks of a nation’s GDP maintaining a nuclear weapons programme while food insecurity remains an issue for its own electorate is testament to just how far removed world leaders remain from the needs, and the wills, of their own people. I’m sure there isn’t a soul on earth who lies awake at night, restless from hunger, but comforted by the thought of nuclear weapons. The reframing of our conversations about security is imperative if we are to achieve the Sustainable Development Goals and tackle the very real threats that look all of us in the face. How many more must we leave defenceless against starvation – how many will be lost to climate change – how many will perish from preventable diseases – before we recognise the full cost of our greed for power? When it comes to true defence, nuclear weapons systems are weapons of mass distraction. So we fast. 


Why do we fast? Let’s talk about nuclear winter, or The Other Climate Change. 

One of the greatest lies we campaigners tell ourselves to sleep at night is that the prospect of global destruction by nuclear weapons is less likely now than, say, at the height of the Cold War, when the collective global stockpile of nuclear warheads sat at three times the number in circulation today. Fewer bombs equals less suffering, right? If only. Nuclear weapons don’t exist in a vacuum, unaffected by the dealings of mankind. Transport accidents, misfires, the threat of cyber-terrorism, the strategic positioning of nuclear firepower in ‘host’ nations, increasing nuclear capability, and rising military tensions between the growing number of countries with a nuclear stockpile all contribute to the culture of nuclear insecurity in which we find ourselves today. The Doomsday Clock, as we are so fond of preaching, sits closer to midnight than ever before. 

But it’s not just about the likelihood of a bomb being used. It’s about what’s going to happen if it does, and how utterly unprepared we are for what will follow. Throughout the anti-nuclear campaign, we have often centred our understanding of the humanitarian crisis wrought by nuclear weapons purely on the immediate impact of the bomb and the long-lasting medical consequences of direct or genetic exposure to radiation. We must take the time to consider that when it comes to the use of nuclear weapons, the horrors of the immediate impact are likely only the beginning of a far longer, sustained period of widespread suffering and harm. 

Nuclear famine theory first entered public discourse in the mid-1980s. Of course, anyone familiar with the aftermath of the Hiroshima and Nagasaki bombings would have been acutely aware of the practicalities of the issue. However, the fact that the bombings had coincided with ‘Operation Starvation’ – a six-month campaign by the US Navy and Army Air Forces to cut off Japanese food supplies – meant that analysis of the starvation caused solely by the bomb was impossible. An official count of the number of deaths in Japan caused by starvation in the initial post-war years was never conducted, though Japanese scholarship puts the number at six figures. 

In the mid-1980s, however, a group of more than 300 scientists from over 30 countries came together to create a report assessing the Environmental Impact of Nuclear War. Among their conclusions, published in 1985, they predicted that in the aftermath of a global nuclear conflict, if adequate measures were not taken to preserve food security, the billions of survivors would be plunged into “massive levels of malnutrition and starvation,” even in non-combatant countries, and, in dire situations, “only a small fraction of the current world population could expect to survive a few years”. In a similar publication by the National Academy of Sciences in 1986, it was stated that “the primary mechanism for human fatalities (in a nuclear war) would likely not be from blast effects, not from thermal radiation burns, and not from ionizing radiation, but, rather, from mass starvation”. 

This works on two levels. The first is the immediate impact of nuclear war on food supplies and distribution chains. Stores of foodstuffs, pesticides and fertilizers, agricultural equipment, and transport lines for distribution can all be destroyed in the blasts of a strategic nuclear attack. The uncontrollable fires that follow can devastate crops in the fields and foodstuff stockpiles in the cities. 

Contamination of the water and soil will disrupt agricultural practices and strip much of the land’s fertility. Radioactive dust particles carried by the wind can contaminate surfaces miles from their origin (let us not forget that Scottish sheep were still being tested for radioactive contamination from the Chernobyl disaster in 2012). Unlike the survivors of the 1945 atomic bombings, 21st century survivors will be all too aware of the dangers of eating contaminated food, and will thus be forced to make the impossible choice between starvation and possible irradiation. In the years that follow a nuclear war, the disruption of global distribution lines, the inevitable breakdown of the global economy and loss of incentive for international commerce, as well as the chaos of a society of displaced, sick and traumatised peoples, will only worsen the problem of global food security. 

The second is the issue of a nuclear winter, or, ‘The Other Climate Change’. A nuclear winter specifically refers to the cooling of the Earth’s surface temperature, triggered by an injection of soot (in particular, black carbon produced by a firestorm) into the stratosphere, which would then block natural sunlight from reaching the earth and create a rapid cooling effect, disrupting agricultural practices and causing widespread famine in the process. This effect is not theoretically limited to nuclear explosions – the eruption of the Tambora volcano in Indonesia in 1815, for example, caused a ‘year without a summer’ in the Northern Hemisphere in 1816, bringing widespread crop failure, famine, and economic collapse. However, these concerns were born out of the paranoia of the Cold War, amid fears of a global nuclear war in which black carbon would be released in unprecedented quantities, along with concerns about the amount of carbon already released by nuclear weapons tests – hence the name ‘nuclear winter’. Many attribute the end of the Cold War nuclear arms race to the growth of these concerns, which forced nations to frame the use of nuclear weapons in terms of the damage done to non-combatant countries, as well as their own populations. 

Since the end of the Cold War, despite the reduction of the global stockpile of nuclear warheads, the risk of a nuclear winter has become less of a superstition and more of a very tangible threat, developing in tandem with the decline of global grain stockpiles, the growing number of nuclear-armed states, the promotion of low-yield nuclear weapons by world leaders, and the increasing strain placed on our climate’s natural balance. While the prospect of an ‘all-out’ global nuclear war seems less likely, the risks posed by even a ‘small-scale’, regional conflict between two nuclear-armed states continue to grow. It’s hard to wrap our heads around the scale of nuclear weapons development since the second world war. For reference, the Tsar Bomba tested by the Soviets in 1961 held an explosive capacity equivalent to 3,800 Hiroshima-sized explosions. To demonstrate the fragility of the current situation, contemporary studies on nuclear winter focus on the premise of a regional conflict between two nations, such as India and Pakistan, each using fifty ‘small’ 15-kiloton warheads over urban populations. In this scenario, just 0.03% of the explosive power of the current global stockpile is unleashed – a destructive force equivalent to ‘only’ 100 Hiroshima-sized explosions. 

These studies have found that the heat generated as the soot absorbs shortwave radiation would loft the black carbon far higher than the initial injection from the firestorms, situating it high within the stratosphere, where it would remain for approximately six years – compared with roughly one year following the Tambora volcanic eruption. There would be a ‘global average surface cooling’ of 1.25 degrees Celsius, still 0.5 degrees cooler than average a decade later. A cooling of several degrees would occur over large areas of the Northern Hemisphere, with the changes in temperature most severe over land. 
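The scale comparisons above can be checked with rough arithmetic. A short Python sketch, using commonly cited approximate yields (Hiroshima ~13 kt, Tsar Bomba ~50 Mt) – assumptions for illustration, not figures from this piece:

```python
KT_PER_MT = 1000

hiroshima_kt = 13               # commonly cited ~13-15 kt yield
tsar_bomba_kt = 50 * KT_PER_MT  # ~50 Mt, as tested in 1961

ratio = tsar_bomba_kt / hiroshima_kt
print(f"Tsar Bomba ~ {ratio:,.0f} Hiroshima-sized explosions")  # ~3,846

# Regional-war scenario: 100 Hiroshima-sized detonations of 15 kt each
scenario_kt = 100 * 15

# If that is 0.03% of the global stockpile's explosive power,
# the implied total stockpile yield is:
implied_stockpile_mt = scenario_kt / 0.0003 / KT_PER_MT
print(f"Implied global stockpile: ~{implied_stockpile_mt:,.0f} Mt")  # ~5,000 Mt
```

The ~3,846 ratio is consistent with the 3,800 figure above, and the implied ~5,000 Mt total is broadly in line with published order-of-magnitude estimates of the combined yield of today’s arsenals.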

US corn and soybean harvests would remain at a loss of 10% for a decade. Huge climatic disruption would be caused in all regions, even those far removed from the sites of detonation, including a global average decrease in rainfall of 10%, with the reductions to the Asian monsoon season the most severe. The resulting obstruction to wheat, rice and maize production in China alone would create not only major food insecurity for China’s 1.3 billion people, but also a famine putting at risk the lives of almost a billion already malnourished people living in developing countries, as well as the food supplies of entire populations in countries highly reliant on food imports, which would likely be halted as panic and hoarding took hold on an international scale. This would also presumably lead to gross inflation in global food prices, making food inaccessible to the world’s poorest in every nation. World grain reserves in 2020 (let’s be kind and assume they’re unaffected by contamination and food is highly rationed) currently sit at roughly enough grain for four months. 

In the face of human starvation on an unparalleled scale, what more of a wake-up call is needed for the nuclear powers of today to disarm? There exists today an unprecedented transparency of information about nuclear weapons: how they came to exist, how they have been used, and what they are ultimately capable of. The censorship that prevailed in the decades after the second world war has been lifted, along with many secrets of the Cold War. Atmospheric weapons tests have been replaced with virtual simulations capable of calculating every aspect of the damage caused. Hibakusha voices have been amplified across the world. The Red Cross has stated that it would not be able to provide relief following a nuclear attack. Low-quality simulation technology is freely available online – I’ve been known to break it out in the classroom to let the kids see for themselves how different kiloton yields and blast zones work. How is it that, in the face of such freedom of knowledge, we have found ourselves in 2020 at only one hundred seconds to midnight? And, more importantly, how much closer will we allow ourselves to get?