From Hiroshima to Ukraine: Nuclear Taboo and Strategic Morality Under Pressure


Eight decades after the destruction of Hiroshima and Nagasaki, the question of whether the atomic bombings were legitimate remains unresolved and structurally central to contemporary strategic thought. The decision taken by President Harry S. Truman is most often framed as the outcome of extreme constraints: Allied exhaustion after years of total war, fierce Japanese resistance, and the prospect of a devastating ground invasion of the home islands. Yet this narrative collides with a deeper moral question: can the deliberate annihilation of cities be justified in the name of saving more lives, or does it represent an absolute rupture with the principles of just war and the protection of civilians? The near-total militarization of Japanese society blurred the line between combatants and non-combatants, anticipating dilemmas that now resurface in urbanized and asymmetric conflicts from Gaza to Ukraine. Hiroshima and Nagasaki also inaugurated a lasting nuclear taboo: an unwritten but powerful prohibition on the use of nuclear weapons, which has helped prevent their re-employment while paradoxically reinforcing their political centrality. Between strategic necessity and radical moral transgression, nuclear weapons have become the ultimate test of our capacity to link power to responsibility. Preventing the return of the unthinkable requires sustaining a demanding synthesis between military prudence, humanitarian law, and an ethic of responsibility that takes seriously the survival of political communities — and of humanity itself.

August 1945: A Foundational Turning Point

In October 2025, Donald Trump’s announcement that the United States would resume nuclear testing, after more than three decades of de facto moratorium, signaled that nuclear capability was again being staged as a visible marker of status and resolve, not merely as a silent deterrent. Presented as a conditional response to possible Russian and Chinese tests, this declaration reactivated competitive logics of technological demonstration and raised immediate alarms among international verification and monitoring bodies such as the Comprehensive Nuclear-Test-Ban Treaty Organization, which warned of the cumulative erosion of arms control norms and the symbolic crossing of a threshold that many states considered politically and morally closed. In this renewed nuclear climate, the memory of Hiroshima and Nagasaki returns, not as a distant episode but as a living fault line in the international order. On 6 and 9 August 1945, the U.S. bombings of Hiroshima and Nagasaki inaugurated the nuclear age by combining, in a single act, military innovation, total destruction, and political signalling. Since then, historians, strategists, and philosophers have debated whether Truman’s choice was the least catastrophic option available or an avoidable moral catastrophe rationalized after the fact. Detailed archival work and revisionist interpretations confront each other: some emphasize the anticipated casualties of a land invasion and the pressure to end the war swiftly; others stress diplomatic alternatives, the looming Soviet entry into the conflict, and the desire to shape the post-war balance of power in Asia (Walker 2016; Alperovitz 1996; Maddox 1995; Bernstein 1995). The result is not consensus, but a structured disagreement that already contains the central problem of nuclear morality: whether extreme necessity can ever legitimize deliberately targeting civilians.

This unresolved tension reverberates in the twenty-first century, where nuclear discourse has re-emerged at the heart of crises in Europe and Asia. The war in Ukraine has been accompanied by explicit and implicit nuclear threats from Moscow calibrated to deter NATO involvement and freeze territorial gains; debates in Washington, Moscow, Beijing, Paris, London, and New Delhi revisit the vocabulary of “credibility,” “red lines,” and “escalation dominance,” as analyzed in contemporary deterrence and risk studies such as those of leading strategic institutes. Against this background, Nina Tannenwald’s concept of the “nuclear taboo” is crucial: she shows how, since 1945, a strong normative inhibition has developed against nuclear use, embedded in state practice, legal argument, and public discourse, and interacting with—but not reducible to—deterrence logics (Tannenwald 2007; Tannenwald 1999). Hiroshima and Nagasaki thus function as a strategic and moral mirror: they oblige today’s decision-makers to confront the link between historical memory, legal obligations, and the practical conditions under which the nuclear threshold might, or might not, hold.

Truman at an Impasse: Alternatives and Calculations – Military Urgency and the Limits of Strategic Rationality

By the summer of 1945, the Pacific War had become a prolonged catastrophe for Japan and a mounting burden for the United States, yet Tokyo showed no clear sign of unconditional surrender. Massive incendiary raids had devastated major cities; the naval blockade strangled imports of food and fuel; industrial capacity eroded; and yet the imperial leadership clung to the hope of extracting better terms or exploiting divisions among the Allies. The Japanese high command prepared Operation Ketsu-Go, a last-ditch defense of the home islands founded on near-total mobilization, suicide tactics, and the expectation that inflicting intolerable casualties on invading U.S. forces might shift American political calculations (Frank 1999). In Washington, planners faced a strategic impasse: defeat Japan decisively, but at what cost and by which means, in a context where domestic expectations demanded rapid victory and minimal additional bloodshed after V-E Day.

The first option, Operation Downfall, envisaged a two-stage invasion: Olympic against Kyūshū, followed by Coronet against Honshū and the Tokyo Plain. Intelligence assessments, informed by bitter experience at Iwo Jima and Okinawa, projected levels of resistance ranging from ferocious to apocalyptic. Some estimates anticipated hundreds of thousands of U.S. casualties; others, assuming mass mobilization and kamikaze tactics, reached the scale of one million (Giangreco 2009). Coupled with potential Japanese military and civilian deaths on an even greater scale, these projections placed Truman in front of a foreseeable slaughter that would be hard to justify politically to a war-weary American public. The specter of writing letters to hundreds of thousands of new bereaved families weighed heavily, reinforcing receptiveness to any option that promised to end the war quickly.

The second option, intensified blockade and continued conventional bombing, rested on the assumption that Japan, already economically prostrated, would eventually capitulate under the cumulative effects of famine, infrastructural collapse, and psychological exhaustion. In practice, this meant accepting months, perhaps more than a year, of additional suffering, with the likely death of millions of civilians from hunger and disease while the imperial regime clung to power (Maddox 1995). For U.S. leaders, this path looked less dramatic militarily but morally troubling and politically uncertain; it postponed peace and risked eroding support at home and unity among the Allies.

A third option, championed by some Manhattan Project scientists and diplomats, was a technical demonstration: an atomic explosion on an uninhabited site, under international observation, preceded by a clear ultimatum to Japan (Alperovitz 1996). The idea promised to preserve some moral restraint while showcasing overwhelming technological superiority. Yet the fear of technical failure or reduced psychological impact, combined with doubts about Japanese leaders’ willingness to surrender without concrete devastation, led to its rejection. Truman and his advisers judged that a failed demonstration would embolden the enemy and weaken U.S. leverage vis-à-vis both Tokyo and Moscow.

Beyond these calculated scenarios, the atomic bomb also appeared as a politico-strategic instrument for managing the emerging post-war order. Historians such as Barton J. Bernstein and Tsuyoshi Hasegawa emphasize that U.S. leaders saw in the bomb a way to end the war before the Soviet Union could significantly shape the settlement in East Asia, and to signal a new hierarchy of power (Bernstein 1995; Hasegawa 2005). In this reading, Hiroshima and Nagasaki were not purely dictated by battlefield necessity but also by diplomatic anticipation: the first act of nuclear statecraft as much as the last act of World War II. The resulting decision thus sits at the crossroads of operational urgency, domestic constraints, alliance management, and systemic signaling—an intersection in which rational calculations coexist uneasily with blind spots and moral evasions (Freedman, 2003).

The Ethical Dilemma: Strategic Effectiveness or Crime?

The standard defense of the bombings rests on a consequentialist claim: by forcing a rapid Japanese surrender, the atomic strikes saved lives overall—American and Japanese—by avoiding a prolonged blockade or full-scale invasion. This narrative presents Hiroshima and Nagasaki as a tragic but necessary shortcut: an extreme evil that prevented an even greater slaughter (Walker 2016). In this framework, Truman’s responsibility is reframed as the burden of “dirty hands” in Michael Walzer’s sense: a leader who authorizes morally abhorrent acts to avert outcomes judged worse, accepting guilt as the price of political responsibility (Walzer 1977). A deontological critique, articulated forcefully by G. E. M. Anscombe, rejects this calculus. For her, the direct and intentional killing of civilians is intrinsically wrong, regardless of anticipated benefits, and the scale or speed of ending the war cannot convert massacre into moral necessity (Anscombe 1956). On this view, the bombings were not a grim lesser evil but a categorical crime: state terrorism by other means. The just war tradition and modern international humanitarian law (IHL), with their emphasis on discrimination and non-combatant immunity, lend strong support to this position. The central tension emerges clearly: if there are acts that must never be committed, then the atomic bombings fall on the forbidden side of the line.

The debate becomes sharper when alternatives are reconsidered. Research by Tsuyoshi Hasegawa and others argues that Japan’s strategic position in summer 1945 was already untenable and that the Soviet declaration of war might, combined with continued conventional pressure, have compelled surrender without nuclear attacks (Hasegawa 2005). Revisionist interpretations stress the role of signaling to Moscow and the desire to shape post-war geopolitics, suggesting that Hiroshima and Nagasaki were, at least in part, acts of demonstration rather than strict last-resort necessities. In response, defenders of the decision highlight uncertainty: leaders could not rely on hypothetical future developments when faced with concrete ongoing casualties. This unresolved clash between outcome-based reasoning and principle-based prohibitions continues to structure contemporary nuclear ethics. If the bombings are justified, the threshold for legitimate mass killing is dangerously elastic; if they are condemned as criminal, the entire edifice of nuclear deterrence—which relies on the conditional threat of repeating them—rests on an ultimatum that no one can ever morally carry out. Nuclear strategy thus embeds a moral paradox: it claims to preserve peace by threatening acts that, if actually performed, would undermine the very values it purports to defend (Walzer 1977).

When Civilians Become Targets: Blurred Lines and Dangerous Rationalizations

A central difficulty in judging Hiroshima and Nagasaki lies in the blurred boundary between civilian and combatant in total war. By 1945, Japanese society had been deeply mobilized: industrial workers produced military goods; students and housewives were enrolled in civil defense; the Volunteer Fighting Corps prepared men and women for last-ditch resistance; propaganda framed sacrifice as a duty for all (Gray 2011). Proponents of the bombings have argued that such militarization turned cities into integrated war machines, rendering the distinction between civilian and combatant largely obsolete. In this perspective, Hiroshima and Nagasaki are portrayed as nodes of military production and command rather than innocent urban spaces.

International humanitarian law, however, insists that even in total war, civilian immunity cannot be dissolved into vague notions of “collective responsibility.” Civilians are protected unless and for such time as they take direct part in hostilities, and attacks must not be indiscriminate, nor may they cause damage clearly excessive relative to the concrete military advantage anticipated (ICRC 2005). Treating entire populations as legitimate targets because they contribute economically, symbolically, or morally to the war effort empties the principle of distinction of its substance. It retrofits legality to destruction, rather than constraining force in advance. Philosophers such as Michael Walzer and Jeff McMahan warn against the slide from targeted military necessity to the normalization of terror bombing (McMahan 2009). Once we accept that “everyone is involved,” the door opens to devastating attacks on cities as such; Hiroshima and Nagasaki then appear less as anomalies than as logical endpoints of a way of thinking already visible in Dresden, Tokyo, and Hamburg. To read them only as the product of nuclear novelty is to miss this continuity: the atomic bomb magnifies, rather than creates, the problem of civilians turned into metrics of psychological pressure.

These ambiguities echo ominously in contemporary conflicts. In Ukraine, Syria, and Gaza, armed actors routinely invoke the enemy’s use of human shields, the dual-use nature of infrastructure, or the alleged collective guilt of populations to justify strikes in densely populated areas. The International Committee of the Red Cross repeatedly recalls that such arguments do not erase legal obligations: claims about shields or militias cannot authorize indiscriminate or disproportionate attacks. Hiroshima and Nagasaki should thus be read as a warning: once the conceptual firewall protecting civilians is breached, technology and doctrine will inevitably push toward ever more destructive thresholds, including nuclear ones. In the nuclear era, this challenge merges with Hans Jonas’ ethics of responsibility, which demands that we assess actions by their long-term consequences for the conditions of human survival. Mass destruction in the name of security undermines the very future that political communities claim to safeguard. Recognizing civilians as bearers of that future — not as expendable variables in a cost–benefit equation — is essential if the taboo on nuclear use is to rest on more than fear alone.

From Trauma to Norm: The Construction and Fragility of the Nuclear Taboo

Out of the ruins of 1945 emerged not only nuclear arsenals but also a slow, uneven process of normative insulation: an understanding that nuclear weapons, though central to deterrence, should not be used in combat again. Nina Tannenwald’s work documents how this “nuclear taboo” formed through a combination of public horror, diplomatic rhetoric, legal argument, and repeated non-use in Korea, Vietnam, the Middle East, and other crises where nuclear options were considered but rejected. The taboo is not mere sentiment; it is a social fact shaped by leaders invoking nuclear restraint, publics expecting it, and institutions codifying it indirectly through arms control and non-proliferation regimes. During the Cold War, nuclear weapons moved to the heart of strategic doctrine while receding from the repertoire of usable weapons. Theories of mutual assured destruction, flexible response, and escalation control all revolved around threats that were never executed. Scholars like Lawrence Freedman and Scott D. Sagan have shown how organizational routines, fear of uncontrollable escalation, alliance dynamics, and domestic politics reinforced a practical understanding: any deliberate nuclear use would risk catastrophic retaliation and political suicide (Freedman 2003; Sagan and Waltz 2003). In practice, the bomb became sacralized as an ultimate symbol of sovereignty and security, whose actual detonation in war would signify systemic failure.

The war in Ukraine tests this architecture from within. Russian officials have repeatedly brandished nuclear rhetoric to deter NATO, link nuclear protection to annexed territories, and signal readiness to escalate if the regime’s survival is threatened. These threats normalize discussion of nuclear options in public discourse, blurring the once sharp line between the unthinkable and the conceivable. At the same time, the absence of nuclear use — even under severe strain — suggests that the taboo and deterrence still exert constraining force: key actors appear to calculate that crossing the threshold would trigger diplomatic isolation, potential counter-escalation, and long-term strategic disaster. Yet the erosion of arms control agreements, modernization of arsenals, deployment of dual-capable delivery systems, and development of so-called “tactical” nuclear concepts challenge the robustness of the taboo. Andrew Futter (2021) and others warn that as nuclear weapons are rhetorically and technically reintegrated into planning for limited scenarios, the perceived gap between conventional and nuclear use narrows. Without sustained political reaffirmation and institutional reinforcement, the taboo risks being hollowed out into a flexible talking point rather than a binding constraint. Hiroshima and Nagasaki then shift from being a foundational warning to a contested precedent available for dangerous reinterpretations.

Contemporary Strategic Morality: Prudence, Humanity, Responsibility

The nuclear age forces strategy and ethics into permanent confrontation. Hiroshima and Nagasaki demonstrated that a single political decision could annihilate a city in seconds and alter the course of history; subsequent simulations show that a larger exchange could destabilize the planetary system on which life depends. Lawrence Freedman argues that this transformation makes nuclear strategy qualitatively different from previous forms of statecraft: the margin for error is dramatically reduced, and the moral stakes are radically heightened (Freedman 2003).

A first implication is that the classical language of “military necessity” must be re-evaluated. In industrialized, urbanized wars—from Aleppo to Mariupol—the temptation persists to justify massive firepower by the promise of shortening combat and reducing one’s own casualties. But when essential infrastructure, medical systems, water supplies, and food networks are destroyed, the suffering unleashed extends far beyond immediate tactical objectives. Principles of distinction, proportionality, and precaution in IHL are not ornamental constraints; they are minimal conditions for preserving a livable political and moral order, as emphasized by the International Committee of the Red Cross. Nuclear weapons, by design, make compliance with these principles nearly impossible.

A second implication concerns deterrence itself. Nuclear doctrines are built on conditional threats: “if you attack in certain ways, we may respond with nuclear force.” Scott D. Sagan’s work on why states build nuclear weapons shows that motives include not only security, but also prestige, bureaucracy, and domestic politics. These factors introduce irrationalities and path dependencies into systems assumed to be coldly rational. In such an environment, maintaining the nuclear taboo cannot rely solely on abstract mutual vulnerability; it must be anchored in robust institutions, clear communication, responsible political leadership, and societal understanding of the catastrophic stakes.

Finally, a practicable nuclear ethics must, as Walzer suggests, remain addressed to real political actors under pressure, not to idealized moral saints (Walzer 1977). This means articulating standards that recognize fear, miscalculation, and domestic constraints, yet firmly exclude deliberate mass slaughter as an acceptable option. Jonas’s imperative of responsibility pushes this further: decision-makers must internalize duties not only toward current citizens, but toward future generations whose world could be destroyed by a few hours of escalation (Jonas 1984). In this light, the nuclear taboo is best understood not as an abstract moral luxury, but as a rational and ethical firewall that preserves the very possibility of politics over time.

New Nuclear War Simulations: Making the Unthinkable Visible

Since 2022, updated modeling efforts have made the implications of nuclear war more concrete than ever. The Program on Science & Global Security at Princeton University has simulated a NATO–Russia nuclear exchange (“Plan A”), while teams associated with Rutgers University, Penn State University, and the UN Office for Disarmament Affairs have examined climatic and agricultural impacts of various scenarios; the RAND Corporation has integrated these findings into systemic risk assessments. Taken together, these studies do not merely dramatize; they quantify the pathways from limited exchanges to global catastrophe, challenging any residual belief in controlled nuclear war.

These converging analyses illuminate four key points. First, no plausible nuclear conflict remains local: atmospheric circulation, trade interdependence, and financial linkages export consequences worldwide. Second, even “limited” exchanges involving a fraction of existing arsenals can trigger drastic cooling, shortened growing seasons, and dramatic drops in staple crop yields, undermining food security for billions. Third, complex infrastructures—energy grids, transport corridors, digital networks—are so tightly coupled that their destruction in key regions can induce cascading failures far from any blast zone. Finally, the economic shock of a major nuclear exchange, amplified by panic and institutional breakdown, would likely exceed anything in recorded history, making talk of “winning” misleading in any conventional sense. By quantifying famine risks, climate shifts, and systemic breakdown, these models transform the moral intuition of 1945 into empirically grounded strategic knowledge. They indicate that the cost of nuclear use is not merely unacceptable by current ethical standards, but incompatible with the continued functioning of global society. The nuclear taboo thus appears as an analytically justified constraint: a rational response to quantified catastrophic risk, not only a product of historical guilt. If deterrence doctrine does not integrate these insights—treating large-scale nuclear use as strategically self-defeating—then it risks being detached from the material realities it is supposed to manage.

Conclusion

Truman’s choice in August 1945 condensed into a few days the core dilemmas of the nuclear age: the tension between ending war and unleashing unprecedented horror, between protecting one’s own soldiers and annihilating enemy civilians, between demonstrating power and crossing a threshold that cannot be uncrossed. Hiroshima and Nagasaki close the Second World War, but they open a political condition in which the survival of states, societies, and ecosystems can hinge on crisis decisions taken under uncertainty and pressure. Any assessment of their legitimacy inevitably implicates today’s guardians of nuclear arsenals.

Since 1945, the non-use of nuclear weapons in combat has depended on a precarious alignment of deterrence, taboo, legal norms, and political prudence. The war in Ukraine, the weakening of arms control, and renewed talk of testing show how quickly this alignment can fray. Nuclear weapons remain embedded in doctrines, budgets, and alliance commitments; they are invoked as ultimate guarantees while their actual employment would constitute a civilizational betrayal. The new generation of simulations confirms quantitatively what Hiroshima and Nagasaki already suggested qualitatively: nuclear war, even on a limited scale, is tantamount to systemic self-destruction. Preserving the nuclear taboo is therefore not only a matter of honoring victims or upholding an abstract moral ideal; it is the practical condition for the continuity of international society and the integrity of the biosphere. This requires political leaders to resist the normalization of nuclear threats as routine signalling, to rebuild arms control and crisis-management mechanisms, and to embed scientific findings on nuclear impacts into strategic planning and public debate. It also requires acknowledging, without evasion, that some instruments of power are incompatible with any defensible vision of humane order if ever actually used.

From Hiroshima to Ukraine, the “ultimate frontier” is less a technological barrier than a moral and political one: the capacity of states to renounce, in practice and not only in rhetoric, the translation of nuclear possession into nuclear employment. To maintain that frontier is to assert that there are forms of victory that destroy what politics exists to protect, and that these must remain beyond the limits of acceptable choice. The task is not to forget 1945, but to read it more rigorously—armed now with historical insight, legal principles, and scientific evidence—and to ensure that the unthinkable remains both imagined clearly and refused absolutely.


Table 1: Consequences of Recent Nuclear War Simulations (2022–2025)


References

Alperovitz, Gar. The Decision to Use the Atomic Bomb and the Architecture of an American Myth. New York: Vintage, 1996.

Anscombe, G. E. M. “Mr Truman’s Degree.” Self-published pamphlet, 1956; reprinted in Collected Philosophical Papers, vol. III: Ethics, Religion and Politics.

Bernstein, Barton J. “Understanding the Atomic Bomb and the Japanese Surrender: Missed Opportunities, Little-Known Near Disasters, and Modern Memory.” Diplomatic History 19, no. 2 (1995): 227–273.

Frank, Richard B. Downfall: The End of the Imperial Japanese Empire. New York: Penguin Books, 1999.

Freedman, Lawrence. The Evolution of Nuclear Strategy. 3rd ed. Basingstoke: Palgrave Macmillan, 2003.

Giangreco, D. M. Hell to Pay: Operation Downfall and the Invasion of Japan, 1945–1947. Annapolis, MD: Naval Institute Press, 2009.

Gray, Colin S. War, Peace and International Relations: An Introduction to Strategic History. 2nd ed. London: Routledge, 2011.

Hasegawa, Tsuyoshi. Racing the Enemy: Stalin, Truman, and the Surrender of Japan. Cambridge, MA: Harvard University Press, 2005.

International Committee of the Red Cross (ICRC). Customary International Humanitarian Law. Volume I: Rules. Cambridge: Cambridge University Press for the ICRC, 2005; see also Protocol Additional to the Geneva Conventions of 12 August 1949 (Protocol I).

International Committee of the Red Cross (ICRC). War in Cities: Preventing and Addressing the Humanitarian Consequences for Civilians; see also the ICRC blog post “Proximate ‘human shields’ and the challenge for humanitarian law and action.”

Jonas, Hans. The Imperative of Responsibility: In Search of an Ethics for the Technological Age. Chicago: University of Chicago Press, 1984.

Maddox, Robert James. Weapons for Victory: The Hiroshima Decision Fifty Years Later. Columbia: University of Missouri Press, 1995 (paperback reissue, 2004).

McMahan, Jeff. Killing in War. Oxford: Oxford University Press, 2009.

Tannenwald, Nina. “The Nuclear Taboo: The United States and the Normative Basis of Nuclear Non-Use.” International Organization 53, no. 3 (1999): 433–468.

Tannenwald, Nina. The Nuclear Taboo: The United States and the Non-Use of Nuclear Weapons since 1945. Cambridge: Cambridge University Press, 2007.

Walker, J. Samuel. Prompt and Utter Destruction: Truman and the Use of Atomic Bombs Against Japan. 3rd ed. Chapel Hill: University of North Carolina Press, 2016.

Walzer, Michael. Just and Unjust Wars: A Moral Argument with Historical Illustrations. New York: Basic Books, 1977.
