Fusion Reactor: $65 Billion and Still No Electricity

Aerial view of the ITER site in France. (Photo: Oak Ridge National Laboratory/Wikimedia Commons)

As World Nuclear News defines it, the international fusion project known as ITER exists “to prove the feasibility of fusion as a large-scale and carbon-free source of energy. The goal of ITER is to operate at 500 MW (for at least 400 seconds continuously) with 50 MW of plasma heating power input. It appears that an additional 300 MWe of electricity input may be required in operation. No electricity will be generated at ITER.”

Four hundred seconds. No electricity.

ITER, which stands for International Thermonuclear Experimental Reactor, is a collaboration among 35 countries that was first conceived in 1985 and formally agreed to on November 21, 2006. Construction began in 2010 at the Cadarache nuclear complex in southern France.

The seven official founding members of ITER are China, the European Union (then including the UK, which remains in the project), India, Japan, Korea, Russia and the United States.

By the time ITER is actually operational — if it ever is — it will have gobbled up billions of dollars. Cost estimates now range wildly, from the official ITER figure of $19-23 billion (likely a gross underestimate) to the U.S. Department of Energy’s (DOE) current estimate of $65 billion.

The price tag when the project began was around $6.3 billion.

If the DOE numbers are right, then those 400 seconds will cost roughly $162.5 million a second ($65 billion divided by 400 seconds). Just to prove that fusion power is possible. Without actually delivering anything practical at all to anyone.
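For readers who want to check the back-of-the-envelope arithmetic, the short sketch below simply divides the cost estimates quoted in this article by ITER’s 400-second target run. The figures are the ones cited above, not independent data.

```python
# Back-of-the-envelope cost-per-second arithmetic using the figures cited in this article.
ITER_ESTIMATE_LOW = 19e9    # official ITER range, low end (USD)
ITER_ESTIMATE_HIGH = 23e9   # official ITER range, high end (USD)
DOE_ESTIMATE = 65e9         # U.S. Department of Energy estimate (USD)
TARGET_RUN_SECONDS = 400    # ITER's goal: 500 MW sustained for at least 400 seconds

for label, cost in [("ITER estimate (low)", ITER_ESTIMATE_LOW),
                    ("ITER estimate (high)", ITER_ESTIMATE_HIGH),
                    ("DOE estimate", DOE_ESTIMATE)]:
    per_second = cost / TARGET_RUN_SECONDS
    print(f"{label}: ${per_second / 1e6:.1f} million per second")
# The DOE figure works out to $162.5 million per second.
```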

Whatever the costs, they are too high to be remotely justifiable, given the end product and the far more compelling and essential competing needs of the world right now.

This first appeared in Beyond Nuclear International.

Linda Pentz Gunter is the editor and curator of BeyondNuclearInternational.org and the international specialist at Beyond Nuclear.