Some history might be worth going into here.
The AP1000 design was an evolution of an earlier design called the AP600. The AP600 conceptual design apparently didn't find any customers because the economics of it were unappealing compared to larger reactors. In a nutshell, if you're going to build a huge containment building etc etc etc, it doesn't cost (in theory) that much more to go even bigger and get more power out of it.
So why SMRs?
It turned out that because you were building so few AP1000s, and building them with different workforces each time, there was very little learning going on.
Some bright spark, seeing what's gone on in other industries, had the idea that if you could build a nuclear reactor in a factory, increase the number of units built, and minimise the amount of work done on site, you'd start getting learning rates and economies of scale that easily outweigh the lower inherent efficiency of a smaller reactor.
At 300MW, this reactor seems to be too big for "back of a few trucks" construction, so you still have the bespoke construction project issues, yet it's small enough that the construction cost of 4 of them is likely to be much higher than that of one AP1000 (which actually puts out about 1100 MW).
The best way to solve this problem is to build a huge shipyard-based reactor-making gigafactory that makes floating gigawatt scale reactors that are floated to location. You can operate them offshore or on land.
This was actually attempted in the 1970s by Offshore Power Systems (a joint venture between Westinghouse and Newport News). They bought and installed the world's largest gantry crane on Blount Island in Jacksonville, Florida and got a license from the NRC to build the first 8 of them. Sadly no one bought one and they shut it down. Crazy story. Super interesting from a rapid decarbonization perspective.
I believe this is how Rosatom makes their VVER pressure vessels, their factory being called Atommash[1]. They are currently delivering a number of them.
That is incredible! If this project actually went into production, we would be living in a very different world right now...
People always say this, but if it was so great wouldn’t someone have bought some? The money never seems to work out.
There were at least some confounding factors beyond how good of an idea this was. The primary expected electricity customer was oil refineries off the New Jersey coast. The Atlantic Generating Station was to be the first of these power plants. But after the 1973 oil shocks, oil refinery energy demand fell more than expected. By the time things were looking up again, Three Mile Island happened in 1979 and the general sentiment was turning against nuclear.
The primary thing that happened in the 1970s to ruin the US market for nuclear was PURPA, the Public Utility Regulatory Policies Act. This act began the opening of US electricity markets to non-utility providers. It turned out there was very large potential for this: cogeneration, and just low overhead fossil generation, especially as natural gas became seen as more abundant and combustion turbines improved.
Those oil refineries you mention became prime targets for cogeneration. Instead of burning fuel to provide process heat (for distillation, for example), a refinery could instead add a combustion turbine topping cycle ahead of that and produce electric power at near 100% marginal efficiency. This reflects the large second law losses involved in burning a fuel to produce relatively low temperature process heat. Basically any industrial process that used low or moderate temperature process heat could become a source of electric power via cogeneration after PURPA was enacted.
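As a rough back-of-the-envelope sketch of that "near 100% marginal efficiency" claim (Python; the efficiencies are illustrative assumptions, not figures from the comment):

    # Compare fuel use for a fixed process-heat demand: heat-only boiler vs. a
    # combustion-turbine topping cycle whose recovered exhaust supplies the heat.
    Q_heat = 100.0        # MW of process heat demand (assumed)
    eta_boiler = 0.90     # heat-only boiler efficiency (assumed)
    eta_turbine = 0.35    # simple-cycle turbine electrical efficiency (assumed)
    eta_recovery = 0.85   # fraction of exhaust heat recovered for the process (assumed)

    fuel_boiler = Q_heat / eta_boiler                          # heat-only case
    fuel_cogen = Q_heat / ((1 - eta_turbine) * eta_recovery)   # turbine sized so exhaust covers Q_heat
    electricity = fuel_cogen * eta_turbine

    # Marginal efficiency: extra fuel burned per unit of electricity gained
    marginal_eff = electricity / (fuel_cogen - fuel_boiler)
    print(f"electricity: {electricity:.0f} MW, marginal efficiency ~ {marginal_eff:.0%}")

With these assumed numbers the marginal efficiency comes out around 90%, which is the sense in which the added power is "near 100%" efficient.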
Interesting comment! But wouldn't swapping in a nuclear reactor allow for the same cogeneration setup? So it would still be about the cost of the fuel they're burning versus the cost of managing a nuclear reactor.
Combustion turbines operate at a much higher temperature (as much as 1300 C) than a nuclear reactor, especially a LWR (about 325 C), so the exhaust temperature can be much higher. Also, a turbine is much cheaper: a simple cycle combustion turbine is maybe $500/kW(e), vs. perhaps 20x that for a NPP.
The reason a combustion turbine can be so hot is that the heat is produced in the working fluid itself, by a chemical reaction. There is no need to conduct that heat from a solid material at higher temperature. All the solid parts that might be heated by the gas can be cooled by actively flowing a cooler fluid through them (air, steam).
But if the main thing they want is low temperature process heat, then the specific method of generating electricity shouldn't actually matter? NPPs also produce electricity plus lower temperature 'waste' heat.
Unless the waste heat put out by the NPP is too low temperature to be used for process heat? Or the real advantage is that the refineries ramped up their electricity generation (turning waste fractions into electricity), with the process heat becoming a secondary concern?
Direct use of nuclear heat had nothing to do with PURPA. PURPA wouldn't have made it more likely to be considered, since it wasn't putting power onto the grid. If nuclear wasn't being used for industrial heat before that, it wouldn't suddenly have become reasonable afterwards. Instead, PURPA would have made that less likely, since it would have improved the economics of the competing technology of burning fuels for process heat (since a valuable and saleable additional product, power at very high marginal efficiency, could now be produced and sold to the grid.)
There is the question of why nuclear had not and has not penetrated the market for industrial heat. I suspect it has something to do with nuclear not scaling down well, not being suited to industrial applications that may require more flexibility, and concerns about nuclear requirements bleeding over into design and operation of otherwise non-nuclear industrial processes.
Where will process heat come from in the future? Possibly heat pumps. The thermal energy from these (or from resistive heaters, at high temperature) is much cheaper to store, per unit of energy, than electrical energy in batteries, so this would provide a large, cheap source of dispatchable demand to ease renewable integration.
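A minimal sketch of that cost argument for heat-pump-charged thermal storage as dispatchable demand (Python; every number here is an assumption for illustration):

    thermal_tank_cost = 30.0   # $/kWh_th for an insulated water or salt tank (assumed)
    battery_cost = 250.0       # $/kWh_e for grid-scale Li-ion (assumed)
    heat_pump_cop = 3.0        # kWh_th delivered per kWh_e consumed (assumed)

    # To time-shift 1 kWh_e of heat-pump consumption you must park COP kWh of heat:
    thermal_route = heat_pump_cop * thermal_tank_cost
    print(f"thermal route: ~${thermal_route:.0f} per shiftable kWh_e of demand")
    print(f"battery route: ~${battery_cost:.0f} per kWh_e stored")

Under those assumptions the thermal route is several times cheaper per unit of shiftable demand, which is the point being made.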
Thanks, that's excellent context.
I doubt it was the money that killed this. Probably regulatory and environmental concerns. Imagine the headlines and phrasing of news articles.
Instead we got this crap that is a floating barge of environmental ruin:
https://karpowership.com/en/global-presence
Notice how it's only been installed in 3rd world countries.
You might think that, but OPS went through an astounding amount of regulatory review. They got an actual license from the NRC to build the first 8.
This article goes into detail if you have access: https://www.newyorker.com/magazine/1975/05/12/the-atlantic-g...
Wow, they really bury the lede on what fuel they run on. They say "dual fuel" but don't say which those fuels are; external searching suggests natural gas or low-sulfur heavy fuel oil. Not sulfur-free, so I'm sure these are great for emissions.
Also: Renewables have downsides, but they are vastly cheaper and more proven.
There is no reason to assume that the downsides of various other techs are in any way more manageable. (And ironically some of the solutions proposed for the downsides of nuclear are also what you need for renewables, i.e. storage to maximize use of expensive infrastructure with low opex.)
> And ironically some of the solutions proposed for the downsides of nuclear are also what you need for renewables, i.e. storage to maximize use of expensive infrastructure with low opex.
It's exhausting to see this claim surface so often, despite being debunked so frequently.
Nuclear doesn't require storage.
Some people claim that economies could be made with storage because then you wouldn't need to build production equivalent to 100% of your peak consumption.
The dishonest discourse happens when renewables proponents claim that this small potential optimization means nuclear needs storage just as much as renewables do. Actually, losing your power source is not equivalent to shaving a few percent off the budget.
This idea also completely ignores the fact that power grids don't optimize for efficiency, but for stability. We want redundant production, and the ability to move as many levers as possible on the grid, in order to balance out issues. The ability to ramp up a power plant isn't just useful to cover peak load, it also helps mitigate e.g. losing a power line or a power transformer.
Assume a capacity factor of 50%, then, to handle peak loads? Now the nuclear costs $240-440/MWh. Those are ridiculous costs, worse than Europe during last winter's Russian-war-induced gas crisis.
https://www.lazard.com/research-insights/levelized-cost-of-e...
It's hard to give numbers for 100% nuclear since no one does that, but yes, France has load factors between 60 and 80% depending on the reactor type.
https://www.iaea.org/fr/newscenter/news/contribution-du-nucl...
The report you quote already cites load factor around 90% for new builds, a 50% load factor wouldn't double the costs. That being said, 90% load factor for a grid with a high share of nuclear is probably extremely optimistic.
The reason such high capacity factors can be reached in countries that have a low nuclear share is that nuclear has an extremely low marginal cost, and is always picked over plants that require fuel.
True, the fixed costs are $20-35/MWh. Deduct that from the figure before doubling. Not that it matters.
Running a pure nuclear grid will never get 90% capacity factor. That relies on only taking the base load and offloading the actual hard part to other generators. Exactly what nuclear proponents like to say will be impossible for renewables.
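A small sketch of the capacity-factor arithmetic being argued over in this sub-thread (Python; the reference LCOE range and the per-MWh variable cost are assumptions in the spirit of the Lazard figures cited above):

    def lcoe_at_cf(lcoe_ref, cf_ref, cf_new, variable_cost):
        # Capital and other fixed costs per MWh scale with 1/CF;
        # fuel and variable O&M per MWh stay roughly constant.
        return variable_cost + (lcoe_ref - variable_cost) * (cf_ref / cf_new)

    for lcoe_ref in (140, 220):   # $/MWh at ~90% CF (assumed reference range)
        new = lcoe_at_cf(lcoe_ref, cf_ref=0.90, cf_new=0.50, variable_cost=30)
        print(f"${lcoe_ref}/MWh at 90% CF -> ${new:.0f}/MWh at 50% CF")

That gives roughly $230-370/MWh at 50% CF, in the same ballpark as the $240-440 figure above once the $20-35/MWh mentioned earlier is handled as discussed.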
> That relies on only taking the base load and offloading the actual hard part to other generators. Exactly what nuclear proponents like to say will be impossible for renewables.
It's certainly not impossible, since that's what we do currently. The reason we would like to stop doing it in the future is co2 emissions, not a feasibility barrier.
Now my point was that we don't have experience of massive storage. And in the absence of storage, nuclear costs go up, while renewables output goes down. One is painful but manageable, we don't have a plan to handle the other.
Personally, I wish we spent a bit less subsidies on renewable production, and a bit more on grid-sized storage, so we can have an answer on costs. Without this, discussions on storage-enabled grids are conjectures at best.
Once you electrify your heat demand seasonal variability will massively increase, driving down load factors dramatically.
Today electricity accounts for 20% of energy use, European estimates see that going up to 50% by 2050. A large chunk of that increase is heating.
Obviously other countries and climates will not see the same patterns here.
For the benefit of the bystander: this person actually has no idea what energy system planning looks like in real life. Economic optimization is absolutely at the core of it. Energy System Models (ESMs) are economic optimization models. Depending on country and climate, the 50% figure from the sibling comment is absolutely realistic.
Also, these details don't really matter: storage is a viable solution to the problem that renewables are not dispatchable. The only argument against storage could be economic: that the inefficiencies and cost of storage render nuclear cheaper after all.
Not sure whether you understand the difference between the prospective world of public analysts and actually running a power grid. I know there can be a divide between the two, and some self-aggrandizement on each side. Picking reliability over efficiency is the sound economic choice.
Please note that my comment is not about the viability of storage for renewables.
I semi-regularly talk to people that plan the European power grid, yes. At the core of extension and development studies is still an optimal power flow, which is an economic optimization. Stability (which has many meanings here) is usually studied as a second step.
Security of Supply is yet another issue.
Anyway, if you can use storage to lower your nuclear costs by 5% overall, would it not be used? If your comment is not about the viability of large-scale storage, then what is it about? Storage is dispatchable, and an economically optimal 100% renewables grid will have a peak capacity far above actual peak demand, meaning it's typically curtailed (or you run electrolysers, which can in turn be made dispatchable and grid-forming as loads), which again means you have plenty of ability to react.
Resilience research tends to favor decentralized solutions as well. Look at how Puerto Rico is being rebuilt.
> I semi-regularly talk to people that plan the European power grid, yes.
Hard to know which group of people you talk to. This kind of work happens at many different levels, for many different reasons. TSO investment decisions are not made by the same people as power production investment decisions. There is some coordination, but no centralized decision-making.
To the best of my knowledge of at least one major European TSO, investment decisions are not taken at the European level. And while there is strong European coordination in grid management, it is still cooperation and not a unified decision process. Any Europe-level investment advice would have to be evaluated and selected by the TSO to become effective. Also, the further you go from the field, the more likely studies are to consider spherical cows. Still useful, but not the full picture.
> Anyway, if you can use storage to lower your nuclear costs by 5% overall, would it not be used?
Of course it would, and we do use reversible dams to optimize costs. My point is that:
1) that's a whole-production optimization, and the trigger for that optimization is increasingly renewables rather than optimizing for nuclear production stability
2) the difference between paying 5% more and having to cut consumers from the grid is massive. For any TSO, trying to save money is a goal, but curtailing consumption is an absolute failure criterion, second only to damaging the grid itself.
> Resilience research tends to favor decentralized solutions as well.
I might be biased by my personal experience, but when it comes to a power grid, planning ahead of time is much more valuable than decentralization. There is a difference between avoiding concentration of material, something which can be handled with central oversight, and decentralization, which creates a coordination cost and sometimes a loss of information (which is the case when small renewable production is connected to the distribution network rather than the transmission network).
The TSOs coordinate in ENTSOE, they are not responsible for actual energy production. But they are responsible for many aspects of system stability, and ENTSOE also coordinates security of supply studies.
How to make sure that everything that is needed in order to have sufficient production capacity in all scenarios is also built is not the responsibility of TSOs. Of course, when we consider how to develop flexibility and system-service markets, we are exactly talking about making sure that everything that is needed _is_ built. Not through centralized planning but by making sure we have a market where we pay people to provide what is needed.
The big questions, like energy only vs capacity markets, are directly about making sure the necessary investments are made, no matter what technology options are backing them up. You don't need central planning, but you need ESM studies to guide your market designs and long term political decisions (i.e. do we need a hydrogen grid).
And the fact that right now we don't have a market (or other incentives/regulations) for virtual inertia is a real problem worrying TSOs (much more so than the hypothetical issue of under-supply in 10-20 years that you raise for which the basic organisational mechanisms are in place).
> 2) the difference between paying 5% more and having to cut consumers from the grid is massive. For any TSO, trying to save money is a goal, but curtailing consumption is an absolute failure criterion, second only to damaging the grid itself.
Of course, but what is your point? That renewables require massive storage to ensure supply at all time? Yes. That's obvious.
But, as every study and common sense shows, it turns out massive storage is also helpful for a nuclear grid, especially once we get into electrifying the heat sector (an important point you have not acknowledged at all).
Finally security of supply studies also assume that conventional generators sometimes fail for unpredictable reasons. In Texas it wasn't the renewables, it was the gas system freezing over that almost fucked up the grid.
I still don't get what your initial complaint was about. The point I made was pretty explicitly: "Renewables require massive storage to solve the fact that they are not dispatchable. But actually so does nuclear for completely different, economic reasons." What was it that you object to in that statement? What are you "debunking"?
Finally: As to the idea that energy systems are optimized for reliability first, and that renewables threaten this somehow:
Germany had an average of 12 outage minutes in 2020 on 49% renewables.
US: 280 outage minutes on 23%.
Germany outage minutes have been trending down as well last I checked, with the general explanation I have heard being that the focus on renewables has forced people to coordinate better, and check their grids more thoroughly than they did before.
> Of course, but what is your point?
That you can't compare two requirements when the consequences for failing to meet them are so different. To make an iffy comparison, you don't need brakes on a skateboard just as much as you need brakes on a car, even if a skateboard would stop a bit faster with brakes.
Incidentally, a few remarks on some unrelated points that were brought up:
> Not through centralized planning but by making sure we have a market where we pay people to provide what is needed.
And we're seeing how well it's going.
> How to make sure that everything that is needed in order to have sufficient production capacity in all scenarios is also built is not the responsibility of TSOs.
No, but as coordinators of the grid, TSOs have a lot to say about the impact and viability of production choices. There's a reason why TSOs publish their own studies on this topic.
> much more so than the hypothetical issue of under-supply in 10-20 years that you raise for which the basic organisational mechanisms are in place
I don't remember raising that point. Long-term supply volume decisions are a political choice, on which TSOs have no unique insight.
> Finally: As to the idea that energy systems are optimized for reliability first, and that renewables threaten this somehow
You keep bringing renewables in the discussion, I really don't know why.
I am not sure I see the point of continuing this, other than that I initially thought you were clearly knowledgeable, yet you refuse to cleanly argue your point. You attacked me directly for bringing up that storage has a role to play for nuclear as well as renewables, and now you claim that renewables are not the point.
I never claimed that they played the same role either. Again, why is that an issue?
That said, the point of storage for nuclear is exactly also that you don't build out the peak load, so if large scale storage failure leading to undersupply is your worry (as you indicate elsewhere), then that is a concern for this scenario as well.
But again, why? Storage backed energy grids are already a reality: we were running Europe on stored Gas for much of the winter. We are not talking about hypothetical batteries but about storing methane/H2 and operating gas power plants in these scenarios. Nothing in this tech stack is unproven by now, it is primarily a question of economics.
You insinuate (without clarifying) that markets aren't working, yet we have fantastically reliable grids in Europe. Finally, the degree to which economics trumps reliability can be seen very directly in the energy-only vs capacity market debate. Energy-only puts the critical safety decisions in the hands of market actors. Negative and very high positive prices in the market are supposed to incentivise building plants that will only run for a few weeks a year and feeding them with flexibly produced gas/storage. Capacity markets put the decision of how much of this to build in the hands of the planners and engineers instead. Much to the horror of most engineers and planners I know, energy-only markets won the political debate a few years ago. It seems politicians really hate giving engineers money to spend on reliability directly.
This debate probably would work if we were at a bar or conference, but it's not really clear it makes sense here...
> yet you refuse to cleanly argue your point.
It's not a very large point, and I believe I fleshed it enough already.
> so if large scale storage failure leading to undersupply is your worry (as you indicate elsewhere)
I don't remember mentioning that. If we work under the hypothesis that we have large-scale storage available at reasonable cost, then yes, there is no reason not to use it. Currently, pumped hydro is the closest we have, and it's more or less capped out here.
> But again, why? Storage backed energy grids are already a reality: we were running Europe on stored Gas for much of the winter.
This is the first time I hear someone mention gas as storage, in the context of a power grid. If you forget emissions, then, I guess, why not. But if you forget emissions, 90% of the public discourse on the european power grid stops making sense. To date, I have knowledge of no large-scale environmentally viable way to produce this fuel in a renewable way.
> You insinuate (without clarifying) that markets aren't working, yet we have fantastically reliable grids in Europe.
The grid was already working well before the introduction of markets. Since then, in my neck of the woods we have seen a complete failure of investment in the production side. In other countries that already had a built grid like us, renewal was only possible with heavy public subsidies.
I would welcome an explanation on what you believe markets brought to an already working grid.
Power2Gas has been a major topic. The expectation that this will become highly relevant in about a decade is behind the decision to allow gas infrastructure investment to be labeled green in Europe. This means hydrogen but also synthetic natural gas.
https://en.wikipedia.org/wiki/Power-to-gas
Methanation and methane storage is at the same level as hydrogen storage in optimal models. This here has 500 citations:
https://arxiv.org/pdf/1801.05290.pdf
They also cite something I haven't personally read, but which goes back all the way to 2009, again with hundreds of citations:
M. Sterner, Bioenergy and renewable power methane in integrated 100% renewable energy systems, Ph.D. thesis, Kassel University (2009).
Of course, as the round-trip efficiency is better, it's preferable to store hydrogen directly if that is feasible (and this is why a pure electricity-sector optimization model will never show methane).
But fundamentally the idea has long been to run gas power plants on carbon-neutral gases. This is an explicit point in the EU Taxonomy labeling of gas investments as green: they only qualify if they are retrofitted to run on carbon-free gas from 2035 onwards. Siemens is selling their stuff as H2-ready for that exact reason [1].
Pumped hydro cannot be meaningfully expanded; it's only a small part of the overall solution.
[1] https://www.siemens-energy.com/global/en/news/magazine/2022/...
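For a sense of why the round-trip efficiency argument favours hydrogen over methanation, a small sketch (Python; all stage efficiencies are assumed ballpark values, not numbers from the cited papers):

    electrolysis = 0.70   # power -> H2 (assumed)
    methanation  = 0.80   # H2 -> CH4 (assumed)
    ccgt         = 0.60   # gas -> power in a combined-cycle plant (assumed)
    h2_turbine   = 0.55   # H2-ready turbine (assumed)

    print(f"power -> CH4 -> power: ~{electrolysis * methanation * ccgt:.0%}")  # ~34%
    print(f"power -> H2  -> power: ~{electrolysis * h2_turbine:.0%}")          # ~39%

Methanation buys compatibility with existing gas infrastructure and storage at the cost of another lossy conversion step.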
I can see that there's a lot of momentum for that, indeed. But as you told me earlier, we moved from a system where engineers and planners offered options to choose from to a system that is driven by political will, and implementation concerns come down the line.
Both systems have their flaws: the first one is clearly detrimental to democratic oversight, and can sometimes cause authoritarian problems, where administrators don't see, or overlook, the problems caused to individuals.
On the other hand, politics-first systems tend to kill unviable or uncompetitive projects way too late, because the signal from implementers is attenuated.
A good example of this is the development of nuclear power in France and the UK in the 60's: both countries had a local reactor project which was politically favored. France, where engineers had a strong influence on politicians, killed it quickly and bought US licenses for its program. The UK on the other hand moved on with their local graphite-gas reactors, which proved much harder to implement, and hampered their program.
All of this to point out that power2gas is currently at a very early stage, and since this program is dominated by political will, it's extremely hard to know how well it will work, let alone have a good idea of the economic figures.
Maybe it will work. I hope so. But it's certainly not a done deal, and it's extremely unsettling to see our countries' energy safety be debated based on a few scientific papers or a planned demonstrator by people that won't have to actually implement it.
The biggest downside (compared to solar wind) of nuclear always was payback time on investment.
Solar and wind are volatile and non-dispatchable, so you need something to balance them. Of course we can always say heck it and live with constant brownouts and highly variable energy costs, but that is a non-solution in my book. You can overprovision generation, you can build a dispatchable storage solution or a dispatchable alternative generation source. In the end you need this because of the variability and non-dispatchability of solar and wind. Nuclear does not have this property, ergo it does not need storage to prevent brownouts. Sure, some models suggest that storage could under certain circumstances maybe help shave off a few percent of the cost, but that is the extent to which nuclear needs storage.
And this brings us back to the original issue of payback time. Solar and wind have a lower payback time because the cost of variability is borne by the grid, i.e. a classic case of privatizing the profits and socializing the losses.
Until the outage of half your nuclear plants is similarly borne by the grid.
https://www.nytimes.com/2022/11/15/business/nuclear-power-fr...
> You can overprovision generation, you can build a dispatchable storage solution or a dispatchable alternative generation source.
Every model I have seen does that. And for Europe at least that total cost of doing all that is below nuclear.
> Every model I have seen does that
Not really. Both overprovisioning and dispatchable storage are hugely expensive, so dispatchable alternative generation is employed, which is gas. So in an attempt to move away from carbon-emitting generation, these models make carbon-emitting generation an integral part of the system.
What models based on renewables actually do is try to find a problem for a solution: throw grid stability requirements out of the window altogether, introduce huge variability, and hope that spending-power inequality will mask brownouts, i.e. it is hoped that pricing changes and consumer reaction to those changes will be fast enough for load to disconnect voluntarily before parts of the grid are forcibly disconnected.
Please show a single publication that does what you claim it does. A single one.
I don't think they exist, or if they do they will not be the prominent highly cited ones. Yes, storage and methanation and electrolysis are expensive. These expenses are explicitly modeled in the energy system models. Modeling these expenses for a complete system that satisfies the security of supply constraints is the very essence of what these models do [1]. Criticism against them would be exactly that they don't assume some market/price dynamics but a perfect central planner.
Flexible loads are a fundamentally different thing than brownouts. Electricity consumers opt in to being flexible, and this is about sophisticated industrial consumers, not private consumers anyway. The market mechanism you describe would only be true for energy-only markets anyway, and no one is proposing energy only markets for primary and secondary control energy that balances out short term variability.
[1] E.g. Figure 7 in https://arxiv.org/pdf/1801.05290.pdf gives the cost breakdown for a 0-Carbon configuration of generation storage (and optionally transmission expansion) investment that can serve all loads for the entire year under different assumptions for other energy sectors participation in flexibility provision. While this modeling set up is a lower bound on the costs, a deeper analysis shows that this is actually pretty accurate especially when storage investment is reasonably high, as the overall storage cost is dominated by seasonal variability which is highly regular.
That's what Russia is doing now. They already have one mobile power plant in operation.
https://en.wikipedia.org/wiki/Russian_floating_nuclear_power...
ThorCon (https://thorconpower.com/) is currently working on this approach.
A side benefit of being on or near the ocean is immediate access to a thermal sink. Nuclear power plants (and also fossil fuel thermal plants) divert substantial amounts of water for cooling. If fresh water is increasingly in short supply around the world, that's a problem.
The usefulness of the ocean as a thermal sink though depends on water temperature which can vary a lot depending on where you are.
Yes, this caused major issues in France this year: the surface water was so warm it was not ecologically sound to use it for cooling, so some nuclear plants had to shut down or throttle.
Seaside plants don't have this problem in France, never have, never will.
The issue happens with river plants, more specifically those that release cooling water back to the river.
This was vastly exaggerated, probably to distract the French public's attention from the fact that the vast majority of the plants taken offline were down because their maintenance had been postponed to the limit, and then a bunch of reactors, per regulations, needed to be turned off at the same time. Only a few of the plants had the thermal problem, the ones that discharge hot water directly into the rivers.
> The usefulness of the ocean as a thermal sink though depends on water temperature which can vary a lot depending on where you are.
No. The difference between pumping water at 15°C or 30°C is negligible when you're cooling water at 325°C. You may have to pump a tiny bit more water, but that's a non-problem when you have an ocean.
Fair enough.
Is it actually cheaper and easier? Seawater isn’t exactly known for its ease of use and non-corrosiveness and fouling is a real problem.
Not to mention all the sea life that isn’t going to respond well to high temp water spewing out..
> Not to mention all the sea life that isn’t going to respond well to high temp water spewing out..
Ideally, you're not dumping the hot water back into the harbor. At least from a naive theory perspective, you could put huge cooling towers on deck and cool the reactor 100% through evaporative cooling. Bonus: the huge steam/mist cloud out of the cooling tower could help delay global warming through marine cloud brightening [0]. Whether the deck of a floating reactor has enough space and stability to support cooling towers of the appropriate size is a question for the engineers.
That's how most reactors in first-world countries with temperate climates work anyway, because it really messes up the biosphere of your river/coast if you just dump 3 GW of thermal power straight into the water.
That's too naive. The output is indeed cooled, but only enough to be a fraction hotter than the input. Then dumping is safe again.
This is also why most of those power plants have to shut down: it's not a physical lack of water, it's that the input is already so hot that the output would be over the upper limit.
Nobody builds enough cooling towers to cool with them "needlessly".
Which is also why global warming is disastrous for the energy sector: those older power plants just don't have enough cooling capacity for the new, higher temperatures.
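To put a number on the evaporative-cooling idea above, a quick order-of-magnitude check (Python; round assumed values):

    p_electric = 1.0e9       # W of electrical output for a ~1 GWe LWR (assumed)
    thermal_eff = 0.33       # steam-cycle efficiency (assumed)
    waste_heat = p_electric * (1 / thermal_eff - 1)   # ~2 GW rejected

    latent_heat = 2.26e6     # J/kg to evaporate water
    evap_rate = waste_heat / latent_heat              # kg/s

    print(f"waste heat ~ {waste_heat / 1e9:.1f} GW")
    print(f"evaporation ~ {evap_rate:.0f} kg/s, ~{evap_rate * 86400 / 1e6:.0f} thousand m^3/day")

So on the order of a thousand kilograms of water per second, which is why cooling towers and their plumes are as big as they are.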
You might want to take a second look at IR (heat) absorption spectra:
https://www.e-education.psu.edu/meteo3/sites/www.e-education...
> At least from a naive theory perspective, you could put huge cooling towers on deck and cool the reactor 100% through evaporative cooling.
That's usually not done with saltwater plants, because there are more issues with mineral deposits, i.e. salt.
Nuclear subs and carriers operate in salt water all the time. It's "just" chemistry
Nuclear subs and carriers produce a fraction of the power of a commercial nuclear power plant.
I'm not sure why that matters? Salt water is salt water. A carrier's brake horsepower is ~200MW, so a fraction, but a reasonable fraction. When San Diego had a county-wide power outage in 2011, the carriers at North Island pushed power to the grid to support essential services.
Smaller-size systems, along with less budgetary oversight, mean higher maintenance costs are less likely to be problematic.
Why did no one buy a floating reactor?
Too expensive.
Also, the AP1000 takes a superheavy forging press which can handle a few hundred tons of steel. There are only so many of those, only in certain countries (North America does poorly), and those presses can build a limited number of reactor vessels.
A smaller reactor vessel needs a smaller press, which is easier to find.
The problem is they are constrained by the needs of the one customer they definitely have, the US Navy. It's their need for reactor power plants for their nuclear submarines and aircraft carriers that is funding the development of these systems and setting the basic requirements. The commercial market is an afterthought.
You would think it would be economical to just move the workforce around with the projects. You get the talent, but build in place.
It's harder to find quality people who are willing to relocate their family every time a job is finished.
A large portion of the workforce which recently completed Vogtle units 3 and 4 temporarily relocated to the site from around the country. Many people lived at RV camps opportunistically setup by farmers or others with large empty lots. Especially if you had family, living out of an RV was a good way to minimize expenses and maximize how often you could afford to travel back home.
At big jobs like these, especially outside a handful of major metro areas, there will never be enough local labor. Even for "unskilled" positions, you don't want too many unproven or inexperienced people as they're likely to cause trouble, such as hurting themselves or others, or not consistently showing up on time--schedules can be both grueling and erratic, and many people can't hack it.
Tesla could never reach full capacity at GF1 because they couldn’t find enough people willing to move to Sparks, NV. Also why they picked Austin for that GF.
The nomad tradespeople you mention exist, but they are rare, and most likely dwindling as they get old and die and no one replaces them (a problem in all trades currently).
Yeah, I was thinking if you can afford to pay people to go live on an oil rig you could pay people to live in a small town in a rural area.
To be fair, in a lot of cases you'd probably build multiple units on the same site which would get you some learning effects. But compared to wind, solar, and batteries...
Do you know many pipe fitters?
I believe that is how KEPCO operates on the reactors they build outside of Korea.
It's not only the workforce; it is also the difference in bureaucracy and regulations between states which limits knowledge transfer.
Time to quote the late Admiral Rickover, again, who described the AP300 in quite a lot of detail back in 1953. [1]
"An academic reactor or reactor plant almost always has the following basic characteristics:
1. It is simple.
2. It is small.
3. It is cheap.
4. It is light.
5. It can be built very quickly.
6. It is very flexible in purpose (“omnibus reactor”)
7. Very little development is required. It will use mostly “off-the-shelf” components.
8. The reactor is in the study phase. It is not being built now."
1. https://whatisnuclear.com/rickover.html
Seriously? I’m insanely pro-nuke, and work in the nuclear sector, but Westinghouse should really focus on their AP1000. They’ve already got deals in place with several countries, and now that they’ve got them running in more than one country, there might be a chance of bringing costs down some. The light water SMR market is getting crowded rather fast. They’re competing with both GE (and their BWRX-300) and NuScale (with their modular designs).
Given the track record of AP1000 construction, who do you think would ever order one?
I don't see a future for this. If somebody thought it was a feasible design for construction in Western countries, they could buy the half completed sites at Summer for a song.
I also think 300MW is too big. It's not really small enough to get the supposed benefits of small and modular.
The entire nuclear industry appears to be off its rocker when it comes to designs that are possible to construct for any reasonable amount of money. They bungled the AP1000 so badly that it bankrupted Westinghouse, and there's no reason to believe that they learned enough from the experience to do better next time. It's all terribly depressing, and completely soured me on the possibility of nuclear for the next 20-40 years. And in 20-40 years, look at how far storage, wind, and solar will have developed. It's unlikely that nuclear will ever be a dominant energy source.
What's sad is that it's mainly the US that has problems with this. China has managed to bring multiple reactors of this design to fruition while the US still screws around with building just two. I live in Georgia, and the almost decade of overruns and corruption is about to be paid for by Georgia Power customers via rate hikes.
France and Finland have had excessive difficulty with their EPR.
My hypothesis is that modern economies with high wages will simply not be able to build a nuclear reactor economically. It appears to require a certain amount of 20th century technological advancement, but not so much economic advancement that labor costs are too high.
France had a much better track record with nuclear in their first round of building. France can also build large construction projects without the massive cost overruns that the US has had. But they can't seem to get a next gen reactor built.
The first EPR project in France (Flamanville-3) had problems documented in an official report (dubbed 'Folz', per the name of its main author), sadly AFAIK it wasn't translated into English: https://www.economie.gouv.fr/rapport-epr-flamanville
A piece about it: https://www.archyde.com/the-folz-report-draws-up-a-severe-as...
This study may also offer some hints: https://www.sciencedirect.com/science/article/pii/S030142151...
Because those were the first EPRs ever. Now that most of the caveats and issues have been discovered, the next ones should be drastically faster and easier.
> France can also build large construction projects without the massive cost overruns that the US has had
Not always; some easily go overboard, like the Parisian Philharmonic, or the absolutely massive Grand Paris Express (~200km of new high-capacity metro lines around Paris), which has seen its budget go up from an original estimate of 19 billion to 35 billion (euros). Completely normal considering the scale and complexity, but still.
Hinkley Point C, started 12 years later, is going swimmingly I hear! Going to cost the consumers ~$150/MWh and it is starting to look likely that EDF even at that incredibly high price takes a loss on it.
> Since construction began in March 2017, the project has been subject to several delays, including some caused by the COVID-19 pandemic,[10] and this has resulted in significant budget overruns. As of May 2022, the project is two years late and the expected cost is £25–26 billion,[11] 50% more than the original budget from 2016. It is currently planned to be commissioned in June 2027 and has a projected lifetime of 60 years. In February 2023, EDF announced that costs would rise to £32.7bn and completion would be delayed by a further 15 months to September 2028.[2][12]
https://en.wikipedia.org/wiki/Hinkley_Point_C_nuclear_power_...
Started before all problems had been fixed with Flamanville and Olkiluoto, it's a part of the initial generation.
Do you have any examples of a very high cost reactor dropping by a significant amount on a subsequent build?
South Korea was able to eke our small decreases in cost (though some people have ended up in jail for falsifying record keeping). But otherwise, costs tend to rise. And I'm definitely not aware of a drop of 50% or so, to bring these excessive costs into the realm of competitiveness.
Isn't the definition of insanity doing the same thing over and over and expecting different results?
When building complex infrastructure for the second/third/etc. time? Of course not. Economies of scale is the term you're looking for.
When has nuclear ever had economies of scale?
Hinkley is the fourth time, still no dice.
It's a bit simplistic to blame only labour costs. Large, complex civil projects often have cost and schedule overruns because contractors know they're going to get paid anyway so are incentivised to make the project slower and more expensive.
Smaller, simpler projects don't tend to have the same issues because contractors know they can be replaced easily if they fail.
Renewable projects also can be rolled out in stages, so if the contractors are scammers it shows up early.
The French scenario could have been avoided if much better loans had been secured. Delays skyrocketed costs due to the fat percentages on those financing agreements.
Which is big-project 101 anyway. Hard to understand why the French project wasn't deemed too risky by investors.
Interesting hypothesis…
The whole industry, and its fan base, has descended into overt magical thinking. It is sad, and not a little ridiculous.
Some older plant blocks were in the 300-500 MW range so I guess you could replace them with this ?
Those older plants in the US have been shut down because they can't even earn back their operating costs.
This seems to be a smaller version of the AP1000. Nice big containment vessel. That's a good thing. Fukushima had an undersized containment which failed, and Chernobyl didn't have much of one at all. Three Mile Island had a good containment vessel, and so their meltdown damage was contained.
Some of the "small modular" nuclear reactor designs are built on the assumption that they don't need a full-scale containment vessel because nothing can go that wrong. That may not be a good assumption. There's a lot to be said for large amounts of steel and concrete around reactors.
Which SMRs don't have a full scale containment vessel?
None, although I think some (NuScale?) save money by sharing pressurized containment volume among multiple reactors. Just hope you don't see multiple correlated meltdowns.
Containment vessels could also be made smaller if steam filtering were used instead of planning to just contain and condense it. I don't know if any SMRs plan to do this.
NuScale's containment vessel is a metal cylinder around the reactor.[1] Each reactor has its own containment vessel. It's just slightly larger than the reactor, like Fukushima and Peach Bottom. Here's a good drawing.[2] No user-serviceable parts inside. The building is not a containment.
[1] https://www.nrc.gov/docs/ML1535/ML15355A411.pdf
[2] https://spectrum.ieee.org/media-library/image.jpg?id=2557419...
As other poster mentioned, each nuscale module is fully self contained. Not only that, but they've modeled every module having a simultaneous scram in a 12 module plant, and modeled that the water level was lower than nominal. Still walk away safe.
Didn't the PRC buy 4 AP1000s and then start building their own knockoff? (not pirated so much as that was one of their demands and Westinghouse accepted)?
For anything bigger than 1600MW, Westinghouse doesn't get a licensing fee for it getting built. So they are building the 1600MW size right now.
SMRs are being sold as smaller, implying a smaller mess when disaster strikes, but I don't see that anywhere. Can you shed some light on whether it's safer? Perhaps the smaller containment means it can be constructed to a higher threshold?
Smaller tends to mean it's easier to make things like passive emergency cooling systems (e.g. using convection and gravity), so you're not dependent on the diesel gensets actually starting when they're really needed.
Maybe for the immediate future. But their best prospects by far lie in their LFR (Lead Fast Reactor) concept.
Lead is such a superior concept for reactor design and neutron economy, and mostly held back by the pace of materials science.
Discussed 9 hours ago, 140 points, 99 comments https://news.ycombinator.com/item?id=35816789
Highlight of the thread, IMHO, from 'beefman:
https://news.ycombinator.com/item?id=35818261
>Not small, modular, or new. But certainly a decent reactor. They clearly want something to slot in against the GE BWRX-300. But they would probably be better served by going all-in on their eVinci reactor, which is small, modular, and new
>https://www.westinghousenuclear.com/energy-systems/evinci-mi...
Tiny reactors like that have been suggested for remote locations, like mining sites, but they make little sense. Renewables + batteries + diesel would be cheaper.
We could use it here in South Africa.
The bulk of our power comes from coal-fired plants, but due to low-quality coal (chosen for cost reasons, and thus high in sulphur) and a lack of preventative maintenance, there are boiler tube leaks nearly every week.
Some of our plants are at the end of their lives and their replacements are no better off, due to corruption and technically incompetent builders (looking at you, Hitachi South Africa).
7000MW is out of commission, so we get daily power cuts for half the day.
Those like myself who have the ability just install home solar systems for some measure of sanity while most companies are using diesel standby generators for their infrastructure.
Walk into an office park and you can hear the chug-chug and smell the fumes.
South Africa almost got something similar, the Pebble Bed Modular Reactor (PBMR). Well ahead of its time and excellent technology, with Westinghouse involved as a partner at one point and having bought some of the IP. [0]
If only that had been continued.
[0] https://en.wikipedia.org/wiki/Pebble_bed_modular_reactor
Pebble bed reactors have problems. The fuel is more expensive to fabricate, activated dust comes off the pebbles as they abrade each other, the pebbles can crack and jam, and the volume of the spent fuel is high, increasing the cost of dealing with it.
We'd need 23 of them. Depending on the cost, it would be better than nothing. Currently, I think it costs R10 million to generate 1MW through solar, so R70 billion with solar.
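The arithmetic behind that (Python; the R10 million per MW solar figure is the commenter's own estimate):

    shortfall_mw = 7000
    unit_mw = 300
    print(shortfall_mw / unit_mw)                    # ~23 AP300-sized units

    solar_cost_per_mw = 10e6                         # R10 million per MW (commenter's figure)
    print(shortfall_mw * solar_cost_per_mw / 1e9)    # ~R70 billion for solar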
These are pretty big SMRs - 300 MWe, compared to 77MWe from NuScale. Definitely an iteration of reduced size from their 1 GWe model, as opposed to an "as small as reasonable" design.
It's cool that they call out 15 MWe / min "load following" capability. Ramping up and down in response to renewables will be an important function of any nuclear reactor installed today.
> Ramping up and down in response to renewables will be an important function of any nuclear reactor installed today.
Realistically, it's not going to happen. PWRs are inherently stable; that's one of their big selling points. If anything goes wrong and the reactor becomes too hot, the water expands and moderates the neutrons less, so they don't slow down enough to trigger fission events, and the rate of fission decreases. If for some reason the reactor stops producing enough heat, the water cools down and more fission starts happening. And there are other negative feedback processes in place too, for example Doppler broadening [1].
This is great for safety, but not so great for load following. If you want to load follow, it's probably going to be many times cheaper to just invest in a bunch of batteries.
[1] https://en.wikipedia.org/wiki/Doppler_broadening#Application...
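For the shape of that feedback, a toy lumped model (Python; this is not a real reactor model, and the coefficients are invented just to show the sign of the effect):

    alpha_T = -2e-4   # reactivity per °C of coolant temperature rise (assumed magnitude)
    gain = 40.0       # toy coupling: relative power growth rate per unit reactivity
    t_rise = 50.0     # °C of equilibrium coolant temperature rise per unit of excess power (toy)
    tau_T = 20.0      # s, lumped coolant thermal time constant (toy)
    dt = 0.1

    power, temp, temp0 = 1.0, 305.0, 300.0   # start with the coolant 5 °C too hot
    for _ in range(int(300 / dt)):
        rho = alpha_T * (temp - temp0)       # hotter water -> less moderation -> negative reactivity
        power += dt * gain * rho * power     # power responds to net reactivity
        temp += dt * (temp0 + t_rise * (power - 1.0) - temp) / tau_T   # coolant relaxes toward power-set equilibrium

    print(f"after 5 min: power ~ {power:.3f} (nominal 1.0), coolant ~ {temp:.1f} °C")

Power dips while the excess heat is carried away, then both settle back to nominal, which is the inherent stability being described (and also why the reactor resists being pushed around quickly).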
That depends on what they mean by load following. While you can load-follow by changing reactor power (like the French do extensively). Westinghouse has long been promoting thermal storage based load following in their other reactor designs [1]. Where instead of perturbing the reactor's power, you divert the thermal output to a molten salt thermal battery when you want to decrease power suddenly, and use the battery to pre-heat feedwater when you want to increase power suddenly. For their LFR design they are claiming they should be able to load follow within 65-125% of nominal full power (ramping at 10%/minute). As long as the load-following averages out to 100% power over a long/short enough time period the reactor never has to change power level.
The only things really needed to do this at any thermal plant are to oversize the steam turbines, install some piping, and build an insulated salt tank.
Of course Westinghouse hasn't built any plants with that feature since it doesn't make economic sense without variable energy pricing.
[1] "Status Report – Westinghouse Lead Fast Reactor," (Westinghouse Electric Company LLC, United States of America), https://aris.iaea.org/PDF/W-LFR_2020.pdf
Light water reactors (like the AP300) don't get hot enough for molten salt storage. That requires one of the high temperature Gen IV reactors.
That just means for LWR applications you need to switch the storage media. Any thermal plant could implement heat storage if they identify a medium with a large latent heat of fusion that is around where you want to preheat your working fluid to.
The salt used for energy storage is normally a eutectic mixture of ~60% sodium nitrate and ~40% potassium nitrate (not NaCl) and has a melting temperature of ~260 °C. The secondary on a PWR has a maximum temperature of 275 °C. So within the liquid range of typical thermal storage salts, though I suspect finer tuning of the salt composition would be used to reduce the melting temperature to closer to the feed water temperature of ~220 °C.
Heat storage is technically possible at any temperature, but it's much more cost effective at 600°C than 300°C.
And if you're going with batteries, you might as well charge them with a source that has low levelized cost. That's not nuclear.
> 15 MWe / min "load following" capability
I know nothing of nuclear. What sets this rate? Is it some system level thermal gradient limitation? Or maybe the complexity of the safety around the movement?
Mostly it is the half lives of the reaction products, sometimes the half lives of the decay products or intermediate products. Ramping up and down was historically somewhat difficult, famously one of the contributing factors to the Chernobyl accident was the reactor having been turned down to a very low level on the previous day. You have to be careful not to accidentally poison your reactor during the day when the solar farms are chugging out megawatts only to not be able to ramp up in the evening when the sun sets.
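A toy integration of the iodine-135 / xenon-135 chain shows the effect being described (Python; the rates are simplified and the capture term is an invented toy value, so only the shape matters):

    import math

    h = 3600.0
    lam_I  = math.log(2) / (6.6 * h)     # iodine-135 decay constant, 1/s
    lam_Xe = math.log(2) / (9.1 * h)     # xenon-135 decay constant, 1/s
    burn   = 1.0 / (10.0 * h)            # xenon burn-off by neutron capture at full power (toy)
    direct = 0.1                         # direct fission yield share relative to the iodine path (toy)

    def run(power_at, hours=40, dt=60.0):
        I  = 1.0 / lam_I                                   # full-power equilibrium iodine
        Xe = (lam_I * I + direct) / (lam_Xe + burn)        # full-power equilibrium xenon
        peak = Xe
        for step in range(int(hours * h / dt)):
            P = power_at(step * dt / h)
            I  += dt * (P - lam_I * I)
            Xe += dt * (lam_I * I + direct * P - (lam_Xe + burn * P) * Xe)
            peak = max(peak, Xe)
        return peak

    steady = run(lambda t: 1.0)                       # hold full power
    cut    = run(lambda t: 1.0 if t < 1 else 0.2)     # drop to 20% after one hour
    print(f"xenon peaks at ~{cut / steady:.2f}x the steady level, hours after the cut")

Xenon-135 is a strong neutron absorber, so that post-ramp-down peak is exactly the poisoning window in which ramping back up is hard.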
Part of it is also that the whole system has a large amount of "inertia". We are talking about megawatts of power, first in heated water and then passed through a massive turbine. This whole process carries on for a bit. And you want it to run the grid and not the other way around.
> And you want it to run the grid and not the other way around.
As simple as it is, this really blew my mind. I never considered that all the plants need to work together, as a system, each with "inertia", but all without ringing/oscillations. Thanks, I'll need to dig into this deeper!
I can only assume that having grid batteries can really clean things up.
The turbines have some perks in this. Batteries can supply power, but they are not as good at smoothing things out as the actual large physical masses in thermal power plants and hydro plants.
But it is kind of amazing to know that all of these generators run essentially in sync. If one slows down, all of them slow down.
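A toy aggregated swing-equation sketch of that shared inertia (Python; all numbers are assumptions, and real grids have far more dynamics than this):

    f0 = 50.0           # Hz, nominal frequency
    H = 4.0             # s, aggregate inertia constant of the remaining machines (assumed)
    load = 1.00         # per-unit system load
    gen = 0.97          # per-unit generation right after losing a 3% unit
    droop_gain = 20.0   # per-unit extra output per unit of frequency deviation (governor response, assumed)

    rocof = (gen - load) * f0 / (2 * H)        # initial rate of change of frequency, Hz/s
    f, dt = f0, 0.05
    for _ in range(int(20 / dt)):              # simulate 20 s
        gen_total = gen + droop_gain * (f0 - f) / f0   # governors open as frequency sags
        f += (gen_total - load) * f0 / (2 * H) * dt

    print(f"initial ROCOF ~ {rocof:.2f} Hz/s; frequency settles near {f:.2f} Hz")

The initial rate of fall is set entirely by the stored rotational energy (H); batteries can inject power very fast, but providing that first instant of resistance synthetically ("virtual inertia") is the part that still needs market incentives, as mentioned upthread.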
The so far most successful SMR, BWRX-300, is also that size, and also a scaled down version of a long lineage of reactors. They are shooting for the same price target as well. So they are just going where the market is going.
Question about SMRs generally: does the fact that random private companies can apply to buy one of these, mean that nuclear reactors are now basically "state-secret free" as a technology? Or are there still state secrets, but they're all in the e.g. uranium enrichment part of the pipeline — that buying one of these gets you no closer to seeing — rather than in the reactor design?
All power reactors are free of state secrets and owned by private institutions. Some information is required by law to remain secret, such as locations of security hardening features, cameras, guns, etc.
Good old Eisenhower and Atoms for Peace, 1953.
> Some information is required by law to remain secret, such as locations of security hardening features, cameras, guns, etc.
This was more what I was trying to motion toward — if you sell a power reactor to the private market, aren't you basically guaranteeing that "locations of security hardening features" and so forth will be able to be scrutinized at leisure by foreign state actors who can then use that information to attack other power infrastructure made to the same template?
True, this information will also be available to everyone in short order, but I don't think it's a big problem.
Secrecy is a sliding scale, and delaying and thinning out access to this information makes it "more secure", but surely some cleaning person will tell their buddies over a pint.
The security of nuclear power stations themselves is overseen in the US by the federal government - there’s a good description here:
https://www.nei.org/resources/fact-sheets/nuclear-plant-secu...
I worked on automated conversion of a large collection of documents for nuclear power stations some years back, and there was a lot of bureaucracy around who could access the documents, and how. I wasn’t supposed to take any files home, for example, which meant I couldn’t copy them onto a laptop. Management claimed that this was due to federal laws, but I didn’t ever investigate the legal basis.
Unintuitively, the principles nuclear reactors operate on are really very simple. Put enriched uranium rods close together and they get hot, hot enough to create steam and hence electricity. The complexity comes from designing and engineering for the reliability and safety scenarios. So no, no state secrets.
Building a nuclear warhead requires assembling a large sphere (ish) of material well above critical mass faster than it can explode. Only then does the chain reaction efficiently consume a lot of fuel before it breaks up (inertial confinement). Usually, you need to use explosives to blast the different pieces of fuel into each other. This requires very accurate control of the explosions that ignite the warhead, and is where the secrets are.
A nuclear reactor uses fuel juuuust barely above critical mass so the reaction is slow. It's a very different thing. Producing the fuel is just basic centrifuge isotope separation, which has many non-fission applications.
Well, not really. The fuel in a typical reactor is low-enriched uranium (LEU), meaning that only a small percentage of the fuel is actually fissile (i.e. it breaks apart when interacting with one neutron and releases energy). So no, it can't sustain an exponentially growing chain reaction under any conditions, since neutrons are absorbed by the non-fissile part of the fuel.
> juuuust barely above critical mass
Interesting. Everyone says that nuclear reactors are safe and the material can't be used to build a bomb; how are we ensuring that we're always only a tiny bit above CM... and that this bit doesn't grow unexpectedly?
As a layman, I see many failure modes, starting with simple accidents in the handling of rods or bad storage.
It's actually just below Critical Mass. (For obvious reasons).
You need sufficient fissile material to provide the increased neutron flow generating heat which you can use to power generators, but you have to then extract that heat efficiently to prevent your nuclear core from literally melting.
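For a sense of how narrow "just barely" is, a tiny illustration (Python; the generation time is a rough assumed value, and delayed neutrons, which are what make real reactors controllable, are ignored):

    generation_time_s = 1e-4     # rough prompt-neutron generation time in an LWR (assumed)
    seconds = 1.0
    generations = int(seconds / generation_time_s)

    for k in (0.999, 1.000, 1.001):
        print(f"k = {k}: neutron population x{k ** generations:.3g} after {seconds:.0f} s")

A multiplication factor a tenth of a percent off in either direction changes the neutron population by orders of magnitude within a second, which is why operation sits so close to k = 1 and relies on control systems plus the negative feedbacks discussed elsewhere in the thread.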
This is small: https://www.radiantnuclear.com/
Yes, I understand it isn't comparable (1.2MW). But portable doesn't require construction. Surely something in-between can be built?
If you like that one, you'll also love an actually-operated version of this: ML-1 from the 1960s. It operated out in Idaho at the National Reactor Testing Station.
That article is _hilarious_
"Extensive shielding was omitted in favor of a personnel exclusion zone of 500 feet (150 m) while in operation"
There's no "Add to cart" on the page. What kinda online store are they running?
I like how there’s a contact form at the bottom where you can presumably write something like “one nuclear reactor, please.”
My thoughts exactly.
This is the nuclear industry's third attempt at "SMR". It is what they bounce to when the large-scale reactors get too expensive.
"THE FORGOTTEN HISTORY OF SMALL NUCLEAR REACTORS"
https://spectrum.ieee.org/the-forgotten-history-of-small-nuc...
Reactors are not small enough until they are rolling off an assembly line.
They don't have to be small to roll off an assembly line.
France: https://twitter.com/Mangeon4/status/1632627857837924352
Florida: https://whatisnuclear.com/offshore-nuclear-plants.html
Have they sold any of these to anyone yet?
Of course not; they first have to get approval from the Nuclear Regulatory Commission. They estimate this would take 3 years. I don't find that unrealistic, since it's based on an existing (i.e. approved) reactor design, the AP1000. NuScale got their approval last year, and they applied in December 2016 [1]. So about 6 years for a design submitted by a new company with no prior experience whatsoever.
[1] https://www.nrc.gov/reactors/new-reactors/smr/licensing-acti...
Nit: it's "Modular"
Too big.
How do I write my state to buy one?
I need one.