But even if you have a working stellarator, that's a very long way from an economically viable energy source. You've still got to a) figure out how to cheaply convert the released energy into electricity (and the baseline way of doing that in D-T fusion is...a steam turbine), and b) figure out materials that can survive the radiation bombardment for a sufficiently long time.
In sunny places (and I fully acknowledge that's not all of the world) it's still going to be hard to beat sticking bits of special glass out in a field and connecting wires to it.
But we should sure as heck keep tinkering away at it!
I don't think the point of this project is to be closer to economic viability, but to demonstrate an approach that would lead to faster economic viability due to allowing faster iteration and evaluation of small scale experimental designs.
In HN terms they are demonstrating a significantly faster REPL by keeping the project small and minimising use of esoteric or highly bespoke components.
It's the closest you can get to building your own stellarator by walking into radioshack. I think it's a pretty cool idea.
Not sure why steam engine is written like it's some crazy reveal. It's the accepted way of doing it in most other fossil fuel plants, to the point that there's a saying for it - someone, somewhere X is boiling water so you can boil water. There are other designs that have been proposed, but I'm not aware of anything in production that takes heat and turns it into electricity at powerplant scale. E.g. RTGs exist, but only if you need a computer power supply's worth of power for decades.
Perhaps because the steam part alone, even if powered by pixie dust and magic, can't compete on price with solar, and probably not with solar plus batteries either (it varies with latitude and time, but the cost is going down and the viable latitude band is getting wider).
I think OpenAI is investing into a fusion design that avoids steam for exactly this reason, so it's not just an anti-nuclear talking point.
Newer solar panels don't need full sun to function. It's economically viable to place them further north and in cloudier climates now. So the area where these alternatives are viable may be shrinking faster than you would expect.
Indeed, though in colder climates you do have the problem that peak electricity demand is precisely when you get minimum solar production.
But in sunnier, warmer parts of the world (which notably includes India, Pakistan, Bangladesh, Indonesia, Nigeria, Egypt, Ethiopia, Iran, Mexico, and Brazil, amongst others), over the next few decades it's hard to see anything much competing against solar and batteries for the bulk of energy usage.
That isn't true yet, because most cold places don't use electricity for heating; in Boston, for example, the norm is oil or gas furnaces. Europe typically uses gas for heating, as we all learned at the beginning of the Ukraine war. When I lived in the Boston area, our electricity use peaked in the summer during air-conditioning season. That is, until we got a heat pump; then our electricity use spiked like crazy in the winter. But we are a long way from having the majority of homeowners using heat pumps for their primary heat source. (Also, heat pumps are kinda awful compared to furnaces in my experience, so we might never get there.)
>heat pumps are kinda awful
Heat pumps work great under the right circumstances. In cold climates like Boston and north of it, ideally you want ground source, which means digging deep (or wide) holes to flow water through instead of just pulling heat from outside air. Alternatively (or additionally, ideally), they're best used in homes with very tight air envelopes, good insulation, and heat/energy recovery ventilation (HRV/ERV) systems. Installed in radiant heating systems in well-built houses, heat pumps are fantastic and you'll be comfortable all year for pennies on the dollar.
The reality is that most (US) construction, especially older, is just terrible in terms of air seal and insulation. Couple that with a potentially undersized air-source heat pump, which gets inefficient as the outside temp gets near the low end of its operating range, and you will not have a great time. But even then they are a good way to supplement gas heating, so you can limit furnace use to the coldest weeks of the year and do the environment a favor, as well as cool in the summer.
The issue with supplementing with gas is that you still need a gas furnace. And if you have a gas furnace, why spend $10k+ on a heat pump?
You shouldn't spend that much. If you need an air conditioning system anyway, the extra cost of making it a bidirectional heat pump is negligible unless people are trying to rip you off. (They are)
Comparing a cold-climate AC to a cold-climate heat pump shows how confused YOU are.
ACs in cold climates are cheap. Cooling a house down 20 degrees doesn't take a lot of BTUs. Plus they're just cheaper per BTU, and a typical northeast homeowner is going to be comfortable installing a cheap brand because it's not life-or-death if your AC goes out.
When you start talking about a heat pump being the primary heat source in a cold-climate home, that's a different ball game. First, you need WAY more BTUs to heat a house at winter lows: even NYC hits single digits most winters, and that's a 60 degree temperature differential. And you need to be prepared for the worst temperatures, which basically necessitates a top-end brand like Mitsubishi that is rated down to historic-ish lows.
Not to mention that many homes don't have central ducts, so converting to a heat pump means ductwork or mini splits.
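A very rough way to see that sizing gap, treating envelope heat loss as simply proportional to the indoor-outdoor temperature difference (this ignores solar gain, infiltration, and cold-weather COP derating, so it's only a sketch with the temperatures mentioned above):

    # Steady-state conduction loss is roughly proportional to the temperature
    # difference, so a 60 F heating differential needs ~3x the BTU/h of a
    # 20 F cooling differential for the same building envelope.
    cooling_delta_f = 20.0   # e.g. 95 F outside, 75 F inside
    heating_delta_f = 60.0   # e.g. 10 F outside, 70 F inside

    print(f"Heating load is ~{heating_delta_f / cooling_delta_f:.1f}x the cooling load")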
No, the difference in cost between central air conditioning (a heat pump that works in one direction) and a heat pump that works in both directions is negligible unless you're being ripped off.
Here’s a concrete example that took 5 minutes to find. Two condensers from the same company, same seller, same SEER, same series, same tonnage, except the second does heat. The heat pump is 29% more expensive.
2 Ton 14.3 SEER2 Trane Air Conditioner Condenser - RT Series. $1615
2 Ton 14.3 SEER2 Trane Heat Pump Condenser - RT Series. $2090
https://hvacdirect.com/trane-a4ac4024d1-2-ton-14-3-seer2-air...
https://hvacdirect.com/trane-a4hp4024d1-2-ton-14-3-seer2-hea...
That very quickly pays for itself.
And despite the handwringing, air source heat pumps can work fine in pretty cold places (eg Norway).
You can, right now, buy air source heat pumps rated to -28C (-14F). While not enough for record lows in, say, Chicago or Toronto, it's plenty for places like New York, Seattle, and Vancouver.
Rated to -14F doesn't really bring a lot of comfort. NYC has reached that temperature; NJ has reached -34. In a typical winter you might stay above zero every minute of every day. But when that big arctic blast comes through and your heat pump efficiency tanks, right at the same time your house is bleeding its heat faster than ever, that could be pretty dangerous. So, every heat pump installer recommends having a backup (at least the dozen I've talked to in the Northeast US). It's hard to convince regular people that heat pumps are the future when you're also told to have a backup.
This year I installed a heat pump with a 10kW auxiliary heater for my home in southern Ontario. With an appropriate air handler, both the heat pump and auxiliary heater can run at the same time. The setup seems fine to me.
If anything, having a backup heat source makes me feel secure. A lot of things can go wrong with a gas furnace; some things can go wrong with a heat pump; but 10kW of electric resistive heat is dead simple, and just by itself it can provide 34,000 BTU/h, which is 60% of the output of my old gas furnace. I don't think the duty cycle of my gas furnace ever exceeded 60% running time.
The auxiliary heater consists of two 5kW heaters, so even if one fails, there is still the other.
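For reference, the 34,000 BTU/h figure is just the standard kW-to-BTU/h conversion, nothing more:

    # 1 kW = 3412.14 BTU/h, so a 10 kW resistive heater delivers about 34,000 BTU/h.
    BTU_PER_HOUR_PER_KW = 3412.14

    aux_heater_kw = 10.0
    print(f"{aux_heater_kw * BTU_PER_HOUR_PER_KW:,.0f} BTU/h")  # ~34,121 BTU/h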
Do you worry about losing electricity though? A bad blizzard that knocks out power could be dangerous.
That is a worry. Of course, a power outage would knock out my old gas furnace too; without the blower running, etc., the gas furnace wouldn't run.
True, but you could always have a battery backup for the blower and such.
Do you have a wood burning back up? At my house in Massachusetts, we had a wood-burning stove that we could always fire up in a pinch. It never came to that, but it felt good knowing that we can always just burn a super hot fire and not freeze to death.
While I could have had a battery backup for the blower, I never did, so I guess I'm not missing out. :)
I live in an urban environment, so I'm not too worried about losing power for an extended period of time, and if I do I would probably just leave temporarily.
In my rural living fantasy I have a Ford F150 Lightning which I use as a "portable" backup battery that can be recharged by driving it to a Level 3 charging station. That and either a pellet stove and/or a Masonry Heater fireplace that sits opposite to a large equator (south) facing window.
It sounds like you're suffering from the typical grift where contractors overcharge because they don't want to do the work you're asking for. This is common, but it's a problem with contractors. The standard way to build in the US involves illegal immigrants doing poor quality work.
If there is a heat pump rated to -14F and an area has historic lows of -34F it sounds like a bit more than just grifting going on. These people are setting up a system where, if everything works to spec, the customer may well freeze to death one winter.
I don't know much about how heat pump performance degrades, but going in to a life-and-death situation where the main heat system is operating outside rated performance seems like poor planning.
No one is going to install a system that puts you in a life or death situation. Worst case is it uses resistive heating to deal with extreme cold. Systems like that are widespread and still save people money.
Despite being effective at low temperatures when properly sized, with traditional heating as a supplement when needed, they aren't efficient during the coldest days if you have a leaky house. That's where ground source systems clean up.
My main issue with GS is the complexity. Gas furnaces are pretty bulletproof; in my experience they are really reliable and easy to fix (or just replace in a one-day job) if there are issues. Air source is a little more complicated to fix but not too bad. Ground source, though, part of the system is literally buried underground!
The buried part will never need fixing unless you build it on a fault line or Elon Musk tries to dig a hyperloop under your house.
Though solar panels prefer if it isn't too hot. Still, any such inefficiencies are easily outweighed by their low cost. In the Netherlands it is common practice (and advised) to install more Wp than the inverter can handle. Maybe that's also a solution if it gets too hot: make up for the losses by having more panels.
As someone living in the far north, the sun not going above the horizon for months at a time makes solar panels a non-starter without 100% coverage from other sources, which makes them rather expensive unless you externalize the costs to someone else.
Depending on the location, there could be sources of geothermal energy, like Iceland. There are also various prototypical approaches at storing solar energy for the winter months, like molten salt (obviously a bit too spicy outside of industrial settings), or generating hydrogen and storing it in iron ore. The latter approach was recently pioneered by ETH Zürich.
I don't live nearly that far north, and the low sun angle for much of the year, frequent cloud cover, and the couple of weeks a year that a panel will have snow on it add up to it just not being economically viable vs other generation methods. If you DIY the whole thing out of used panels and other low-cost parts then it makes good sense, but that is a huge time sink, so it's kind of the exception. I hate my utility and look into stuff like this every year or so, so it's not like I'm using 2010 numbers for panel prices either.
Have you considered mounting the solar panels vertically, using bifacial panels? That might help with getting the snow off. People have started using them as fences, though you need the space/land for that.
Also, the various times I've seen solar panels discussed in the US, they seem way overpriced, coupled with more expensive options being chosen, e.g. loads of micro inverters where a string inverter would make far more sense economically.
Not affiliated, but e.g. https://www.solar-bouwmarkt.nl/ (use Google Translate), how do those prices compare to what you're seeing locally for panels and string inverters?
Sure. But, globally, you're the exception, not the rule.
Technically this is correct. Look at the small circle near the equator here, which encircles more than half of the world population: https://en.m.wikipedia.org/wiki/Valeriepieris_circle
The other half needs energy too though! And high-latitude areas are known to have dense enough populations and exceedingly high economic productivity, from Sweden to Ireland to New England.
It could be viable to generate biofuels using solar energy and use these for heating. One would get the best of both worlds - carbon neutrality and the possibility to use existing infrastructure. But it requires that the price for biofuels falls below the price for fossil fuels.
>It’s economically viable to place them further north and in cloudier climates now
Which it really isn't, unless we're neglecting to count the need for long-term backup power. There still exists some amount of solar even now, though government subsidies and government-provided peaker plants may have something to do with it.
> Which it really isn't unless we're neglecting to count in the need for long-term backup power.
But that's not the topic, no? It was about being economically viable; it wasn't about fully relying on them. If someone can buy or place a solar panel/plant and it is economically feasible, then it is.
E.g. the EU is finally connecting the various electrical grids together, so that electricity can more easily be exported/imported. Yet another way to deal with the fluctuations.
Connecting the various electrical grids together makes each part dependent on the grid as a whole, which transfers the dependencies and risk to each nation. When supply in the grid is low, everyone rushes to bid on the limited resources and prices spike, leading to high instability in prices. Last time that happened, the collective nations of the EU spent ~€800 billion in subsidies during a few months to bail citizens out, and multiple elections were won or lost based on the willingness to spend subsidies. Following that, grid taxation increased steeply in order to both cover the subsidies and build out new fossil-fueled power plants and fuel supply chains to handle the next time supply drops.
It is not a perfect way to deal with fluctuations, and it was proven beyond doubt that voters will not accept power prices running unchecked in an EU-connected grid.
> EU is finally connecting the various electrical grids together
European grids have been connected for decades, way before new renewables were a thing.
Agreed, but not in the amount of capacity that they're aiming for now, due to increased (expected) volatility. See e.g. https://energy.ec.europa.eu/topics/infrastructure/electricit... which sets a 2030 target of 15%.
The irony of interconnected grids to "reduce volatility" is that any place that previously had a stable grid with low prices are suddenly seeing the exported volatility from the German energy suicide experiment crossing the borders to rape their wallets
> If someone can buy or place a solar panels/plant and it is economically feasible, then it is.
"then it is" what? economically viable? He still needs 100% backup from other sources, so you have to factor that cost in. Possible? Yeah, it's possible, but that wasn't the question.
> "then it is" what? economically viable? He still needs 100% backup from other sources, so you have to factor that cost in.
Nope, you do not need to factor that in. If I put solar panels on my house, I check if it's worth the cost/investment. I am not intending to go fully off-grid; that is not the aim. Similarly for a solar plant. If it's economically viable, it means there's a good return on investment.
> He still needs 100% backup from other sources, so you have to factor that cost in.
Again, if you put solar panels up or if you have a solar plant it does not make any sense to factor in such costs to make a business case/economical sense. You're adding complications that aren't there.
I'm just not ignoring externalities. You can't do your thing _unless_ the other thing is being taken care of, you have a hard dependency on it, so factoring in the cost of that is the right thing to do. You'll pay that cost, too, be it via your net-hookup fees, or taxes that subsidize it. If you're lucky, others will pay more than you do and you can make more money. That's economically viable at a small scale but does not scale far, because you quickly run out of other people who foot the bill.
Much like tax havens do not scale, because they don't produce anything of value, their concept does not work without other countries where the value is being created.
Globally it's the US that is the exception. Solar falls apart elsewhere, be it due to solar irradiance, population density, prevalence of flying debris, or whatever it might be.
Only in the US, Australia, and a few other places does it make sense to just put up some panels for free energy. Incidentally, the same sometimes applies to EV arguments.
The next option for you would be to take the panels that you would have used and put them in an area where they work better.
Use the extra electricity to power machines that hydrogenate CO2 extracted from the atmosphere and turn it into methane.
Methane generated this way (being the principal component of natural gas) would allow us to deliver nearly carbon-neutral fuel to you and people in your biome, and everyone wins.
Well, as long as you have vast amounts of storage capacity + overprovisioning, or an alternative source of on-demand electricity, the cost of which I never see included when comparing to on-demand energy sources like nuclear or fossil fuels.
> Well as long as you have vast amount of storage capacity + overprovision
Why would that need to be the case? The full needs do have to be taken into account, but that's a TCO calculation, not something to add to the solar cost.
Nuclear energy in Europe tends to be way more expensive than initially budgeted, resulting in a crazy difference in the kWh cost vs solar/wind. And there are more ways to store "electricity" than just batteries.
Because what is implicit in saying that solar (or wind) is cheaper than nuclear per W produced is that it is a viable alternative to nuclear. But to be an alternative you need to be able to meet the demand whether there is light/wind or not (with a near zero probability of a blackout). And if you need to do that then you need either a bunch of alternative on demand sources of energy (UK is using LNG) or some big storage capacity (+overprovision to fill them when it is sunny/windy). Nuclear doesn't need that. If you don't factor those costs you are comparing apples and oranges.
> Because what is implicit in saying that solar (or wind) is cheaper than nuclear per W produced is that it is a viable alternative to nuclear.
I do not see that implication.
Solar and wind are significantly cheaper than nuclear. That doesn't mean they are a replacement; it means that, given the large cost difference (and how much faster they can be built), there's a strong case for solving their obvious drawbacks.
A country needs to figure out its TCO and its energy mix. Which means, yeah, the volatility needs to be solved, and that means there needs to be more than solar/wind. But at the same time, nobody wants to invest in nuclear; it isn't commercially viable, and it is important not to have too-high electricity prices. Wikipedia has quite a section on the newest nuclear power station in the UK and the kWh cost for consumers at https://en.wikipedia.org/wiki/Hinkley_Point_C_nuclear_power_.... The cost was initially estimated at £24 per MWh but could now be £92.50.
Nuclear power stations have crazy cost overruns. The initial estimations are far too low (except maybe in China because they have recent experience).
> And if you need to do that then you need either a bunch of alternative on demand sources of energy (UK is using LNG) or some big storage capacity (+overprovision to fill them when it is sunny/windy). Nuclear doesn't need that.
Nuclear doesn't need a backup? UK built nuclear and has LNG. It's not so black and white as you stated.
I believe nuclear isn't an on-demand energy source. It's great for base load, but it doesn't scale up or down quickly.
That was true in the 70s. I think anything built in Europe since the 80s can do load following, and France does that every day. I think by regulation new plants must be able to adapt their production by something like 5% per 10 min (or something of that order of magnitude), well within the range you need to meet intra-day variations.
Now whether it is optimal economically is another question. If you have some sources of energy in your grid that have a per-usage cost (e.g. fossil fuels), you should rather switch those off than nuclear, which costs the same whether you use it or not. But if your grid is almost all nuclear (e.g. France), you do load following.
That's news to me. How do these newer reactors do it? I thought nuclear reactors were slow to ramp up. Are they overproducing thermal energy relative to the plant's electricity output?
The newer ones can. Though that's partly why I dislike it: there's always yet another new technological solution for the various drawbacks, often unproven (e.g. SMRs), and that often results in crazy cost overruns.
Nuclear has a crazy high fixed cost so ideally you still run it continuously.
Steam turbines are very good at what they do, so I don't think that's the problem. Turbines are well-understood technology, so if they're the major impediment to economic viability then everything else must be working amazingly.
I do think economic viability will be a major problem though. The fusion hype crowd focus on Q>1 being their major milestone but that's still a long way away from it being profitable to operate a fusion plant.
IIRC, a cascade of turbines, going from high-temperature water steam to basically room-temperature ammonia, can reach a total efficiency exceeding 50%.
Of course, ammonia is chemically active, and water at 700K is also chemically active, so the turbines, as usual, require special metallurgy and regular maintenance.
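As a sanity check on the >50% figure, here's the idealized Carnot limit for a cascade spanning roughly the 700 K steam mentioned above down to room temperature; real combined cycles land well below this bound, so treat it only as an upper limit:

    # Idealized Carnot limit for a heat engine (or cascade of engines)
    # operating between a hot and a cold reservoir.
    def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
        return 1.0 - t_cold_k / t_hot_k

    print(f"Carnot limit: {carnot_efficiency(700.0, 300.0):.0%}")  # ~57% for 700 K -> 300 K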
Turbines are indeed very well understood and mature technology, and even where the fuel is almost free and the reactor (a coal burner) is cheap, steam turbines are too expensive to compete with solar electricity in most places now.
How many more trillion $$$ are going to be pissed away on this?
I'd say, shelve the idea for 50 to 75 years and then look at it again.
In the mean time, I think we could make major headway on a global electric grid that connects whatever part of the planet is sunny with all the rest. Add to this some major storage capacity, and I think we could resolve almost all of our energy problems with the money that would be wasted on further fusion efforts.
> In sunny places (and I fully acknowledge that's not all of the world) it's still going to be hard to beat sticking bits of special glass out in a field and connecting wires to it.
Why not both? Can't we utilize the special glass to fetch energy from a man-made fusion reactor, in similar ways as we use it to fetch energy from the natural fusion reactor in space?
Treating this as a serious question: because the big reactor in the sky has a built-in gravity well and a very big sphere of vacuum that works as a free containment system while still letting just enough energy out to be useful. Also we don’t need to refill it or handle the spent fuel.
D + T or D + D fusion reactions produce neutrons and hard gamma radiation. The special glass cannot capture the energy of these directly.
The natural fusion reactor has a blanket that is literally thousands of kilometers thick. It effectively converts much of this into longer-wavelength electromagnetic radiation, from ultraviolet to infrared, with quite some visible light. That's what the special glass can make use of.
Also, the highly radioactive blanket is kept hundreds of millions km away from consumers, which helps alleviate the problem of disposing of radioactive waste. With a planet-based fusion reactor, we'd have to think where to put thousands and thousands of tons of slightly radioactive concrete which would realistically serve as a blanket.
I'm having trouble understanding what's actually been accomplished here. The article provides a good overview of Tokamak vs Stellarator, but seems to jump back and forth between proclaiming this as an innovative breakthrough and saying it's just a framework to test ideas.
> In terms of its ability to confine particles, Muse is two orders of magnitude better than any stellarator previously built
Is it? It doesn't seem as if they have reached first plasma or have plans to do so anytime soon. Using electromagnets to not only confine but also to control the plasma is a big selling point of the stellarator design, and they don't seem to address this.
This seems really cool, and I love the idea of lower-cost fusion. (Or even just functional fusion.) There are about a dozen companies making real progress in fusion, but I can't quite figure out what this team has actually accomplished.
Seems like the premise is that building these small experimental stellarators is a major cost in doing fusion research and therefore if we can bring the cost of these down from billions to less than a million, more teams can do more research faster, even if this specific design never generates any economic power. I have no idea if this premise is true or not -- I'm just a layman who read the article.
“PPPL researchers say their simpler machine demonstrates a way to build stellarators far more cheaply and quickly, allowing researchers to easily test new concepts for future fusion power plants.”
This quote reminded me of SpaceX's approach to engineering and why they have leapfrogged past Boeing. Instead of spending 10-20 years and billions on a single design, SpaceX iterates.
I want to like this, but at break-even temperatures these things just melt. How about making Shipstones from nuclear waste? You could make car-engine-sized batteries that would effectively last for years if properly shielded and would provide essentially free power to run a dozen homes.
How can one estimate the progress of a given design? For example, in the photo there are no walls protecting the people in the lab from neutrons. Even the fusor 60 years ago, running at 100 kV, already generated a neutron flux requiring such walls.
There's absolutely no way to get fusion with permanent magnets and copper coils.
This is a plasma test stand, and because it is so simple, you can potentially iterate quickly through different field configurations. This is at least a little bit useful, because a full stellarator is extremely complicated to take apart, so you can't just change the coils around if you want to change something.
People don't understand the fundamental problem of fusion. It's a problem of energy loss. Of enormous energy losses.
Roughly speaking, energy can be mechanical (for particles) or radiative (for photons). The first is proportional to the temperature (the famous nRT) and the second is proportional to the fourth power of the temperature. The constant of proportionality is very small, and at ordinary temperatures we generally don't think about it much. But at millions of kelvin, it starts to dominate all considerations.
Heat always moves from hot to cold. In the case of particles the heat flow is proportional to the difference in temperature, and in the case of radiation to the difference of the fourth powers of the temperatures. But heat also travels from particles to photons and vice versa. It doesn't matter how.
The problem with fusion is now this. Suppose that you have a super-duper device, let's call it a brompillator. It brings an amount of deuterium-tritium mix to the required temperature, let's say 10 million kelvin. Now that volume of plasma is surrounded by cold stuff. You can imagine that you have some mirrors, or magnetic fields, or some magic stuff, but the cold hard fact is that that plasma will want to radiate to the exterior, and the flow of heat will be proportional to the surface area times the fourth power of the temperature difference. Since for all practical purposes the outer temperature is zero, we are talking about the fourth power of 10 million kelvin. Now that constant of proportionality is very small; it is called the Stefan-Boltzmann constant and has a value of about 5.7×10^-8 (call it 10^-7) W m^-2 K^-4. Let's say the surface area is 1 square meter. So the heat loss happens at a rate of roughly 10^-7 times (10^7)^4 = 10^21 watts. That is 10^12 gigawatts. One GW is the output of a decent-sized nuclear power plant.
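A quick way to check that arithmetic yourself (this sketch assumes an ideal blackbody with emissivity 1, which, as replies below point out, a real fusion plasma is not):

    # Back-of-the-envelope: ideal blackbody radiation from a 1 m^2 surface at 10 MK.
    # Assumes emissivity = 1, which vastly overstates losses for an optically thin
    # plasma, but shows the scale involved.
    SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
    T = 10e6           # plasma temperature, K
    area = 1.0         # radiating surface area, m^2

    power = SIGMA * area * T**4
    print(f"Radiated power: {power:.2e} W")   # ~5.7e20 W, i.e. on the order of 10^21 W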
Of course, you can try to shield that plasma, but that shield has to be 99.99999....9% effective, where the number of 9s needs to be about 15 or so.
That is the immensity of the challenge that nobody is willing to tell you about.
How was this overcome in the case of the thermonuclear bomb? People imagine that once you have a fission bomb, you just put some deuterium-tritium mix next to it, and voila, you have a fusion bomb. No. The world's greatest minds worked on this issue for about 5 years. The solution was something like this: if you first compress the volume of fusion fuel significantly, then the heat losses are much smaller (remember, they are proportional to the area, and that's proportional to the square of the radius). They will still be tremendous, but you don't even aim to keep the reaction going for a long time. The duration of the fusion reaction in a thermonuclear bomb is still classified information, but public sources put it at the order of 1 microsecond. The heat losses are still tremendous, but for a short moment the heat gains from the fusion reaction are even greater, so ignition is achieved.
In the NIF experiment that achieved more than breakeven 2 years ago, the fusion lasted less than 10 nanoseconds [1].
If someone thinks the brompillator will achieve fusion and that will run for years, or even hours, or seconds, they don't understand the fundamental problem. Unfortunately, nobody is willing to ask hard questions about this, not even Sabine Hossenfelder.
I don't disagree with this statement; fusion researchers do care about energy loss when they're evaluating fusion reactor feasibility. They talk more about neutron loss, bremsstrahlung radiation and synchrotron radiation than about blackbody radiation. A paper on this: https://arxiv.org/pdf/2104.06251
I tried to search for more about plasma energy losses, and it becomes extremely complicated, with an insane number of equations. One thing that I can get is that you can't model fusion reactor plasmas as a blackbody radiator, because plasma is that complicated. If plasma were simpler, we would either have fusion already or we would have given up on fusion research a long time ago.
> because fusion reactor plasma is optically thin, it doesn't radiate following blackbody radiation law.
It still follows the laws of blackbody radiation - it's just that the emissivity of the body is part of the equation.
A classical blackbody has an emissivity of 1. This means not only that it absorbs radiation really well, it also means it's really good at radiating energy away.
Things that have low emissivity (all things transparent and all things reflective) are also really bad at radiating energy away. This is used for solar-thermal collectors today: you make them from an engineered material that is completely black in the visible range, but highly reflective in the infra-red. That way, they absorb sunlight and get hot, but they don't lose that heat because they can't radiate it away as heat radiation.
And yes, fusion plasma is extremely, extremely transparent. Not only is it extremely thin (ITER or Wendelstein 7-X contain only 1-2g of hydrogen during operation), hydrogen is also extremely bad at absorbing gamma rays (blackbody radiation at 1e8 K).
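To make the emissivity point concrete, here is a gray-body version of the same back-of-the-envelope calculation; the 0.05 emissivity below is a made-up illustrative number, not a measured plasma value:

    # Gray-body radiation: P = emissivity * sigma * A * T^4.
    # A low-emissivity (transparent or reflective) body radiates proportionally less.
    SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

    def radiated_power(emissivity: float, area_m2: float, temp_k: float) -> float:
        return emissivity * SIGMA * area_m2 * temp_k**4

    blackbody = radiated_power(1.0, 1.0, 10e6)    # ideal blackbody
    gray_body = radiated_power(0.05, 1.0, 10e6)   # hypothetical low-emissivity body
    print(f"{blackbody:.2e} W vs {gray_body:.2e} W")  # the latter is 20x smaller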
Inertial confinement fusion like the NIF is not intended to run continuously, so the 2ns duration is irrelevant. The surface area for the calculation is not the surface area of the machine, but the surface area of the volume in which the fusion is occurring which could be very much smaller than that.
The heat loss is practically limited by the mass of hydrogen fusing in the machine. To have a continuous heat flux of 10^21 watts you would need to fuse ~4*10^5 kg of hydrogen every second. Which clearly these machines are not intended to do.
> Inertial confinement fusion like the NIF is not intended to run continuously, so the 2ns duration is irrelevant.
Indeed. I do think ICF has a future. The issue I described applies to machines that attempt to achieve sustained fusion. Pulsed fusion is ok.
> The heat loss is practically limited by the mass of hydrogen fusing in the machine.
Yes, but it goes the other way too. If the heat loss is too high you can't sustain fusion, because you can't stay at the required temperature for long enough.
> People don't understand the fundamental problem of fusion. It's a problem of energy loss. Of enormous energy losses.
I'm not sure that's even true, because if you manage to crack that, you still have the problem that your sustainable reaction is pumping out most of its energy in the form of very fast neutrons, which are (a) very hard to harvest energy from and (b) extremely bad for people and materials if you don't. You could have a self-sustaining reaction that you can't actually use!
Aneutronic fusion requires much higher temperatures, and thus suffers much higher bremsstrahlung losses. You can sniff out BS quickly with anyone claiming a steady-state aneutronic reactor: a working aneutronic design would necessarily be pulsed. Not that it can't be done, but you'd first need to pass through D-T and D-D performance metrics and then go another order of magnitude. No one's done that yet.
High energy neutrons leave the system. They cause damage to the container (ie neutron embrittlement) and that's a separate problem. But the real problem is the energy loss from the system.
Charged particles can be contained. Personally I think there are limits to even that because a high-temperature plasma is turbulent [1]. Containing that is just a hugely difficult problem.
I'm not convinced that nuclear fusion will ever be commercially viable.
Also, the fusion reactors will inevitably have poor volumetric power density, due to limits on power/area through the first wall and the square-cube law.
Engineering studies of stellarators found they tend to be larger and have worse economics than tokamaks.
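As a rough illustration of the square-cube point: if the extractable power is capped by an allowable power flux through the first wall, power scales with area while volume scales faster, so volumetric power density drops as machines get bigger. The wall-loading number below is an assumed ballpark for the sketch, not a figure from this thread:

    # If total power is limited by first-wall loading (W per m^2 of wall), then
    # P ~ R^2 while volume ~ R^3, so power density P/V ~ 1/R: bigger machines
    # deliver less power per unit volume.
    import math

    WALL_LOADING_W_PER_M2 = 2e6   # assumed allowable wall loading (ballpark)

    def power_density(radius_m: float) -> float:
        area = 4 * math.pi * radius_m**2           # idealized spherical "first wall"
        volume = (4 / 3) * math.pi * radius_m**3
        return WALL_LOADING_W_PER_M2 * area / volume   # = 3 * loading / R

    for r in (2.0, 4.0, 8.0):
        print(f"R = {r} m -> {power_density(r) / 1e6:.2f} MW/m^3")  # halves per doubling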
We need a liquid with high heat capacity, large hydrogen content (for high neutron interaction rate) and a solid 300 year engineering history in heat engine applications. Better if it is nontoxic and environmentally friendly as well.
I'm a physics layman, and I'm having some trouble with uniting the content of your comment with the fact that existing magnetic confinement experiments have reported maintaining a plasma at the right temperature for longer times (not with fusion, but with microwave heating, and with the power of those heaters in the 10MW range).
Have I understood the consequences of those reports wrong? Does the heat loss you talk about only occur with fusion? (And if so, is it even a problem if the conditions for fusion to occur can be created by external heating this "easily"?)
In order to protect astronauts from decompression, the hull of a spacecraft has to be insanely good at stopping gas particles. Not 99.5% good, but like 99.9999999…% with 20 nines! That's very good.
But a thin metal sheet has no trouble doing this, as demonstrated by the Apollo lunar lander.
Some things are just not as hard as they sound. Magnetic confinement works very well. It easily achieves the necessary 9’s.
It’s just hard to keep it stable at millions of degrees, but that’s a different problem.
I'm not sure, but we can try to figure out what is going on. And by the way, I'm a physics layman too; I just read a lot of books about fission and a few about fusion, as it happens to be my hobby. When I'm bored, the bookmarks that I browse are [1] and [2].
So, when reports state that a certain temperature was achieved and sustained for a certain period of time, what are they actually saying? We could go and find an article and get into some details, but I imagine they say that somewhere in the plasma that temperature was reached and sustained. But it is quite likely that that region is quite microscopic, maybe a very, very thin inner torus inside a larger torus. There is a gradient of temperature from the region where the announced temperature happens to the walls of the device. But one way or another, that thin inner region can't have a surface area of anything close to 1 square meter. To keep the radiative loss down to 1 GW of power, the area can be at most about 10^-12 square meters, and to get down to 10 MW you need 10^-14 m2. That's about the surface area of a torus of (circular) length 3 m and diameter 1 femtometer. 1 femtometer is roughly the size of a nucleus of deuterium or tritium, so in principle this is the minimum diameter of a torus where you can talk about fusion.
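Running the same Stefan-Boltzmann estimate in reverse (again assuming an ideal blackbody, so this is only an order-of-magnitude sketch):

    # Inverse of the Stefan-Boltzmann estimate: how much ideal-blackbody surface
    # area at 10 MK would radiate a given power?
    SIGMA = 5.670e-8   # W m^-2 K^-4
    T = 10e6           # K

    for power_w in (1e9, 1e7):                 # 1 GW and 10 MW
        area = power_w / (SIGMA * T**4)
        print(f"{power_w:.0e} W -> {area:.1e} m^2")
    # ~1.8e-12 m^2 for 1 GW and ~1.8e-14 m^2 for 10 MW, matching the
    # order-of-magnitude figures above.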
Just wanted to say thank you for this comment, fascinating and perfect example of the beauty of HN. Relatively fresh off The Making of the Atomic Bomb and while fusion was not at all a focus, this (incomplete) impression is exactly what I came away with.
Is there any chance you'd recommend any books related to these topics? The walk through decades of revelations in physics was the most enjoyable aspect of that book, I'd love to continue building on that story.
Well if you liked "The Making of the Atomic Bomb" I can strongly recommend the follow up "Dark Sun" that covers both the Soviet atomic bomb program and the development of the H-bomb by the US.
Not sure how I missed that Rhodes wrote a continuation! The clarity of writing about physics for a layperson has been wonderful, glad to find there's more. Much appreciated.
One of the points I took away from that book being that "H-bomb" weapon design is as much about fission as fusion - most designs being fission-fusion-fission with most energy coming from the latter fission stage.
That book is absolutely great. As the sibling comment mentions, Dark Sun is also great.
Here are some more books I read on this topic. One was written by someone who was very close to the Ulam and Teller inner circle: "Building the H Bomb" by Kenneth Ford. Another is "Sun in a Bottle: The Strange History of Fusion and the Science of Wishful Thinking" by Charles Seife. And finally, you can't go wrong with any book written by James Mahaffey.
Cool.
But even if you have a working stellarator that's a very long way from an economically viable energy source. You've still got to a) figure out how to cheaply convert the released energy into electricity (and the baseline way of doing that in D-T fusion is...a steam turbine), and b) figure out materials that can survive the radiation bombardment for a sufficiently long time.
In sunny places (and I fully acknowledge that's not all of the world) it's still going to be hard to beat sticking bits of special glass out in a field and connecting wires to it.
But we should sure as heck keep tinkering away at it!
I don't think the point of this project is to be closer to economic viability, but to demonstrate an approach that would lead to faster economic viability due to allowing faster iteration and evaluation of small scale experimental designs.
In HN terms they are demonstrating a significantly faster REPL by keeping the project small and minimising use of esoteric or highly bespoke components.
It's the closest you can get to building your own stellarator by walking into radioshack. I think it's a pretty cool idea.
Yep, sure. And that's great.
Not sure why steam engine is written like it's some crazy reveal. It's the accepted way of doing it in most other fossile fuel plants, to the point that there's a saying for it - someone, somewhere X is boiling water so you can boil water. There are other designs that have been proposed, but I'm not aware of anything in production that takes heat and turn it into electricity at powerplant scale. Eg RTGs exist, but only if you need a computer power supply's worth of power for decades.
Perhaps because the steam part alone, even if powered by pixie dust and magic, can't compete on price with solar, and probably solar and batteries either (varies with latitude and time but the cost is going down and the latitude is getting wider).
I think OpenAI is investing into a fusion design that avoids steam for exactly this reason, so it's not just an anti-nuclear talking point.
Newer solar panels don’t need full sun to function. It’s economically viable to place them further north and in cloudier climates now. So the area where these are alternatives are viable may be shrinking faster than you would expect.
Indeed, though in colder climates you do have the problem that peak electricity demand is precisely when you get minimum solar production.
But in sunnier, warmer parts of the world (which notably includes India, Pakistan, Bangladesh, Indonesia, Nigeria, Egypt, Ethiopia, Iran, Mexico, and Brazil, amongst others), over the next few decades it's hard to see anything much competing against solar and batteries for the bulk of energy usage.
That isn’t true yet, because most cold places don’t use electricity for heating; in Boston for example the norm is oil or gas furnaces. Europe typically uses gas for heating, as we all learned during the Ukraine war’s beginning. When I lived in the Boston area, our electricity use, peaked in the summer during air-conditioning season. That is, until we got a heat pump, then our electricity spiked like crazy in the winter. but we are long way from having the majority of homeowners using heat pumps for their primary heat source. (Also heat pumps are kinda awful compared to furnaces in my experience so we might never get there).
>heat pumps are kinda awful
Heat pumps work great under the right circumstances. In cold climates like Boston and north of it, ideally you want ground source, which means digging deep (or wide) holes to flow water through instead of just pulling heat from outside air. Alternatively, (or additionally, ideally) they're best used in homes with very tight air envelopes, good insulation, and heat/energy recovery ventilation (HRV/ERV) systems. Installed in radiant heating systems in well-built houses, heat pumps are fantastic and you'll be comfortable all year for pennies on the dollar.
The reality is that most (US) construction, especially older, is just terrible in terms of air seal and insulation. Couple that with a potentially undersized air-source heat pump, which gets inefficient as the outside temp gets near the low end of its operating range, and you will not have a great time. But even then they are a good way to supplement gas heating, so you can limit furnace use to the coldest weeks of the year and do the environment a favor, as well as cool in the summer.
The issue with supplementing with gas is that you still need a gas furnace. And if you have a gas furnace, why spend $10k+ on a heat pump?
You shouldn't spend that much. If you need an air conditioning system, making it a bidirectional heat pump is negligible unless people are trying to rip you off. (They are)
Are you one of the grifters or just confused?
Comparing a cold-climate AC to a cold climate heat pump shows how confused YOU are.
ACs in cold climates are cheap. Cooling a house down 20 degrees doesnt take a lot of BTUs. Plus they’re just cheaper per-BTU and a typical northeast homeowner is going to be comfortable installing a cheap brand because it’s not life-or-death if your AC goes out.
When you start talking about having a heat pump being the primary heat source in a cold climate home, that’s a different ball game. First you need WAY more BTUs to heat a house in winter lows — even NYC hits single-digits most winters, that’s a 60 degree temperature differential. And you need to be prepared for the worst temperatures which basically necessitates a top-end brand like Mitsubishi which is rated down to historic-ish lows.
Not to mention that many homes don’t have central ducts, so converting to a heat pump means ductwork or mini splits..
All in all it’s not really a negligible thing.
No, the difference in cost between central air conditioning (a heat pump that works in one direction) and a heat pump that works in both directions is negligable unless you're being ripped off.
Just saying something doesn’t make it true.
Here’s a concrete example that took 5 minutes to find. Two condensers from the same company, same seller, same SEER, same series, same tonnage, except the second does heat. The heat pump is 29% more expensive.
2 Ton 14.3 SEER2 Trane Air Conditioner Condenser - RT Series. $1615
2 Ton 14.3 SEER2 Trane Heat Pump Condenser - RT Series. $2090
https://hvacdirect.com/trane-a4ac4024d1-2-ton-14-3-seer2-air...
https://hvacdirect.com/trane-a4hp4024d1-2-ton-14-3-seer2-hea...
That very quickly pays for itself.
Goalposts movin’ like crazy
Goalposts haven't moved. Refer to my first post and stop spreading FUD about heat pumps.
You’re spreading the FUD
There's a green trend in much of central-north europe to use electric heating, mainly by promoting air heat pumps.
And despite the handwringing, air source heat pumps can work fine in pretty cold places (eg Norway).
You can, right now, buy air source heat pumps rated to -28C (-14F). While not enough for record lows in, say, Chicago or Toronto, it's plenty for places like New York, Seattle, and Vancouver.
Rated to -14F doesn’t really bring a lot of comfort. NYC has reached that temperature; NJ has reached -34. In a typical winter you might stay above zero every minute of every day. But when that big arctic blast comes through and your heat pump efficiency tanks, right at the same time your house is bleeding its heat faster than ever, that could be pretty dangerous. So, every heat pump installer recommends a having a backup (at least the dozen I’ve talked to in the Northeast US). it’s hard to convince regular people that heat pumps are the future when you’re also told to have a backup.
This year I installed a heat pump with a 10kW auxiliary heater for my home in southern Ontario. With an appropriate air handler, both the heat pump and auxiliary heater can run at the same time. The setup seems fine to me.
If anything having a backup heat source makes me feel secure. A lot of things can go wrong with a gas furnace; some things can go wrong with a heat pump; but 10kW of electric restive heat is dead simple and just by itself it can provide 34000 BTU/h, which is 60% of the output of my old gas furnace. I don't think the duty cycle of my gas furnace ever exceeded 60% running time.
The auxiliary heater consists of two 5kW heaters, so even if one fails, there is still the other.
Do you worry about losing electricity though? A bad blizzard that knocks out power could be dangerous.
That is a worry. Of course a power outage would also knock out my old gas furnace too. Without the blower running, etc. the gas furnace wouldn't run.
True, but you could always have a battery back up for the blower and such.
Do you have a wood burning back up? At my house in Massachusetts, we had a wood-burning stove that we could always fire up in a pinch. It never came to that, but it felt good knowing that we can always just burn a super hot fire and not freeze to death.
While I could have had a battery backup for the blower, I never did, so I guess I'm not missing out. :)
I live in an urban environment, so I'm not too worried about losing power for an extended period of time, and if I do I would probably just leave temporarily.
In my rural living fantasy I have a Ford F150 Lightning which I use as a "portable" backup battery that can be recharged by driving it to a Level 3 charging station. That and either a pellet stove and/or a Masonry Heater fireplace that sits opposite to a large equator (south) facing window.
It sounds like you're suffering from the typical grift where contractors over-charge because they don't want to do the work you're asking for. This is common but its a problem with contractors. The standard way to build in the US involves illegal immigrants doing poor quality work.
If there is a heat pump rated to -14F and an area has historic lows of -34F it sounds like a bit more than just grifting going on. These people are setting up a system where, if everything works to spec, the customer may well freeze to death one winter.
I don't know much about how heat pump performance degrades, but going in to a life-and-death situation where the main heat system is operating outside rated performance seems like poor planning.
No one is going to install a system that puts you in a life or death situation. Worst case is it uses resistive heating to deal with extreme cold. Systems like that are widespread and still save people money.
Sounds like a grift
Despite being effective at low temperatures when properly sized, and with traditional heating when needed, they aren't efficient during the coldest days when you have a leaky house. That's where ground source systems clean up.
My main issue with GS is the complexity. gas furnaces are pretty bulletproof; in my experience they are really reliable and east to fix (or just replace in a one-day job) if there are issues. Air source is a little more complicated to fix but not too bad. Ground source though, part of the system is literally buried underground!
The buried part will never need fixing unless you build it on a fault line or elon musk tries to dig a hyperloop under your house.
> But in sunnier, warmer parts of the world
Though solar panels prefer if it isn't too hot. Still, any such inefficiencies are easily outweighed by their low cost. In Netherlands it is common to install practice (and advised) more Wp than the inverter can handle. Maybe also a solution if it gets too hot, make it up by having more panels.
As someone living in the further north, the issue of sun not going above the horizon for months at a time makes them a non-starter without 100% coverage from other sources, making them rather expensive without externalizing the costs to someone else
Depending on the location, there could be sources of geothermal energy, like Iceland. There are also various prototypical approaches at storing solar energy for the winter months, like molten salt (obviously a bit too spicy outside of industrial settings), or generating hydrogen and storing it in iron ore. The latter approach was recently pioneered by ETH Zürich.
https://ethz.ch/en/news-and-events/eth-news/news/2024/08/iro...
Sure. But, globally, you're the exception, not the rule.
I don't live nearly that far north and the low sun angle for much of the year frequent cloud cover and couple weeks a year that a panel will have snow on it adds up to it just not being economically viable vs other generation methods. If you DIY the whole thing out of used panels and other low cost parts then it makes good sense but that is a huge time sink but that's kind of the exception. I hate my utility and look into stuff like this every year or so so it's not like I'm using 2010 numbers for panel prices either.
Have you also considered putting the solar panels straight up? This when using bifacial solar panels. Might help with getting the snow off. People started using them as fences, though you need the space/land for that.
Also, the various times I noticed talk about solar panels in the US it seems it is way overpriced, coupled with more expensive options being chosen. E.g. loads of micro inverters where a string inverter would make way more sense (economical).
Not affiliated, but e.g. https://www.solar-bouwmarkt.nl/ (use Google Translate), how do those prices compare to what you're seeing locally for panels and string inverters?
Technically this is correct. Look at the small circle near the equator here which encircles more than half of world population: https://en.m.wikipedia.org/wiki/Valeriepieris_circle
The other half needs energy too though! And high-latitude areas are known to have dense enough populations and exceedingly high economic productivity, from Sweden to Ireland to New England.
It could be viable to generate biofuels using solar energy and use these for heating. One would get the best of both worlds - carbon neutrality and the possibility to use existing infrastructure. But it requires that the price for biofuels falls below the price for fossil fuels.
I was commenting on the parent
>It’s economically viable to place them further north and in cloudier climates now
Which it really isn't unless we're neglecting to count in the need for long-term backup power. There still exists some amount of solar even now, though government subsidies and government provided peaker plants may have something to do with it
> Which it really isn't unless we're neglecting to count in the need for long-term backup power.
But that's not what the topic, no? It was about economically viable. It wasn't about fully relying on them. If someone can buy or place a solar panels/plant and it is economically feasible, then it is.
E.g. EU is finally connecting the various electrical grids together. So that electricity can more easily be exported/imported. Yet another way to deal with the fluctuations.
Connecting the various electrical grids together makes each part dependent on the grid as a whole, which transfers the dependencies and risk to each nation. When supply in the grid is low, everyone rushes to bid on the limited resources and prices spikes, leading to high instability in prices. Last time that happened the collective nations of EU spent ~800 billions in subsidies during a few months to bail citizens out, and multiple elections was won or lost based on the willingness to spend subsidies. Following that, grid taxation increased steeply in order to both cover the subsidies and build out new fossil fueled power plants and fuel supply chains in order to handle the next time supply drop.
It is not a perfect way to deal with fluctuations, and it was proven beyond doubt that voters will not accept power prices to run unchecked in a EU connected grid.
> EU is finally connecting the various electrical grids together
European grids have been connected for decades, way before new renewables were a thing.
> European grids have been connected for decades, way before new renewables were a thing.
Agreed, but not in the amount of capacity that they're aiming for now due to increased (expected) volatility. See e.g. https://energy.ec.europa.eu/topics/infrastructure/electricit... which sets a 2030 target of 15%.
The irony of interconnected grids to "reduce volatility" is that any place that previously had a stable grid with low prices are suddenly seeing the exported volatility from the German energy suicide experiment crossing the borders to rape their wallets
> If someone can buy or place a solar panels/plant and it is economically feasible, then it is.
"then it is" what? economically viable? He still needs 100% backup from other sources, so you have to factor that cost in. Possible? Yeah, it's possible, but that wasn't the question.
> "then it is" what? economically viable? He still needs 100% backup from other sources, so you have to factor that cost in.
Nope, you do not need to factor that in. If I put solar panels on my house, I check if it's worth the cost/investment. I am not intending to go fully off-grid, that is not the aim. Similarly for a solar plant. If it's economically viable it means if there's a good return on investment.
> He still needs 100% backup from other sources, so you have to factor that cost in.
Again, if you put solar panels up or if you have a solar plant it does not make any sense to factor in such costs to make a business case/economical sense. You're adding complications that aren't there.
> You're adding complications that aren't there.
I'm just not ignoring externalities. You can't do your thing _unless_ the other thing is being taken care of, you have a hard dependency on it, so factoring in the cost of that is the right thing to do. You'll pay that cost, too, be it via your net-hookup fees, or taxes that subsidize it. If you're lucky, others will pay more than you do and you can make more money. That's economically viable at a small scale but does not scale far, because you quickly run out of other people who foot the bill.
Much like tax havens do not scale, because they don't produce anything of value, their concept does not work without other countries where the value is being created.
Globally it's the US that is the exception. Solars fall apart elsewhere be it solar irradiance or population density or prevalence of flying debris or whatever it might be.
Only in the US, Australia, and few other places it makes sense to just put up some panels for free energy. Incidentally also sometimes apply to EV arguments.
The next option for you would be to take the panels that you would have used and put them in an area where they work better.
Use the extra electricity to power machines that hydrogenate CO2 extracted from the atmosphere and turn it into methane.
Methane generated this way, (being the principal content of natural gas), would allow us to deliver nearly carbon-free fossil fuels to you and people in your biome, and everyone wins.
As someone living north of the arctic circle.. there are other issues than just clouds.
Well as long as you have vast amount of storage capacity + overprovision, or an alternative source of on-demand electricity, the cost of which I never see included when comparing to on-demand energy sources like nuclear of fossil fuels.
> Well as long as you have vast amount of storage capacity + overprovision
Why would that need to be factored in here? The full needs do have to be taken into account, but that's a TCO calculation, not something to add to the solar cost.
Nuclear energy in Europe tends to be way more expensive than initially budgeted. Resulting in a crazy difference in the kWh cost vs solar/wind. And there's more ways to store "electricity" than just batteries.
Because what is implicit in saying that solar (or wind) is cheaper than nuclear per W produced is that it is a viable alternative to nuclear. But to be an alternative you need to be able to meet the demand whether there is light/wind or not (with a near zero probability of a blackout). And if you need to do that then you need either a bunch of alternative on demand sources of energy (UK is using LNG) or some big storage capacity (+overprovision to fill them when it is sunny/windy). Nuclear doesn't need that. If you don't factor those costs you are comparing apples and oranges.
> Because what is implicit in saying that solar (or wind) is cheaper than nuclear per W produced is that it is a viable alternative to nuclear.
I do not see that implication.
Solar and wind are significantly cheaper than nuclear. That doesn't mean they are a replacement. It does mean that, given the large cost difference plus how quickly they can be built, there is a lot of headroom for solving their obvious drawbacks.
A country needs to figure out its TCO and the energy mix. Which means, yeah, the volatility needs to be solved, which means there needs to be more than solar/wind. But at the same time, nobody wants to invest in nuclear; it isn't commercially viable, and it is important not to have too high electricity prices. Wikipedia has quite a section on the newest nuclear power station in the UK and the kWh cost for consumers at https://en.wikipedia.org/wiki/Hinkley_Point_C_nuclear_power_.... The cost was initially estimated at £24 per MWh but could now be £92.50 per MWh.
Nuclear power stations have crazy cost overruns. The initial estimations are far too low (except maybe in China because they have recent experience).
> And if you need to do that then you need either a bunch of alternative on demand sources of energy (UK is using LNG) or some big storage capacity (+overprovision to fill them when it is sunny/windy). Nuclear doesn't need that.
Nuclear doesn't need a backup? UK built nuclear and has LNG. It's not so black and white as you stated.
I believe nuclear isn’t an on demand energy source. It’s great for base load, but it doesn’t scale up or down quickly.
That was true in the 70s. I think anything built in Europe since the 80s can do load following, and France does that every day. I think by regulation new plants must be able to adapt their production by something like 5% per 10 minutes (or something of that order of magnitude), well within the range you need to meet intra-day variations.
Now whether it is optimal economically is another question. If you have some sources of energy in your grid that cost per usage (eg fossil fuels), you should rather switch those off than nuclear which costs the same whether you use it or not. But if your grid is almost all nuclear (eg France), you do load following.
That’s news to me. How do these newer reactors do it? I thought nuclear reactors were slow to ramp up. Are they overproducing thermal energy relative to the plant’s electricity output?
that would be a question for someone closer to the tech than me, but relevant wiki link: https://en.wikipedia.org/wiki/Load-following_power_plant#Nuc...
> but it doesn’t scale up or down quickly.
The newer ones can. Though that's partly my dislike. There's always yet another new technological solution for the various drawbacks. Which often are unproven (e.g. SMRs) and that often results in crazy cost overruns.
Nuclear has a crazy high fixed cost so ideally you still run it continuously.
Steam turbines are very good at what they do, so I don't think that's the problem. Turbines are well understood technology, so if they're the major impediment to economic viability, then everything else must be working amazingly well.
I do think economic viability will be a major problem though. The fusion hype crowd focus on Q>1 being their major milestone but that's still a long way away from it being profitable to operate a fusion plant.
IIRC, a cascade of turbines, going from high-temperature water steam to basically room-temperature ammonia, can reach a total efficiency exceeding 50%.
Of course, ammonia is chemically active, and water at 700K is also chemically active, so the turbines, as usual, require special metallurgy and regular maintenance.
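(As a rough sanity check on that figure: the ideal Carnot bound between a ~700 K top temperature, as mentioned above, and ~300 K rejection is
\eta_{\max} = 1 - T_{\mathrm{cold}}/T_{\mathrm{hot}} \approx 1 - 300/700 \approx 0.57,
so a real cascade exceeding 50% would already be pressing fairly close to the ideal limit; the 300 K figure is my assumption, not something stated above.)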
Sure, but your cascade of turbines isn't free.
The capital cost of just the turbines is enough to make it hard to compete with solar and batteries in many situations.
Turbines are indeed very well understood and mature technology, and even where the fuel is almost free and the reactor (a coal burner) is cheap, steam turbines are too expensive to compete with solar electricity in most places now.
That's all just good money after bad.
How many more trillion $$$ are going to be pissed away on this?
I'd say, shelve the idea for 50 to 75 years and then look at it again.
In the meantime, I think we could make major headway on a global electric grid that connects whatever part of the planet is sunny with all the rest. Add to this some major storage capacity, and I think we could resolve almost all of our energy problems with the money that would otherwise be wasted on further fusion efforts.
> In sunny places (and I fully acknowledge that's not all of the world) it's still going to be hard to beat sticking bits of special glass out in a field and connecting wires to it.
Why not both? Can't we utilize the special glass to fetch energy from a man-made fusion reactor, in similar ways as we use it to fetch energy from the natural fusion reactor in space?
Treating this as a serious question: because the big reactor in the sky has a built-in gravity well and a very big sphere of vacuum that works as a free containment system while still letting just enough energy out to be useful. Also we don’t need to refill it or handle the spent fuel.
In theory, yes. Those are called radiovoltaics and are being researched.
No.
D + T or D + D fusion reactions produce neutrons and hard gamma radiation. The special glass cannot capture the energy of these directly.
The natural fusion reactor has a blanket that is literally thousands of kilometers thick. It effectively converts much of this into longer-wavelength electromagnetic radiation, from ultraviolet to infrared, with quite some visible light. That's what the special glass can make use of.
Also, the highly radioactive blanket is kept hundreds of millions km away from consumers, which helps alleviate the problem of disposing of radioactive waste. With a planet-based fusion reactor, we'd have to think where to put thousands and thousands of tons of slightly radioactive concrete which would realistically serve as a blanket.
The actual paper describing the construction of the MUSE Stellarator: https://www.cambridge.org/core/journals/journal-of-plasma-ph...
I'm having trouble understanding what's actually been accomplished here. The article provides a good overview of Tokamak vs Stellarator, but seems to jump back and forth between proclaiming this as an innovative breakthrough and saying it's just a framework to test ideas.
> In terms of its ability to confine particles, Muse is two orders of magnitude better than any stellarator previously built
Is it? It doesn't seem as if they have reached first plasma or have plans to do so anytime soon. Using electromagnets not only to confine but also to control the plasma is a big selling point of the stellarator design, and they don't seem to address this.
This seems really cool, and I love the idea of lower-cost fusion. (Or even just functional fusion.) There are about a dozen companies making real progress in fusion, but I can't quite figure out what this team has actually accomplished.
What am I missing?
Seems like the premise is that building these small experimental stellarators is a major cost in doing fusion research and therefore if we can bring the cost of these down from billions to less than a million, more teams can do more research faster, even if this specific design never generates any economic power. I have no idea if this premise is true or not -- I'm just a layman who read the article.
The struggle for funding can explain a lot of things
I admittedly don't know much about fusion reactors, but I do love that the thing within which you create a star is called a "Stellarator".
“PPPL researchers say their simpler machine demonstrates a way to build stellarators far more cheaply and quickly, allowing researchers to easily test new concepts for future fusion power plants.”
This quote reminded me of SpaceX’s approach to engineering and why they have leapfrogged past Boeing. Instead of spending 10-20 years and billions on a single design, SpaceX iterates.
I want to like this, but at break-even temperatures these things just melt. How about making Shipstones from nuclear waste? You could make car-engine-sized batteries that would effectively last for years if properly shielded and which would provide essentially free power to run a dozen homes.
How can one estimate the progress of a given design? For example, in the photo there are no walls protecting the people in the lab from neutrons. Even the fusor 60 years ago, running at 100 kV, already generated a neutron flux requiring such walls.
There's absolutely no way to get fusion with permanent magnets and copper coils.
This is a plasma test stand, and because it is so simple, you can potentially iterate quickly through different field configurations. This is at least a little bit useful, because a full Stellarator is extremely complicated to take apart, so you can't just change the coils around if you want to change something.
People don't understand the fundamental problem of fusion. It's a problem of energy loss. Of enormous energy losses.
Roughly speaking, energy can be mechanical, for particles, or radiative, for photons. The first is proportional to the temperature (the famous nRT) and the second is proportional to the fourth power of the temperature. The constant of proportionality is very small, and at ordinary temperatures we generally don't think about it much. But at millions of kelvin, it starts to dominate all considerations.
Heat always moves from hot to cold. In the case of particles the heat flow is proportional to the difference in temperature, and in the case of radiation to the difference of the fourth powers of the temperatures. But heat also travels from particles to photons and vice versa. It doesn't matter how.
The problem with fusion is now this. Suppose that you have a super-duper device, let's call it a brompillator. It brings an amount of deuterium-tritium mix to the required temperature, let's say 10 million kelvin. Now that volume of plasma is surrounded by cold stuff. You can imagine that you have some mirrors, or magnetic fields, or some magic stuff, but the cold hard fact is that the plasma will want to radiate to the exterior, and the flow of heat will be proportional to the surface area times the difference of the fourth powers of the temperatures. Since for all practical purposes the outer temperature is zero, we are talking about the fourth power of 10 million kelvin. The constant of proportionality is very small; it is called the Stefan-Boltzmann constant and has a value of about 10^-7 W m^-2 K^-4. Let's say the surface area is 1 square meter. So the heat loss happens at a rate of 10^-7 times (10^7)^4 = 10^21 watts. That is 10^12 gigawatts. One GW is the output of a decent-sized nuclear power plant.
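(Writing that estimate out explicitly with the actual value of the constant, and assuming, as above, that the plasma radiates as a perfect blackbody, which the replies below dispute:
P = \sigma A T^4 \approx (5.67\times10^{-8}\ \mathrm{W\,m^{-2}\,K^{-4}}) \times (1\ \mathrm{m^2}) \times (10^7\ \mathrm{K})^4 \approx 6\times10^{20}\ \mathrm{W}.)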
Of course, you can try to shield that plasma, but that shield has to be 99.99999....9% effective, where the number of 9s needs to be about 15 or so.
That is the immensity of the challenge that nobody is willing to tell you about.
How was this overcome in the case of the thermonuclear bomb? People imagine that once you have a fission bomb, you just put some deuterium-tritium mix next to it, and voila, you have a fusion bomb. No. The world's greatest minds worked on this issue for about 5 years. The solution was something like this: if you first compress the volume of fusion fuel significantly, then the heat losses are much smaller (remember they are proportional to the area, and that's proportional to the square of the radius). They will still be tremendous, but you don't even aim to keep the reaction going for a long time. The duration of the fusion reaction in a thermonuclear bomb is still classified information, but public sources put it at the order of 1 microsecond. The heat losses are still tremendous, but for a short moment the heat gains from the fusion reaction are even greater, so ignition is achieved.
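(To make the compression argument explicit, as a rough scaling of my own rather than anything from public sources: for a fixed fuel mass m compressed to density \rho,
r \propto (m/\rho)^{1/3}, \qquad A \propto r^2 \propto (m/\rho)^{2/3},
so compressing the fuel a thousandfold in density cuts the radiating area, and with it the loss rate at a given temperature, by a factor of about a hundred.)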
In the NIF experiment that achieved more than breakeven 2 years ago, the fusion lasted less than 10 nanoseconds [1].
If someone thinks the brompillator will achieve fusion and that it will run for years, or even hours, or seconds, they don't understand the fundamental problem. Unfortunately, nobody is willing to ask hard questions about this, not even Sabine Hossenfelder.
[1] https://journals.aps.org/prl/pdf/10.1103/PhysRevLett.132.065...
I don't disagree with this statement; fusion researchers do care about energy loss when they're evaluating fusion reactor feasibility. They talk more about neutron loss, bremsstrahlung radiation, and synchrotron radiation than about blackbody radiation. A paper on this: https://arxiv.org/pdf/2104.06251
So I did some searching, and found this Stack Exchange question asking about exactly this: https://physics.stackexchange.com/questions/415028/how-do-fu... . The answers argue that because fusion reactor plasma is optically thin, it doesn't radiate following the blackbody radiation law. This textbook also says that: https://www.cambridge.org/core/books/abs/physics-of-plasmas/...
I tried to search more about plasma energy losses, and it becomes extremely complicated, with an insane number of equations. One thing I did get is that you can't model fusion reactor plasmas as blackbody radiators, because plasma really is that complicated. If plasma were simpler, we would either have fusion already or have given up on fusion research a long time ago.
> because fusion reactor plasma is optically thin, it doesn't radiate following blackbody radiation law.
It still follows the laws of blackbody radiation - it's just that the emissivity of the body is part of the equation.
A classical blackbody has an emissivity of 1. This means not only that it absorbs radiation really well, but also that it's really good at radiating energy away.
Things that have low emissivity (all things transparent and all things reflective) are also really bad at radiating energy away. This is used for solar-thermal collectors today: you make them from an engineered material that is completely black in the visible range but highly reflective in the infrared. That way, they absorb sunlight and get hot, but they don't lose that energy because they can't radiate it away as heat radiation.
And yes, fusion plasma is extremely, extremely transparent. Not only is it extremely thin (ITER or Wendelstein 7-X contain only 1-2 g of hydrogen during operation), hydrogen is also extremely bad at absorbing the hard X-rays that blackbody radiation at 1e8 K corresponds to.
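(In formula terms, this is the grey-body version of the same law, with the plasma's effective emissivity many orders of magnitude below 1 — my shorthand for the point above, not a number from any of these comments:
P = \varepsilon\,\sigma A T^4, \qquad \varepsilon \ll 1 \text{ for an optically thin plasma},
which is why the back-of-the-envelope 10^21 W never materializes in practice.)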
Inertial confinement fusion like the NIF is not intended to run continuously, so the 2ns duration is irrelevant. The surface area for the calculation is not the surface area of the machine, but the surface area of the volume in which the fusion is occurring which could be very much smaller than that.
The heat loss is practically limited by the mass of hydrogen fusing in the machine. To have a continuous heat flux of 10^21 watts you would need to fuse ~4*10^5 kg of hydrogen every second. Which clearly these machines are not intended to do.
You raise some good points.
> Inertial confinement fusion like the NIF is not intended to run continuously, so the 2ns duration is irrelevant.
Indeed. I do think ICF has a future. The issue I described applies to machines that attempt to achieve sustained fusion. Pulsed fusion is ok.
> The heat loss is practically limited by the mass of hydrogen fusing in the machine.
Yes, but it goes the other way too. If the heat loss is too high, you can't sustain fusion because you can't stay at the required temperature for long enough.
> People don't understand the fundamental problem of fusion. It's a problem of energy loss. Of enormous energy losses.
I'm not sure that's even true, because if you manage to crack that, you still have the problem that your sustainable reaction is pumping out most of its energy in the form of very fast neutrons, which are (a) very hard to harvest energy from and (b) extremely bad for people and materials if you don't. You could have a self-sustaining reaction that you can't actually use!
Aneutronic fusion has been previously mentioned, specifically HB11.
https://en.m.wikipedia.org/wiki/Aneutronic_fusion
It requires much higher temperatures, and thus suffers much higher Bremsstrahlung. You can sniff out BS quickly with anyone claiming a steady state aneutronic reactor. A working aneutronic design would necessarily be pulsed. Not that it can't be done, but you'd first need to pass through DT and DD performance metrics and then go another order of magnitude. No one's done that yet.
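(The scaling usually quoted behind that claim — my addition, not something stated above: bremsstrahlung power density goes roughly as
P_{\mathrm{brems}} \propto Z^2\, n_e\, n_i\, \sqrt{T_e},
so p-B11, with Z = 5 and roughly an order of magnitude higher required temperature than D-T, radiates far more per unit of fusion power.)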
You're talking about the same thing.
High energy neutrons leave the system. They cause damage to the container (ie neutron embrittlement) and that's a separate problem. But the real problem is the energy loss from the system.
Charged particles can be contained. Personally I think there are limits to even that because a high-temperature plasma is turbulent [1]. Containing that is just a hugely difficult problem.
I'm not convinced that nuclear fusion will ever be commercially viable.
All while we already have emission-free, reliable and cheap energy production in the form of solar power.
[1]: https://www.psfc.mit.edu/research/topics/plasma-turbulence
Also, the fusion reactors will inevitably have poor volumetric power density, due to limits on power/area through the first wall and the square-cube law.
Engineering studies of stellarators found they tend to be larger and have worse economics than tokamaks.
We need a liquid with high heat capacity, large hydrogen content (for high neutron interaction rate) and a solid 300 year engineering history in heat engine applications. Better if it is nontoxic and environmentally friendly as well.
But what if you breathe it in!!!1! Dihydrogen monoxide is no joke, many people are killed by it every year.
I'm a physics layman, and I'm having some trouble with uniting the content of your comment with the fact that existing magnetic confinement experiments have reported maintaining a plasma at the right temperature for longer times (not with fusion, but with microwave heating, and with the power of those heaters in the 10MW range).
Have I understood the consequences of those reports wrong? Does the heat loss you talk about only occur with fusion? (And if so, is it even a problem if the conditions for fusion to occur can be created by external heating this "easily"?)
In order to protect astronauts from decompression, the hull of a spacecraft has to be insanely good at stopping gas particles. Not 99.5% good, but like 99.9999999…% with 20 nines! That’s very good.
But a thin metal sheet has no trouble doing this, as demonstrated by the Apollo lunar lander.
Some things are just not as hard as they sound. Magnetic confinement works very well. It easily achieves the necessary 9’s.
It’s just hard to keep it stable at millions of degrees, but that’s a different problem.
Wait til you guys hear about DNA transcription error rates!
Are you saying the way to contain plasma is the shape of a double helix
*Slaps proton*
These things can fit so many nines of reliability.
I'm not sure, but we can try to figure out what is going on. And by the way I'm a physics layman too. I just read a lot of books about fission and a few about fusion too, it happens to be my hobby. When I'm bored, the bookmarks that I browse are [1] and [2].
So, when reports state that a certain temperature was achieved and sustained for a certain period of time, what are they actually saying? We could go and find an article and get into some details, but I imagine they say that somewhere in the plasma that temperature was reached and sustained. But it is quite likely that that region is quite microscopic, maybe a very, very thin inner torus inside a larger torus. There is a gradient of temperature from the region where the announced temperature is reached out to the walls of the device. But one way or another, that thin inner region can't have a surface area of anything close to 1 square meter. For the radiative loss to be only 1 GW, the area would need to be 10^-12 square meters, and for 10 MW, 10^-14 m2. That's about the surface area of a torus of (circular) length 3 m and tube diameter 1 femtometer. 1 femtometer is roughly the size of a nucleus of deuterium or tritium, so in principle this is the minimum diameter of a torus where you can talk about fusion.
[1] https://www.ncnr.nist.gov/resources/n-lengths/
[2] https://www.oecd-nea.org/janisweb/
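(Checking that geometric estimate: a thin torus with tube diameter d and centerline length L has surface area roughly
A \approx \pi d L \approx \pi \times (10^{-15}\ \mathrm{m}) \times (3\ \mathrm{m}) \approx 10^{-14}\ \mathrm{m^2}, \qquad P \approx \sigma A T^4 \approx 10^{-7} \times 10^{-14} \times (10^7)^4 \approx 10^7\ \mathrm{W},
consistent with the ~10 MW heater figure mentioned above — again under the blackbody assumption, with all the caveats raised earlier in the thread.)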
Just wanted to say thank you for this comment, fascinating and a perfect example of the beauty of HN. I'm relatively fresh off The Making of the Atomic Bomb, and while fusion was not at all a focus, this (incomplete) impression is exactly what I came away with.
Is there any chance you'd recommend any books related to these topics? The walk through decades of revelations in physics was the most enjoyable aspect of that book, I'd love to continue building on that story.
Well if you liked "The Making of the Atomic Bomb" I can strongly recommend the follow up "Dark Sun" that covers both the Soviet atomic bomb program and the development of the H-bomb by the US.
Not sure how I missed that Rhodes wrote a continuation! The clarity of writing about physics for a layperson has been wonderful, glad to find there's more. Much appreciated.
One of the points I took away from that book being that "H-bomb" weapon design is as much about fission as fusion - most designs being fission-fusion-fission with most energy coming from the latter fission stage.
That book is absolutely great. As the sibling comment mentions, Dark Sun is also great.
Here are some more books I read on this topic. One was written by someone who was very close to the Ulam and Teller inner circle: "Building the H Bomb" by Kenneth Ford. Another one is "Sun in a Bottle: The Strange History of Fusion and the Science of Wishful Thinking" by Charles Seife. And finally, you can't go wrong with any book written by James Mahaffey.
These are exactly what I was looking for. Many thanks!
It would be entirely reasonable to wonder if fission-pumped fusion could be scaled down and pulsed.
It would be a Lovecraftian nightmare of unmentionable proportions to actually operate, but you could imagine it breaking even.