World renewable energy at almost 25%


begreen

Nice article....

Buried within, they say that of the 23.7% of global electricity that is renewable, 16.6 percentage points is 'large-scale hydropower' (much of which is quite old).

A little math puts 'all the rest' (wind, solar, biomass and geothermal) at 7.1% of global electricity production (not capacity)....more than I would have guessed, and not far behind the US figure (wind+solar is ~7%; biomass puts it over).

The article's wording of capacity/production was strange...I will go read the report.

The report notes that more RE was added in 2015 than in any previous year....not bad with the crash in oil prices. So much for the predictions that cheap oil would doom RE.
 
Report:
http://www.ren21.net/wp-content/uploads/2016/06/GSR_2016_KeyFindings.pdf

p. 10: SOLAR
The solar PV market was up 25% over 2014 to a record 50 GW, lifting the global total to 227 GW. The annual market in 2015 was nearly 10 times the world’s cumulative solar PV capacity of a decade earlier.

An estimated 22 countries had enough capacity at end-2015 to meet more than 1% of their electricity demand, with far higher shares in some countries (e.g., Italy 7.8%, Greece 6.5% and Germany 6.4%).


[Note that this is capacity/demand, not production/consumption; for solar this likely overstates the production/consumption fraction by ~4X]

p. 11: WIND
Wind power was the leading source of new power generating capacity in Europe and the United States in 2015, and the second largest in China. Globally, a record 63 GW was added for a total of about 433 GW. Non-OECD countries were responsible for the majority of installations, led by China, and new markets emerged across Africa, Asia and Latin America.

Wind power is playing a major role in meeting electricity demand in an increasing number of countries, including Denmark (42% of demand in 2015), Germany (more than 60% in four states) and Uruguay (15.5%). The industry had another strong year, and most top turbine manufacturers broke their own annual installation records.


[Not clear, but this is likely capacity/demand, not production/consumption; for wind this likely overstates the production/consumption fraction by at least 2.5X]
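To make those two bracketed notes concrete, here is a minimal back-of-envelope sketch in Python. The capacity factors are my own round-number assumptions (not figures from the report), chosen to roughly match the ~4X and ~2.5X factors guessed at above, and the example numbers are purely illustrative.

```python
# Rough sketch of the capacity-vs-production caveat in the two notes above.
# Capacity factors are assumed round numbers, not figures from the REN21 report.

def energy_share(capacity_share, capacity_factor):
    """If 'capacity to meet X% of demand' means nameplate capacity divided by
    average load, the actual energy share is roughly that ratio times the
    capacity factor of the source."""
    return capacity_share * capacity_factor

SOLAR_CF = 0.25   # assumed PV capacity factor -> ~4x overstatement (1/0.25)
WIND_CF = 0.40    # assumed wind capacity factor -> ~2.5x overstatement (1/0.40)

# Illustrative only: a country reporting "solar capacity to meet 6% of demand"
print(round(energy_share(0.06, SOLAR_CF), 3))   # ~0.015, i.e. ~1.5% of actual generation
print(round(energy_share(0.06, WIND_CF), 3))    # ~0.024 if the same ratio were wind
```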

p. 18: GLOBAL RE %-ages:
Hydro is 16.6% of global electricity production. Wind is 3.7%. Biomass (for elec) is 2.0%. Solar PV is 1.2%.

While these numbers might seem small, remember that globally wind has been doubling (conservatively) every 4 years and solar every 3 years. As an example, if these doubling rates continued, in 10 years, 2025:
--Wind would be 6X 2015 values, about 21% of global electricity
--Solar PV would be 10X 2015 values, about 12% of global electricity
If big hydro were still ~15% and bio ~2%, then in 2025 we could have ~50% renewable electricity globally.
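For anyone who wants to check that arithmetic, here is the same extrapolation as a few lines of Python. The only inputs are the report's 2015 shares and the doubling times assumed above; the 2025 numbers are a what-if, not a forecast.

```python
# Back-of-envelope extrapolation of the 2015 shares quoted from the report (p. 18),
# assuming the doubling times above simply continue. Illustrative, not a forecast.
shares_2015 = {"hydro": 16.6, "wind": 3.7, "biomass": 2.0, "solar_pv": 1.2}  # % of global electricity

def grow(share, years, doubling_time):
    """Scale a share assuming it keeps doubling every `doubling_time` years."""
    return share * 2 ** (years / doubling_time)

wind_2025 = grow(shares_2015["wind"], 10, 4)       # ~3.7% x 5.7 = ~21%
solar_2025 = grow(shares_2015["solar_pv"], 10, 3)  # ~1.2% x 10  = ~12%

# Assume big hydro settles near 15% and biomass stays ~2%, as in the post above.
total_2025 = wind_2025 + solar_2025 + 15 + 2
print(round(wind_2025, 1), round(solar_2025, 1), round(total_2025))   # ~20.9, ~12.1, ~50
```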

Fraction of global energy END-USE: all RE is 19% versus 2.5% for Nukes and 78% for Fossil Fuels.

[NB: half of that RE figure is 'traditional biomass'...scrounging cordwood AND collecting sticks and dung, which is 4X all nukes, BTW]

I LIKE this number...as it inverts a major peeve of mine. The FF companies (like the Exxon 'Outlook for Energy 2040') always score all energy sources by INPUT BTUs. Of course, many FFs end up in heat engines (such as steam turbines and ICEs) with efficiencies of 20-40%, so this overstates their role/fraction by 250 to 500% compared to things like wind or PV that make electricity directly. IOW, FFs should be penalized for their thermodynamic losses, not get a gold star for them. This report does something else (it appears)...it computes the fraction based on OUTPUT BTUs and kWh as delivered to the user.
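A toy illustration of that input-vs-output accounting point, with made-up numbers (the 35% heat-engine efficiency and the small wind/PV slice are assumptions for illustration, not figures from the report):

```python
# Toy comparison of input-BTU accounting vs delivered-energy accounting.
# The mix and the 35% conversion efficiency are made-up illustration numbers.
sources = {
    # name: (delivered energy units, assumed conversion efficiency)
    "fossil_via_heat_engines": (78.0, 0.35),   # heat engines run ~20-40% efficient
    "wind_and_pv":             (3.0, 1.00),    # electricity counted directly
}

delivered = {name: out for name, (out, eff) in sources.items()}
primary = {name: out / eff for name, (out, eff) in sources.items()}   # input BTUs

for basis, mix in (("output basis   ", delivered), ("input-BTU basis", primary)):
    total = sum(mix.values())
    shares = {name: round(100 * v / total, 1) for name, v in mix.items()}
    print(basis, shares)

# On the input-BTU basis the fossil-to-renewable *ratio* is inflated by roughly
# 1/efficiency: 2.5x at 40% efficiency, 5x at 20%, ~2.9x at the 35% assumed here.
```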
 
The US potential for progress and job growth in this field is huge if the politics of FF and denial can be overcome by common sense.
 
All energy is renewable. We burn wood, it goes up in smoke. It comes back as ash and fertilizer when it rains. It makes the trees grow more, we burn more wood, it goes up in smoke, comes down as ash and fertilizer, and it makes the trees grow more. What does not get burned ends up as crude oil or natural gas someday. Repeat, repeat, repeat. The sun shines, it heats up air, it moves air, it makes the windmills turn. The never-ending energy of the sun dries up water, it makes it rain, it makes rivers full, it turns hydro generators. Repeat, repeat, repeat. It is all a matter of cost per unit. On another note, I read today that our biofuel steam generator here near Wausau, WI was so expensive to run that they are going to start running natural gas through it to save money. Some stuff is just too expensive. Go with what is nearby and cheapest for electricity and fuel.
 
Do you suppose nuclear is renewable? I don't know a natural process that reunites those atoms.
 
Do you suppose nuclear is renewable? I don't know a natural process that reunites those atoms.

Supernovas! The renewal time period is rather long, though (billions of years), and not well-matched to conservation since it requires stellar destruction, which can be expected to wipe out nearby planets, as well.

It's a pity we hardly invest anything at all into fusion research. Despite the ill-informed jokes, there is significant potential there for a baseload power supply that would pair well with solar and wind, with significantly less, and less hazardous, waste than nuclear fission or coal.
 
Yes, even nuclear is renewable. I worked for a guy who was employed at the nuclear power plant in Wisconsin. The by-products can still be used for power. I did not really understand, with my little mind, how nuclear waste can be recycled and reused. It was turned into something else and I don't remember what it was. Although, we were talking about carbon, weren't we?
 
Supernovas! The renewal time period is rather long, though (billions of years), and not well-matched to conservation since it requires stellar destruction, which can be expected to wipe out nearby planets, as well.

It's a pity we hardly invest anything at all into fusion research. Despite the ill-informed jokes, there is significant potential there for a baseload power supply that would pair well with solar and wind, with significantly less, and less hazardous, waste than nuclear fission or coal.

Have to disagree. We have sunk a LOT of cash into nuclear fusion over the years....if we had put some of that money 30 years ago into renewable energy research....we might be a lot better off now. Of course, you never know what is going to pay off in research....so not much point in coulda shoulda.
 
Have to disagree. We have sunk a LOT of cash into nuclear fusion over the years....if we had put some of that money 30 years ago into renewable energy research....we might be a lot better off now. Of course, you never know what is going to pay off in research....so not much point in coulda shoulda.

I read that argument made repeatedly in discussions about fusion, but few people have actually looked up the numbers.

It's really not even close. Fusion research spending for the last several years has been right around $500 million annually, and is currently declining again in the US, even though the effort to get the ITER project operational and returning data for the next project, which if ITER goes well will be a prototype power plant, is nearing its peak spending period.

Fusion is getting only 25% more research funding than clean coal, and it's getting about 1/4 as much as renewable energy and efficiency R&D, not counting the renewable energy tax credits. Pages 22, 29, and 39 here:
http://energy.gov/sites/prod/files/2015/02/f19/FY2016BudgetinBrief.pdf

As for tax credits, Topaz Solar Farm alone got somewhere around 1.5 times the annual fusion research budget, for an existing technology, and we're currently installing over a dozen times as much solar per year as that single farm represented, all collecting the same tax credit. The total value for all of 2015 was probably over $4 billion (7.3 GW new capacity, $2/W installed cost, 30% ITC).

Wind generated 190 billion kWh in the US last year, which if it was all eligible for the 2.3 cent/kWh production tax credit, works out to $4.4 billion.
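Just to show the arithmetic behind those two tax-credit figures (all inputs are the round numbers quoted above, so treat the outputs as rough estimates):

```python
# Quick check of the tax-credit estimates above, using the round numbers quoted there.

# Solar investment tax credit (ITC), 2015
new_solar_w = 7.3e9        # ~7.3 GW of new US capacity, in watts
installed_cost = 2.0       # assumed $/W installed cost
itc_rate = 0.30
solar_itc = new_solar_w * installed_cost * itc_rate
print(f"solar ITC: ~${solar_itc / 1e9:.1f} billion")    # ~$4.4 billion

# Wind production tax credit (PTC)
wind_kwh = 190e9           # ~190 billion kWh of US wind generation
ptc_rate = 0.023           # $/kWh
wind_ptc = wind_kwh * ptc_rate
print(f"wind PTC:  ~${wind_ptc / 1e9:.1f} billion")     # ~$4.4 billion
```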

It's true, we never know what is going to pay off in research, and fusion research is constantly running into unexpected challenges. But it remains one of the most promising controlled energy sources in development (along with alternative fission fuel cycles - reprocessing, thorium, etc.), and the lack of real showstoppers being found at the current leading reactors is a positive sign.

If there were budding renewable technologies with similar promise, by all means fund them, but further cutting the anemic fusion budget isn't the way to get there.

This is also well worth a read for people who have the time, although keep in mind few of the numbers it presents are inflation-adjusted, and it is a bit over-exuberant about the overall potential:
https://www.21stcenturysciencetech.com/Articles_2010/Winter_2009/Who_Killed_Fusion.pdf
 
Yes, even nuclear is renewable. I worked for a guy who was employed at the nuclear power plant in Wisconsin. The by-products can still be used for power. I did not really understand, with my little mind, how nuclear waste can be recycled and reused. It was turned into something else and I don't remember what it was. Although, we were talking about carbon, weren't we?

It can be recycled, but not endlessly. Current nuclear fuel recycling involves separating the waste products from the remaining fuel and adding some fresh fuel. It helps both reduce the amount of waste that has to be carefully disposed of and reduce the amount of new fuel needed, but only moderately. There are more advanced recycling concepts possible, but not currently in use for reasons too complicated to really get into.
 
I read that argument made repeatedly in discussions about fusion, but few people have actually looked up the numbers.

It's really not even close. Fusion research spending for the last several years has been right around $500 million annually, and is currently declining again in the US, even though the effort to get the ITER project operational and returning data for the next project, which if ITER goes well will be a prototype power plant, is nearing its peak spending period.

I was not thinking about the current state of affairs....in which fusion is basically 'dead'....and relegated to useless international mega-demos. I am talking about the massive funding that occurred in the 1970s through the 1990s. Inflation corrected....it would be >$30B total, just for the US.

http://focusfusion.org/index.php/site/reframe/wasteful/

For reference, the entire Manhattan Project cost $23B in today's dollars. The MP invented and built the first demonstration reactor and multiple Pu production reactors, producing tons of a material that 10 years earlier had taken a huge effort to make in microgram quantities, and designed and built two types of working atomic bombs with two separate teams, using slide rules and some of the first digital computers (built for the MP, with less computing power than a current-day greeting card).

https://www.technologyreview.com/s/541636/weighing-the-cost-of-big-science/

No one really thinks fusion is a viable, affordable, near-term energy source anymore....it's a science project in 'pure research' at this point....and funded like that.

http://issues.org/31-4/fusion-research-time-to-set-a-new-path/

What is your opinion of the Lockheed team? Seems to me their internal coil will cool the plasma.
 
I was not thinking about the current state of affairs....in which fusion is basically 'dead'....and relegated to useless international mega-demos.

Current reactors like JET, JT-60, Alcator, and NSTX are relatively large projects, but not mega-demos. Each of the above has an annual budget in the $100 million ballpark, while individual national labs have budgets dozens of times as large.

They're certainly not useless. Each has made significant advances towards resolving specific challenges in making magnetic confinement fusion work - plasma heating, plasma stability, confinement operation (learning to get past the quenches one of your links discusses), and increasing heat output. One of the big concerns of the ITER designers recently has been funding the ITER design and construction while still maintaining sufficient funding at existing Tokamaks, so they can continue generating data that will help refine the final design details for ITER.

ITER is a mega-project. If it doesn't work more or less as expected, then the Tokamak design will likely be dead, but data from the existing reactors point to ITER being the way to go - scale up to increase the temperature and density, and improve the control to enable reaching steady state operations. Even then, much of what ITER will learn will be of interest to those working on the Stellarator design, and possibly to other designs.

For reference, the entire Manhattan Project cost $23B in today's dollars.

Yes, fission is comparatively easy. It even happens naturally on earth if you have enough fissile material in one place, as it did in Oklo in Africa about 2 billion years ago. But it has enough drawbacks that over the long term we really want to phase it out in favor of something better. Dealing with those drawbacks is why modern plants are significantly different than those built in the 1940's and individually cost almost half as much as the entire Manhattan project.

Also, the $30 billion figure for fusion research includes $10 billion for inertial confinement research, overwhelmingly at the National Ignition Facility, which is primarily a weapons research facility. There's no clear development pathway between the NIF and a viable powerplant, despite what the press releases always imply.

And $30 billion over 60+ years adds up a lot slower than the $8+ billion per year in renewable tax credits I mentioned above.

No one really thinks fusion is a viable, affordable, near-term energy source anymore....it's a science project in 'pure research' at this point....and funded like that.

Nor will it ever be if it doesn't get funding proportionate to its potential impact. The electricity market in the US alone is in the ballpark of a quarter-trillion dollars annually.

What is your opinion of the Lockheed team? Seems to me their internal coil will cool the plasma.

It's intriguing, much like the Bussard Polywell design. Both sound worth continuing research on, and as I understand it Lockheed did get a decent-sized research grant. However, they also made numerous unsupported claims when they announced their research, such as suggesting they could have a working power plant ready by 2024. They're currently working at the kinds of energy levels Tokamaks were achieving in the early 60's, and I suspect they're several prototype generations away from having a decent idea how well their design will scale up.

They refer to "internal coils" but I don't think they really mean internal to the plasma, but internal to the chamber, as opposed to protected by the chamber walls. I wouldn't guess cooling of the plasma is the main potential issue, but rather heating of the coils.

The alternative to the Tokamak that is furthest along is the related Stellarator design, in particular the Wendelstein 7-X, which is comparable in scale to several of the Tokamaks currently operating that were built in the 80's and 90's.
 
Ok, you seem pretty knowledgeable about fusion tech. What is the application?

If it is simple power generation, perhaps you can pencil it out for me....

If I build a 1 GW fusion plant, and run it on a 50% duty cycle for 20 years, then I have made roughly 88,000 GWh, call it 80 billion kWh. At a wholesale price of 5 cents per kWh, that is $4B worth of electricity. If I assume the fuel is free (it's not...D is pretty cheap, breeding up T is not), and personnel are free, and maintenance is free...then my commercial 1 GW plant still needs to cost less than $4B up front to break even. If I assume a 5% discount rate to the investors, then it needs to cost less than half that up front to satisfy my bond holders. IOW the plant construction has to be <$2/W, even assuming zero fuel, personnel, or maintenance costs and no unforeseen technical problems.
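Here's that pencil-out spelled out, under the same assumptions (free fuel, personnel and maintenance, 5 cents/kWh wholesale, 5% discount rate):

```python
# The pencil-out above, spelled out. Same assumptions: free fuel, staff and
# maintenance; 5 cents/kWh wholesale; 5% discount rate; 50% duty cycle; 20 years.
plant_w = 1e9            # 1 GW(e)
duty_cycle = 0.5
years = 20
price_per_kwh = 0.05     # $ wholesale
discount = 0.05

annual_kwh = plant_w / 1e3 * duty_cycle * 8760        # ~4.4 billion kWh per year
annual_revenue = annual_kwh * price_per_kwh           # ~$220 million per year

# Present value of 20 years of that revenue stream at a 5% discount rate
pv = sum(annual_revenue / (1 + discount) ** t for t in range(1, years + 1))

print(f"lifetime output:       {annual_kwh * years / 1e9:.0f} billion kWh")  # ~88
print(f"undiscounted revenue:  ${annual_revenue * years / 1e9:.1f} billion") # ~$4.4
print(f"present value:         ${pv / 1e9:.1f} billion")                     # ~$2.7
print(f"break-even build cost: ${pv / plant_w:.2f}/W")                       # ~$2.73/W
```

The exact discounting factor is ~0.62 rather than 0.5, so the strict break-even is around $2.7/W; "less than half", i.e. <$2/W, is just a more conservative rounding. Either way the point stands: the plant has to be built for low single-digit dollars per watt before counting any fuel, staffing or maintenance costs.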

Looks like 1 GW (output) would be 4x the power level of ITER, assuming 50% conversion of heat energy to kWh.....and ITER is currently projected to cost $14B and has no heat capture or conversion equipment, nor an 80,000-hour design life.

As you said, fission is MUCH easier....and LWRs currently cost several dollars/watt. Breeders might cost $10/W. In what world is your future fusion reactor even within the right order of magnitude on cost, even if all the existing material and technical problems were resolved tomorrow?
 
Baseload plants typically run at about 80% capacity, and their payback is calculated over 30 years. Actual life for most nuclear fission plants has been 40-60 years, so they typically have relatively long periods of low effective power cost that were not factored into their original financing decision. Currently the US nuclear industry is averaging 90% capacity.

I hate to admit it, but ITER is on track to go beyond $14 billion. But that's for a project whose multinational leadership, composed largely of political appointees, has been doing an atrocious job of coordinating work between the member countries, to the point that they had a massive leadership shakeup a year or so ago and a complete review and overhaul of the schedule (their refusal to even admit the schedule was no longer workable is reminiscent of the management foul-ups and denialism I saw in the aerospace industry, which contributed significantly to the gigantic schedule and budget slides that plagued the Boeing 787). It is also plagued by unpredictable funding, including when the US backed out of the project for several years, resulting in loss of some existing talent and reshuffling of work, both of which are destructive to productivity.

It also includes a massive amount of novel engineering, as would the DEMO project if it happens, but subsequent power plants would adapt the demonstrated designs to each location with far less design work. It also includes literally building the factories, both on-site and in some of the contributing countries, that will be making the major components of the plant. The amount of superconducting material used and the size of the cryogenic and vacuum components are also essentially one-of-a-kind at present, and there's significant economy of scale to be expected if we can make the transition to producing numerous copies of a functional design, instead of every major component being unique.

In short, this is closely related to the argument people were having 7-8 years ago when solar PV installs cost $8/Watt, and certainly 15 years ago when they were $12/Watt. Those arguments have started to sputter out, now that installed costs are around $3/Watt.

Current Light Water Reactor projects seem to be getting priced at about $6-8/Watt, and I think are mostly funded by private, for-profit utilities. Of course, their investors are protected to a large degree by permissive rate review boards, but if their costs were to get too far out of line with neighboring utilities, the boards would tamp down on rate rises and their profits would fall, so I suspect $6-8/W is competitive for new capacity in the segment of the electricity market nuclear plants sit in. With less complex fuel handling, disposal, and end-of-life decommissioning issues, I strongly suspect fusion plant costs will be even more front-loaded than fission plants, allowing them to carry slightly higher up-front costs.

Comparatively, fusion Tokamak plants would be physically larger, but since they don't have to protect against the massive residual decay heat that fission by-products produce, their containment structures will be significantly less challenging, which helps somewhat to offset the cost of their greater complexity.
 
Yeah. Ok. I guess I don't think that these fusion projects are one step away from a design that can function at 50% cf for 20 years, let alone 90% cf for 60 years. Fission plants are not financially viable at $6-8/W, and that includes a govt-funded accident insurance plan (a subsidy). And they are made out of steel and concrete, which are two very well understood and rather inexpensive materials. And they don't have massive 12T superconducting magnets in building-sized 4K cryostats. And we don't yet know what the fusion vessel should be made of, or how the superconductors will age under neutron bombardment, etc. But you think building all that unique stuff, from materials we haven't even validated or discovered yet, will cost about the same as a concrete containment vessel...so fusion should be about $8/W too? Even if that's the price, then it, like fission, is not financially viable.

Of course, you are saying that PV used to be $8/W too...but with PV we had a nice straight learning curve from $100/W in the 70s down to $8/W that predicted that PV would be affordable (<$1/W) by the time it reached reasonable scales....exactly as it did. And that was a well understood material (silicon) and devices (PV modules and glass) that had excellent, demonstrated service lifetimes in the field, mostly just exposure to easily simulated weather, rain and solar UV, as opposed to high temps and high X-ray and neutron fluxes.

Maybe we will someday have a learning curve for fusion plants....but until we know what materials they need to be built of, or even have a working unit or design, I don't think you can invoke learning curves and economies of scale.
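For what it's worth, the PV learning curve mentioned above is usually written as an experience curve: cost falls by a fixed fraction with every doubling of cumulative production. A minimal sketch, where the ~20% learning rate and the starting point are my own assumptions for illustration, not numbers from this thread:

```python
import math

# Simple experience-curve sketch: cost falls by a fixed fraction (the learning
# rate) for every doubling of cumulative production. The 20% rate and the
# starting point are assumed round numbers for illustration only.
def experience_curve_cost(start_cost, start_cumulative, cumulative, learning_rate=0.20):
    doublings = math.log2(cumulative / start_cumulative)
    return start_cost * (1 - learning_rate) ** doublings

# Starting from $8/W, roughly ten more doublings of cumulative capacity at a
# 20% learning rate would bring costs below $1/W.
print(round(experience_curve_cost(8.0, 1.0, 2 ** 10), 2))   # ~$0.86/W
```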
 
Good discussion but off-topic. This is interesting and well worthy of a separate thread.
 