I posted this on another forum, but I thought I would put it here as well to generate some controversy.
This is a quick test you can do with a boiler that contains a fair amount of water such as the Optimizer 250. Ok, I wouldn't say it's particularly quick, but no good (accurate) experiments are.
With the fire out, drop the water temperature to 140°F, then weigh and burn a known amount of wood (for the Optimizer 250 I found that 20 lbs. was a good amount) and see what the temperature increase is. Do this with the heat load off the stove, but leave the circulator on so the water stays evenly mixed. I have found that around 20 lbs. of wood raises the 240 gallons of water approximately 40 degrees, from 140°F to 180°F.
Use softwood for this experiment so the wood burns up fairly quickly. Don't worry about the last bit of small hot coals as they don't hold much BTU compared to the rest of the wood. You can also use this type of test to see what the true BTU output of your stove is by timing how quickly it burns the wood.
So here are my calculations:
BTU's to heat the water: 240 gallons X 8.35 lbs. per gallon X 40 degrees = 80160 BTU (water's specific heat is 1 BTU per lb. per degree F).
BTU's to heat the metal and masonry in the stove: 1500 lbs. X .12 BTU per pound per degree (specific heat of steel) X 40 degrees = 7200 BTU.
Total BTU's required: 80160 + 7200 = 87360.
Wood has 6191 BTU per pound at 20% moisture content, so the load holds 6191 BTU/lb. X 20 lbs. = 123820 raw BTU.
So the efficiency calculation using 20 lbs. is 87360/123820 = 0.7055.
So the efficiency is around 70%.
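The arithmetic above is easy to re-run for your own stove. Here it is as a short Python sketch (the water volume, metal mass, and heating value are my numbers; swap in yours):

```python
# Efficiency math from the test above (values from my Optimizer 250 run).
GAL_WATER = 240        # boiler water volume, gallons
LB_PER_GAL = 8.35      # weight of water, lbs per gallon
STEEL_LBS = 1500       # estimated metal/masonry mass, lbs
CP_STEEL = 0.12        # specific heat of steel, BTU/(lb*F)
DELTA_T = 40           # temperature rise, 140F -> 180F
WOOD_LBS = 20          # wood burned, lbs
BTU_PER_LB = 6191      # wood heating value at 20% moisture, BTU/lb

btu_water = GAL_WATER * LB_PER_GAL * DELTA_T   # water's specific heat is ~1 BTU/(lb*F)
btu_steel = STEEL_LBS * CP_STEEL * DELTA_T
btu_needed = btu_water + btu_steel
btu_in_wood = WOOD_LBS * BTU_PER_LB
efficiency = btu_needed / btu_in_wood
print(f"{btu_needed:.0f} / {btu_in_wood:.0f} = {efficiency:.4f}")  # 87360 / 123820 = 0.7055
```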
You could run this experiment multiple times to get a pretty accurate rating for your stove. Overall I think I comfortably get around 70% honest efficiency average from the Optimizer 250. It may be as high as 75% under some conditions.
If you don't have a large water capacity stove, you could run the same experiment drawing off a set amount of heat if you measure the flow and delta T of your load.
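For the flow-and-delta-T version, the math is the standard water-side heat equation. A quick sketch (the 10 GPM and 20-degree drop are made-up example numbers, not measurements):

```python
# Heat drawn off via flow and delta-T.
# BTU/hr = flow (gal/min) * 60 min/hr * 8.35 lb/gal * 1 BTU/(lb*F) * delta-T,
# which is the familiar shortcut BTU/hr ~= 500 * GPM * delta-T.
def btu_per_hour(gpm, delta_t_f):
    return gpm * 60 * 8.35 * delta_t_f

# Example: 10 GPM with a 20F drop across the load:
load = btu_per_hour(10, 20)
print(f"{load:,.0f} BTU/hr")  # 100,200 BTU/hr
# Multiply by run time in hours to get total BTUs drawn, then divide by
# the raw BTUs in the wood burned over the same period for efficiency.
```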
Off topic, but for the REAL nerds in the group: using molecular weights, I have calculated that burning 1 lb. of wood (at 0% moisture) will create 0.6622 lbs. of H2O and 1.70 lbs. of CO2 (obviously consuming 1.3622 lbs. of oxygen from the atmosphere).
That's a lot of water generated for potential condensation corrosion!
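Those figures can be sanity-checked with a mass balance. This sketch takes the two product masses above as given, backs out the oxygen drawn from the air and the elemental makeup of the wood they imply, and confirms the oxygen atoms balance:

```python
# Mass-balance check of the combustion figures above (per 1 lb of dry wood).
H2O_LB, CO2_LB = 0.6622, 1.70

# Conservation of mass: O2 from air = total products - wood burned.
o2_from_air = H2O_LB + CO2_LB - 1.0
print(f"O2 consumed: {o2_from_air:.4f} lb")  # 1.3622 lb

# Elemental composition of the wood implied by those products:
carbon   = CO2_LB * 12 / 44   # all carbon ends up in CO2
hydrogen = H2O_LB * 2 / 18    # all hydrogen ends up in H2O
oxygen   = 1.0 - carbon - hydrogen
# Roughly 46% C, 7% H, 46% O, which is in the right ballpark for dry wood.
print(f"C {carbon:.3f}  H {hydrogen:.3f}  O {oxygen:.3f} lb")

# Cross-check: oxygen atoms in the products must equal wood O plus air O2.
o_in_products = H2O_LB * 16 / 18 + CO2_LB * 32 / 44
assert abs(o_in_products - (oxygen + o2_from_air)) < 1e-9
```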
Also, when it is tuned for efficient operation I have found the true output of the stove to be 125,000 BTU/hr absolute maximum. It is officially rated for 250,000 BTU/hr, but I suspect that rating reflects wood consumption when running wide open, not actual BTU's delivered into the water.
Still enough BTU's for my purpose, but I'm guessing all manufacturers are guilty of this practice.
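The consumption-vs-output distinction is easy to see numerically. A sketch, where the one-hour burn time is an illustrative assumption (time your own load, per the earlier note about softwood):

```python
# Input rate vs. output rate from a timed burn.
# The burn time here is an assumed example, not a measurement.
WOOD_LBS, BTU_PER_LB, EFFICIENCY = 20, 6191, 0.70
burn_hours = 1.0  # how long the weighed load actually takes to burn

consumption_rate = WOOD_LBS * BTU_PER_LB / burn_hours  # what a "wood in" rating measures
output_rate = consumption_rate * EFFICIENCY            # what the water actually sees
print(f"input {consumption_rate:,.0f} BTU/hr, output {output_rate:,.0f} BTU/hr")
```

A rating based on wood consumed will always overstate water-side output by the inverse of the efficiency, which is consistent with a 250,000 rating delivering far less to the water.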
FYI, I'm not bashing P&M in any of my posts. I think they are one of the best out there and would absolutely recommend them over others.
Thoughts? Am I doing this wrong?