110 vs 220V

I seem to remember reading somewhere that 220 was more economical than 110.
Any electricians in da house?
 
1000W is not much of a load. You will run twice the amperage at 110 as at 220 and may have slightly more line loss. It depends; if you go much distance from the source, 220 could be more economical, but probably not by much at 1000W. I like to run 220 when I can, just because the wire can be smaller for the lower amperage.
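
A quick back-of-the-envelope sketch (Python; just the I = P/V arithmetic, nothing specific to the OP's setup) of the current each voltage needs for that load:

    # Current needed for a 1000 W load at each voltage (I = P / V)
    power = 1000.0                      # watts
    for volts in (110.0, 220.0):
        amps = power / volts
        print(f"{volts:.0f} V -> {amps:.1f} A")
    # ~9.1 A at 110 V vs ~4.5 A at 220 V: half the current at the higher voltage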

Mike
 
daveindigby said:
I will be running 1000W for 12 hours per day and have the option of
using 110 or 220V. Does it make a difference? Is one way more economical
than the other?

The answer is "no", there is no difference in regard to economy. Using the lower voltage needs larger wires to prevent line-loss, that's all.
 
jdemaris said:
daveindigby said:
I will be running 1000W for 12 hours per day and have the option of
using 110 or 220V. Does it make a difference? Is one way more economical
than the other?

The answer is "no", there is no difference in regard to economy. Using the lower voltage needs larger wires to prevent line-loss, that's all.

Not so fast - you need twice as much current to get 1000 watts at 110 as you do at 220, so the resistive line losses are about four times as great in a given wire size (loss goes as the square of the current). Of course, on the 220 side you might choose to use smaller wire to match the smaller current, and then the losses end up similar. The savings, or the economy, of using 220 is smaller wire, and wire is very expensive.
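
To put rough numbers on it - assuming something like a 50 ft run of 14 gauge copper at roughly 2.5 ohms per 1000 ft (a hypothetical example, not the OP's actual wiring):

    # Resistive line loss (P_loss = I^2 * R) for a 1000 W load on a 50 ft run
    # of 14 AWG copper; the 2.5 ohms/1000 ft figure is approximate, and the
    # length is doubled for the return conductor.
    r_wire = 2 * 50 * 2.5 / 1000            # ~0.25 ohms round trip
    for volts in (110.0, 220.0):
        amps = 1000.0 / volts
        loss = amps ** 2 * r_wire
        print(f"{volts:.0f} V: {amps:.1f} A, ~{loss:.0f} W lost in the wire")
    # Loss scales with the square of the current, so it's ~4x higher at 110 V,
    # but still only ~21 W vs ~5 W here - a percent or two of the load either way.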
 
If you wanna get right down to cases, 110 and 220 were 30 years ago; these days it's 120/240. But then again... 220, 221, whatever it takes.
 
Highbeam said:
Not so fast - you need twice as much current to get 1000 watts at 110 as you do at 220, so the resistive line losses are about four times as great in a given wire size (loss goes as the square of the current). Of course, on the 220 side you might choose to use smaller wire to match the smaller current, and then the losses end up similar. The savings, or the economy, of using 220 is smaller wire, and wire is very expensive.

There are no line losses worth mentioning with 120 or 240 as long as the proper size wire is used. And the poster mentioned 1000 watts, which is a low draw and easy to wire either way. Now, if he had an extremely long run, it might be cost-effective to choose the higher voltage and smaller wiring - but otherwise, it probably makes no difference.

The question was... about 220 being more efficient - in regard to overall energy use, I assume - and no, it is not. Any circuit, regardless of voltage, should be wired for a 5% voltage drop or less. Just pick the correct wire to have less than 5% loss under the anticipated load, at whatever voltage you choose.
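
A sketch of that rule of thumb, using approximate copper resistances and a made-up 100 ft example run:

    # Check a circuit against the ~5% voltage-drop rule of thumb.
    # Approximate resistance of copper wire, ohms per 1000 ft of one conductor.
    OHMS_PER_KFT = {14: 2.5, 12: 1.6, 10: 1.0}

    def voltage_drop_pct(volts, watts, run_ft, awg):
        amps = watts / volts
        r = 2 * run_ft * OHMS_PER_KFT[awg] / 1000   # out and back
        return 100 * amps * r / volts

    # 1000 W on a 100 ft run of 12 gauge:
    print(round(voltage_drop_pct(120, 1000, 100, 12), 1))   # ~2.2% - fine
    print(round(voltage_drop_pct(240, 1000, 100, 12), 1))   # ~0.6% - even better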

Also, in regard to the other mention of 110 actually being 120: to be technical, grid power that's called "120 VAC" actually reaches about 170 V at the high and low peaks of the sine wave - but 120 VAC is the effective (RMS) voltage, which is what matters for power.
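
For the curious, that 170 figure is just the 120 V RMS value times the square root of two:

    import math
    v_rms = 120.0
    v_peak = v_rms * math.sqrt(2)   # ~169.7 V at the crest of the sine wave
    print(round(v_peak, 1))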
 
You do not pay for current, you pay for energy.

;-) Excuse me while I put on my physics teacher voice...

Amperage x Voltage = Wattage

Wattage is a unit of power, or energy per unit time. So, we multiply by time to get energy. We have a defined standard wherein we use 1000 watts, or a kilowatt, and a unit time, the hour. Hence we get the kilowatt-hour, or kWh.

Another common unit for energy is the joule. A kWh is 3.6 megajoules... 3,600,000 joules.

It does not matter whether you have...

8.33 amps x 120V = 1000 watts
or
4.55 amps x 220V = 1000 watts

In the end, you have the same rate of power usage, and therefore, the same amount of energy used in a given time.
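
Putting numbers on the original question (1000 W for 12 hours a day) - the electricity rate here is just a made-up example:

    # Energy billed is power x time, independent of the supply voltage.
    watts = 1000.0
    hours_per_day = 12.0
    kwh_per_day = watts * hours_per_day / 1000      # 12 kWh per day
    rate = 0.15                                     # hypothetical $/kWh
    print(f"{kwh_per_day:.0f} kWh/day, about ${kwh_per_day * rate:.2f}/day")
    # Same 12 kWh whether the meter sees ~8.3 A at 120 V or ~4.2 A at 240 V.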


A little deeper,
The fundamental equations here are Ohm's Law (I=V/R) and Joule's law (P=IV). We can substitute one into the other to get:
P=V^2/R (V squared/R)
The resistance of a wire is a fundamental property of the material, varying slightly with temperature changes that we experience in day-to-day life. Devices are given a wattage rating at a set voltage. So, a 1000 watt device at 120V...
1000W = (120V^2)/R tells us the resistance of the device is R = 14.4 Ohms.
Crank up the voltage to 240V. Since R doesn't change, the wattage must.
P = (240V^2)/14.4Ohm... P = 4000 watts

Unless the device has electronics that adapt to the incoming voltage, a device will consume energy at a greater pace when given the higher voltage. If the device has such adaptive electronics (or simply a switch on the side where you tell it 120V or 240V), then the device will lower its amperage draw at the higher voltage, keeping its wattage the same, and hence energy consumption also the same.

Bottom line... a standard device uses more energy at a higher voltage. A device designed to handle both voltages will use the same amount of energy regardless.
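
The same arithmetic as a small sketch, for a purely resistive element rated 1000 W at 120 V:

    # Fixed resistance derived from the 120 V rating, then fed 240 V.
    rated_watts, rated_volts = 1000.0, 120.0
    r = rated_volts ** 2 / rated_watts          # 14.4 ohms
    for volts in (120.0, 240.0):
        print(f"{volts:.0f} V -> {volts ** 2 / r:.0f} W")
    # 120 V -> 1000 W, 240 V -> 4000 W: a plain resistive element quadruples its
    # draw at double the voltage (and in practice will quickly burn itself out).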
 
Just last week I went through this research as I had some electric heat installed in my basement. My research agrees with what has been said here. My electrician was telling me 220 would be far more efficient, but I overruled him on that :).

The type of heater and the number of watts is typically the key. I went with three 1500 watt heaters. Two are 120 V hard-wired convection style, and they heat their spaces up very quickly. The third is a 240 V hydronic baseboard electric heater, also hard wired in. It definitely seems to heat up more slowly, but it produces nice even heat, and it is silent. It also seems to cycle less, presumably because the fluid in the unit stays warm and radiates heat. All three units use nearly the same amount of energy to maintain a given temperature.
My understanding is that I would have had to step up to 240 volts if I had wanted anything more than 1500 watts, but for these spaces 1500 watts seems like it is going to work out well.
 
mkmh said:
Just last week I went through this research as I had some electric heat installed in my basement. My research agrees with what has been said here. My electrician was telling me 220 would be far more efficient, but I overruled him on that :).

The type of heater and the number of watts is typically the key. I went with three 1500 watt heaters. Two are 120 V hard-wired convection style, and they heat their spaces up very quickly. The third is a 240 V hydronic baseboard electric heater, also hard wired in. It definitely seems to heat up more slowly, but it produces nice even heat, and it is silent. It also seems to cycle less, presumably because the fluid in the unit stays warm and radiates heat. All three units use nearly the same amount of energy to maintain a given temperature.
My understanding is that I would have had to step up to 240 volts if I had wanted anything more than 1500 watts, but for these spaces 1500 watts seems like it is going to work out well.

What brand of hydronic are you using, and do you think it's any better or worse than the plain electric ones?
I am thinking of it for backup heat and shoulder-weather heat.
 
Electric companies bill by the kilowatt-hour, not the amp. Volts times amps gives watts, and watts used over the metered time give kilowatt-hours. 120 volts at 20 amps or 240 volts at 10 amps still results in the same figure.
 
Hogwildz said:
What brand of hydronic are you using, and do you think it's any better or worse than the plain electric ones?
I am thinking of it for backup heat and shoulder-weather heat.

I went with Fahrenheat, which is sold at the big box stores and NorthernTool.com.
I'm not convinced the hydronic feature is worth the added cost, but the heat does seem to be nice and even. Unfortunately it is quite slow to heat up from a cold start; the convection ones will warm up a room much quicker. So I would say hydronic is a good choice if you are looking to maintain a fairly constant temperature.
 
Okay, running efficiency is the same, assuming that the wire is sized to match the current load. If we assume the wire is sized that way, then we can look at the installation cost, where the savings are in favor of the 220 system. Two equal-wattage appliances can be had in 110 or 220 volts: the 30 amp (110 volt) version uses 10 gauge wire and the 15 amp (220 volt) version uses 14 gauge, and those wires have significantly different costs per foot. The next consideration is the panel. Your typical modern home has a 200 amp panel, and you can get twice as many of the 15 amp circuits through a 200 amp main breaker and service, assuming your panel has the physical space for the double-wide 220 breakers.
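
A rough sketch of that comparison for a hypothetical 3300 W appliance, using the common 15 A/14 gauge, 20 A/12 gauge, 30 A/10 gauge pairings (and ignoring the 80% continuous-load derating to keep it simple):

    # Breaker and wire size for the same wattage at each voltage.
    MIN_AWG = {15: 14, 20: 12, 30: 10}     # breaker amps -> wire gauge

    def circuit_for(watts, volts):
        amps = watts / volts
        breaker = min(b for b in MIN_AWG if b >= amps)
        return amps, breaker, MIN_AWG[breaker]

    for volts in (110.0, 220.0):
        amps, breaker, awg = circuit_for(3300, volts)
        print(f"{volts:.0f} V: {amps:.1f} A -> {breaker} A breaker, {awg} gauge wire")
    # 110 V: 30.0 A -> 30 A breaker, 10 gauge
    # 220 V: 15.0 A -> 15 A breaker, 14 gauge (the cheaper wire per foot)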

I like my little electric Cadet-style heaters. I would use standard 12-2 wire and then have the flexibility to upsize the heaters, or even switch voltage, if needed.
 
Efficiency may depend somewhat on what the load is... I've been told that for a given mechanical load (e.g. an air compressor) an electric motor that can be wired for either 110 or 220 will be SLIGHTLY more efficient when running on 220.

For a 1,000 watt load, installation costs should be about the same - that's well within the specs for a 15A 110v circuit wired with 12 or 14 gauge, and code doesn't allow permanent wiring smaller than 14 gauge regardless of the current load. The only difference might be in the termination hardware (e.g. breakers and outlets), where the 110v stuff is usually cheaper.

One other factor might be the cost of the load itself - does the item cost more in a 110 volt version than a 220 volt version, or vice versa? What about any replacement parts - e.g. if it's a light, how much are the bulbs in either voltage?

Gooserider
 
Gooserider said:
Efficiency may depend somewhat on what the load is... I've been told that for a given mechanical load (e.g. an air compressor) an electric motor that can be wired for either 110 or 220 will be SLIGHTLY more efficient when running on 220.

I've heard many rumours, or unverified claims, including something like what you mentioned. I've heard many times that motors that have to start hard (usually capacitor-start motors) have a little more starting torque at the higher voltage. But the specs don't show it, so I'm doubtful.

If you go shopping for electric motors and read the test specs, they come with efficiency ratings at each voltage - kind of like the EPA gives us gas mileage figures for cars, and Energy Star gives us ratings on appliances and energy consumption. At present, the EPA does regulate efficiency on some industrial electric motors.

Electric motors are rated by how much of the electricity does usable work versus how much gets wasted as heat. I've never yet seen one that has a higher rating at 240 volts than at 120; all I've seen are rated exactly equal. Just about all electric motors, though, are less efficient when used for jobs that are small for their size. For example, a 5 horse motor used all the time to produce 1/4 horse throws off more heat than it ought to; electric motors are designed to be most efficient when run at 75% of load capacity or more. A motor of 5 horse or less tends to run at 84-85% efficiency if it's a "standard" motor, regardless of whether it's 120 VAC or 240 VAC. If a lot of extra money is spent on a "high efficiency" motor of 5 horse or less, the figure climbs to 87-88% (not a whopping difference).

Big motors do better. A standard 50 horse motor runs around 91% efficiency, whereas a premium model runs up around 94%.
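
To put those percentages in perspective (taking 1 hp as 746 W, and treating the efficiency figures above as given):

    # Input power and waste heat for a motor delivering 5 hp at a given efficiency.
    def motor_input_watts(hp_out, efficiency):
        return hp_out * 746.0 / efficiency

    for label, eff in (("standard 5 hp", 0.85), ("premium 5 hp", 0.875)):
        p_in = motor_input_watts(5, eff)
        waste = p_in - 5 * 746.0
        print(f"{label}: {p_in:.0f} W in, ~{waste:.0f} W lost as heat")
    # ~4388 W vs ~4263 W of input - the premium motor saves on the order of
    # 100 W at full load, and the figure is the same at 120 V or 240 V.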
 
I posed this question to a lineman who works for our provincial power company, and he stated that an appliance using, say, 1000w will meter slightly differently running off 120v than off 240v. The reason is that the meter is designed to sense both legs of the system at the same time (240v). When there is a significantly higher load off one side of the meter than the other, it has a more difficult time sensing the load and is less accurate.

Apparently it's barely measurable in terms of real cost. Pennies a year that sort of thing.

It is important when wiring the house to balance the loads in your panel across both 120v legs of your service, by spreading the larger 120v loads - the fridge, dishwasher, furnace, washing machine, garage plugs, etc. - between them.

At the end of the day it makes almost no difference whether you choose 120v or 240v for a small load like 1000w. I like wiring things for 240v when I have the choice so I don't have to run as large a gauge of wire as I would for 120v (like my electric baseboard heat).
 