220 Volt Living


why 110 volts?...

My understanding of the history of 110V supply in the USA is that the first power distribution was at 110V DC; the voltage was chosen because it was believed (whether correctly or not) that 110V was the highest DC voltage that could not cause electrocution. I read many years ago, can't remember where, that around the late 1800s and early 1900s, 110V was considered the highest "safe" DC voltage and 32 Volts was the highest "safe" AC voltage, which is also why most single-home lighting plants in rural areas used 32 Volts - though the plants were mostly DC, some were AC.
Edison supplied 110V DC, but the limitations of DC technology at the time meant the electricity had to be generated at the same voltage as the final supply - there was no way to step a higher DC voltage down to a lower one - so the DC power stations could only supply a radius of about a mile. Power plants were thus small and everywhere.
Westinghouse developed AC systems that generated and distributed power at several thousand volts AC and used transformers at the end to step down to the final voltage - also 110V as I understand it. (Transformers work on AC only.) Westinghouse's system supplied power much more cheaply, and they were winning contracts left and right to supply new areas.
Edison's company and their allies fought back, advertising that AC was lethal and DC was safe, including public demonstrations of shocking stray dogs with first DC (they survived) and then AC (they died). There were genuinely many deaths from AC in the early days, in part because there was little regulation and high voltage AC transmission wires were often run close to low voltage telegraph lines, so shorts between the two systems led to linemen on the telegraph wires being electrocuted. Edison's very heavy DC supply cables were run underground.

Interestingly, Edison's allies worked to make sure that the first electric chair would be supplied by Westinghouse generators, and when there was discussion of what to call "execution by electricity", they suggested the term "Westinghoused." (Dynamort and Electromort were also suggested, before "electrocuted" was chosen as the preferred term.)

Over time AC systems were made safer and became the main standard, but the initial selection of 110 Volts DC by Edison led to the standard US voltage remaining 110 Volts even as areas changed to AC.

This has cost the US millions of dollars over the years in heavier cables, higher currents and shorter distances covered after each transformer.
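To put rough numbers on that: for the same delivered power, halving the voltage doubles the current, and cable heating losses go with the square of the current. A minimal sketch in Python - the load power and cable resistance are made-up illustrative figures, not real distribution data:

# Why lower distribution voltage costs more in copper and losses.
# Load power and cable resistance are made-up illustrative figures.
def line_loss(load_watts, volts, cable_ohms):
    amps = load_watts / volts              # current needed to deliver the load
    return amps, amps ** 2 * cable_ohms    # I^2 * R heating loss in the cable

for v in (110, 220):
    amps, loss = line_loss(load_watts=1500, volts=v, cable_ohms=0.5)
    print(f"{v:>3} V: {amps:5.1f} A, {loss:6.1f} W lost in the cable")

# Doubling the voltage halves the current and quarters the cable loss,
# or lets you use thinner (cheaper) conductors for the same loss.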

Other countries that adopted electricity later chose to double the supply voltage for efficiency reasons. To this day we still have a mess of different voltages and frequencies around the world.

The link below tells the interesting tale of the "War of the Currents", the battle between Edison DC and Westinghouse AC in the early days of electric supply.

 
To be quite honest, Edison was the salesman of his day. He was clever, but above all he was very good at packaging and marketing. In modern times, someone like Musk, for example, has a lot of echoes of Edison's showmanship.

 

DC was extremely impractical for distribution. You could not transform the voltage up or down without motor-generator sets, which are inefficient, bulky and noisy. You also have a lot of issues around the types of arcs it generates in switches and at sockets (which limit the voltage), and galvanic effects where it can pull metal ions from contact to contact and either strip a contact surface or electroplate it.

 

DC is also quite cumbersome for electric motors, whereas AC can drive them directly. It was easier for big traction motors on trains though, as you could vary the voltage to control speed.

 

In the US context, everything happened around Edison and he was the big noise in power. In Europe, there were others, notably big companies like Siemens, but there was no single system or philosophy for doing things, certainly not until after WWII and the beginning of pan-European standards for some of these things.

 

For example, the Deptford Power Station in London, which was driven by Sebastian de Ferranti's ideas, generated AC back in 1888 at 88 1/3 Hz.

 

There were environmental objections to large-scale power infrastructure and motor-generator sets in London - the converters were noisy and the small coal-burning plants were smelly - so it was one of the first cities in the world to develop a HV power grid, distributing at 11kV 88 1/3 Hz, with domestic, residential and commercial customers being provided with single-phase AC connections.

 

AC was also being used in Paris around 1878 - there were AC power systems installed at a Paris Expo, mostly for public lighting.

 

This allowed a big plant to be located out of town, avoiding a lot of the annoyance over the prospect of ugly infrastructure close to central London.

 

Other cities adopted other systems, including Edison-style DC, but there were a number of standards that were all in existence simultaneously.

 

Some places adopted 3-wire 127V / 220V systems which would be quite comparable to the present day US system. Some adopted DC systems and others adopted systems like 200V 50Hz, 210V 50Hz and ultimately 220V 50Hz.

 

Here in Ireland the power system was nationalised in 1927, with the local power companies being gradually bought up by ESB, standardised and linked to the grid and they picked 220V (380V 3-phase) 50Hz. (There had also been 200 and 210V 50Hz systems in use too.)

 

50Hz seems to have just been a convenient standard, as it allowed a balance between flicker-free incandescent bulbs and not having to run the generators at very high RPM. It's also 100 peaks / troughs per second, which fits with the logic of a metric, base-100 way of thinking.

 

60Hz came about for exactly the same reasons, and was possibly landed upon because 60 cycles per second echoes 60 seconds in a minute, 60 minutes in an hour, etc.

 

They're quite arbitrary.
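The frequency does tie directly to generator speed though: for a synchronous machine, RPM = 120 x frequency / number of poles. A quick sketch (the pole counts are just examples):

# Synchronous speed of an AC generator: RPM = 120 * frequency / poles.
# Pole counts below are just examples.
def sync_rpm(freq_hz, poles):
    return 120 * freq_hz / poles

for f in (50, 60):
    for p in (2, 4):
        print(f"{f} Hz, {p}-pole machine: {sync_rpm(f, p):.0f} RPM")

# 50 Hz gives 3000 / 1500 RPM, 60 Hz gives 3600 / 1800 RPM - both comfortably
# achievable without spinning the machines absurdly fast.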

 

A lot of older European fittings and fixtures are still rated for AC/DC. So, you'll typically see ratings like 16 Amp AC / 10 Amp DC, because there's a difference in safe current-carrying and arcing characteristics between the two.

 

The reason for the switch on UK sockets was also to do with DC arcs on 200V systems.  There was a preference for switching off the power before removing the plug, to prevent arcing and thus avoid damage to pins / contact surfaces etc. There were even types that had a pin with a groove that was locked in place when the switch was on, and only removable when the power was switched off.

 

But there were umpteen different companies working in competition and harmonised standards only really emerged for all sorts of things in the 1940s. Everything was developed commercially and there were competing designs of plugs, sockets, bulb holders, wiring systems, voltages and frequencies. There's a lot of discussion that seems to talk about 'a European' approach vs 'an American' approach but in reality there were just multiple companies all doing their own thing.

 

The market ultimately needed standards, as otherwise appliances were very very complicated to produce. So, they emerged and were formalised and ultimately legislated for.

 

There's also a discussion that seems to assume Europe moved from 110-120V to 220/230V. That's also not exactly true. 220V 50Hz was always used in some areas, and was introduced in others in the 1940s/50s. So, it very much depends on where you are.

 

The basic reality of it is that 220V (now 230V) 50Hz ultimately won the standards wars in Europe and, for the most part, so did Schuko plugs/sockets, with a few outliers (including here).

 

Westinghouse first began experimenting with AC power in the US using already developed technology. He imported a number of Gaulard-Gibbs transformers, which had been developed by Lucien Gaulard (France) and John Dixon Gibbs (England), and a large Siemens AC generator. His AC systems basically sprang from there.

 

The downside to AC, in the 50-60Hz range anyway, is that the cycling can trigger muscle contractions, which can cause your hand to grip a conductor, and it can cause ventricular fibrillation.

 

DC causes far worse burns and tissue damage though as the current flows through continuously and it can also cause weird effects like electrolysis.

 

Both of them are pretty nasty at high current/voltage, but I wouldn't really give DC the safety seal of approval that Edison's marketing claimed for it.

 

The 230V 50Hz system is pretty rock solid and extremely safe, provided you use it with the correct equipment, and fixtures and fittings have evolved to become safer and safer over the decades.

 
I've been a long-time "electrification" fan, but it wasn't until a book I read recently (Edison & The Electric Chair - Mark Essig) that I learned of the sheer number of people getting electrocuted on the streets and in their own homes during the 1870s and '80s. We forget Edison wasn't the only game in town in the early days, with each 'company' putting up their own poles and stringing their own wires wherever they pleased, typically for industrial or arc-light use where AC was frequently used, and in either case, high voltage.

There were reports of shopkeepers getting killed when their awning poles would brush against bare wires, reports of "abandoned" circuits that turned out to still be live, rogue power wires dropping or rubbing against phone circuit wiring and electrocuting people in their homes. And then the poor saps working for the myriad phone companies (this was pre-monopoly) and telegraph services, wading through all these lines. It was an epidemic and made electrifying the home a tough sell.

To Edison's credit, he insisted all of his circuits be buried under the streets, which was at his expense. And wire insulation, cable troughs, fusing and protection, underground interconnects to buildings, etc. etc. also had to be developed and tested. No small undertaking! Despite NY passing a law that any future installations had to be buried, few others actually complied, and it wasn't until the blizzard of '88 that things changed. I forget the figure now, but there were hundreds, if not thousands of miles of abandoned wire cut down afterwards.

The other reason that DC was so popular is that electricity was primarily used in industry for mechanical power, and no AC motors existed at the time, only DC. Electroplating was another big business, which also is DC-only. Lighting doesn't care, but then there's no flicker to worry about, especially compared to low frequency generation (Niagara Falls 25Hz, anyone?). So even though AC could be distributed great distances, it was only after Westinghouse offered a motor that AC really became a viable option, and it's why many older large cities continued to offer a DC service up until a few years ago just for elevators and traction motors.

I'd still like to know the logic behind 110V. I've been hit with it enough to know I prefer it to 240V :) , but there was A LOT of investigation put into the effects of voltage levels, types, and current levels all through the 'electric chair' era (not just by Edison) to try to make capital punishment "more humane". Very much in line with the era's fashion for the "rise of morality, clean living, progressiveness & purity", etc.

 
The USA is largely 110v/120v due to monopolies and business agreements, really. There isn't now, nor was there then, any real reason or benefit to remaining with 120v power at 60Hz, but as electricity spread across the nation that standard was adopted and things flowed from there.

Post WWII much of Europe was in or near ruins. This required not only rebuilding infrastructure but also housing. Decisions were made to go with 208v-240v (or in some cases higher) power at 50Hz as the standard for domestic purposes. Prior to WWII, and maybe for a bit after in some European countries, one could find all sorts of appliances that ran on 110v-120v power.

Keep in mind the United States is blessed with abundant natural fuel resources (oil and gas), which meant many things called upon for, say, heating used those sources instead of electricity.

Washing machines in North America were largely the top-loading variety that did not self-heat water. They got their hot water from taps fed by (usually) oil- or gas-fired storage water heaters. Instant water heaters were known in the USA going back to the early 1900s, but never really took off the way electric versions did and have in Europe.

As someone mentioned previously in this thread, watts is watts... The amount of energy required to, say, raise one gallon of water "X" degrees is constant. 208v-240v power will get you there faster, but overall the same amount of energy is used on average.
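To put a number on "watts is watts", here's a rough sketch - the water volume, temperature rise and heater wattages are illustrative assumptions, not appliance specs:

# Energy to heat water is fixed; the supply only changes how fast you deliver it.
# One gallon of water is about 8.34 lb, and 1 BTU raises 1 lb of water 1 degree F.
GAL_LB = 8.34
BTU_TO_WH = 0.293

def heat_time_minutes(gallons, delta_f, heater_watts):
    btu = gallons * GAL_LB * delta_f        # energy needed, independent of voltage
    watt_hours = btu * BTU_TO_WH
    return watt_hours / heater_watts * 60

for watts in (1000, 1800, 3000):
    print(f"{watts} W heater: {heat_time_minutes(1, 80, watts):.1f} min "
          "to raise 1 gallon by 80 F")

# Same ~195 Wh of energy every time - a bigger heater just gets there sooner.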
 
watts is watts

yes, but...

1. Lower voltage means heavier, more expensive cables.
2. Lower voltage means more amps for the same job.
3. Max wattage for a standard US power outlet is 1700 Watts, isn't it?
In Australia it is 2400 watts, in UK and Europe it is 3000 or 3600 watts, isn't it?
That means faster kettles, more effective irons, faster heating washing machines and dishwashers, and so on.

Re point 1, in the past I converted a few washing machines to run off 12 volts DC. They worked well, but even with special high-efficiency motors they needed big fat cables to work - each wire about as fat as a pencil. No self-heating, either.
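To illustrate the fat-cable point with some made-up but plausible numbers (the load power, cable run and allowed voltage drop are my assumptions, not measurements from those conversions):

# For a given power, current scales as 1/V, and the copper cross-section needed
# to keep the voltage drop acceptable grows rapidly as the voltage falls.
RHO_CU = 1.72e-8  # resistivity of copper, ohm*m

def wire_area_mm2(load_watts, volts, run_m, drop_fraction=0.03):
    amps = load_watts / volts
    allowed_drop = volts * drop_fraction
    max_resistance = allowed_drop / amps              # total loop resistance
    area_m2 = RHO_CU * (2 * run_m) / max_resistance   # out-and-back conductor
    return amps, area_m2 * 1e6

for v in (12, 120, 230):
    amps, area = wire_area_mm2(load_watts=500, volts=v, run_m=3)
    print(f"{v:>3} V: {amps:5.1f} A, needs roughly {area:6.2f} mm^2 of copper")

# The required cross-section scales with 1/V^2 - hence pencil-fat wires at 12 V
# (in practice safety codes also set minimum sizes at mains voltages).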
 
Gizmo, to your points, it's a little...tricky.

The copper is more expensive and heavier gauge for 120V, but since our homes have 240V available at the main panel, anything that uses substantial power like a clothes dryer, heat pump, water heater or heavy-duty window AC gets its own 240V feed. Where we do save copper is with our split-phase system. One need only run 12-3 wire with a hot from each bus bar and share the neutral, eliminating the extra conductor.
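A quick sketch of why the shared neutral works in a 120/240V split-phase system (the leg currents are just example figures):

# The two hots are 180 degrees apart, so with resistive loads the shared neutral
# only carries the DIFFERENCE between the two leg currents.
def neutral_current(leg_a_amps, leg_b_amps):
    return abs(leg_a_amps - leg_b_amps)

print(neutral_current(12, 8))   # 4 A on the shared neutral
print(neutral_current(15, 15))  # 0 A when the two legs are balanced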

The "tricky" part is that to keep things simple, most electricians run 14 gauge for lighting (15A) and 12 gauge (20A) for receptacles, so most houses built in the last 40-50 years have 2400W available on a circuit, but 15A receptacles are usually installed as they're cheaper (a box of 10 is roughly $5 USD). The NEC does have a requirement that kitchens, baths and dining rooms have 20A service and I think they require at least one 20A receptacle. That's standard for garages and outdoor recepts as well.

Electric kettles aren't really a thing here, and appliance makers never really took advantage of the 20A branch circuit capability most homes have, save for the very rare Amana Touchmatic III microwave from the early 80s, which nobody has ever heard of.

Outside the home, pretty much anything commercial is always done with 20-amp 5-20R receptacles.

BUT, back before 200A service was the norm, 15A was very common. And if you really wanted to run all of your kitchen smalls at once, a 240V "Kitchen Center" with individual outlets and cords with one feed made more sense.
 
Hello Gizmo!

1. Lower voltage means heavier, more expensive cables.

Generally true, but much older and even new construction is overbuilt when it comes to wiring. That is to say, heavier cable was used than required, especially in multi-family buildings where abuse was expected.

2. Lower voltage means more amps for the same job.

Suppose so, but wouldn't much also depend upon what job wanted doing?

3. Max wattage for a standard US power outlet is 1700 Watts, isn't it?

In theory, on a 120v circuit at 20 amps (standard for refrigerators, air conditioners, and other appliances with heavy power draw), the max is 2400 watts. At 10 amps the max goes down to 1200 watts.

Now the NEC does specify that a circuit breaker shouldn't carry more than 80 percent of the load for which it is rated unless the breaker is labeled otherwise. By this standard, the total current draw on a 20-amp circuit shouldn't exceed 16 amps.

To provide a margin of safety, the total draw on the circuit shouldn't exceed 16 amps at any one time, which translates to a maximum continuous power draw of 1,920 watts on a conventional 120-volt circuit. Again, the breaker won't trip or the fuse blow unless or until 2400 watts is reached, and maybe not even then if the overload is only momentary.
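Putting numbers on that 80 percent rule of thumb (this is just the arithmetic described above, not a substitute for the actual NEC text):

# Continuous-load limit: breaker rating x 0.8, times the nominal voltage.
def continuous_watts(breaker_amps, volts=120, derate=0.8):
    return breaker_amps * derate * volts

for amps in (15, 20):
    print(f"{amps} A breaker at 120 V: {continuous_watts(amps):.0f} W continuous, "
          f"{amps * 120} W absolute maximum")

# A 20 A circuit gives 1920 W continuous; a 15 A circuit only 1440 W.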

Case in point: older Miele washing machines that could be wired to run in North America at 110v/120v had two 1500 watt heater legs. Switching from 208v/240v down to 110v/120v merely disconnected one of those heater legs, so the machine only drew half of the total 3000 watts of heating power.

The latest Miele washers such as the W1 are 120v only, but heating power is only about 1000 watts. Yes, they use less water for washing than older washers, but heating tap-cold water to hot or boiling is a bit of work. Thankfully, modern washers can utilize both hot and cold water connections, thus filling with warm water if necessary to take the edge off the heating requirements.

You simply cannot have a "cold fill" only front-loading washing machine sold in the USA running on a 120v/20 amp circuit. My AEG washers allow ten minutes or so for heating water from tap cold to the set temperature, and they do so without fail for the most part.

OTOH my older Miele washer if run at 120v would likely take twenty or more minutes to bring tap cold water to 140F or higher.
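A rough sketch of why the heater wattage matters so much for wash times - the water volume and temperatures below are my assumptions, not Miele or AEG specs:

# Time to heat the wash water: Q = m * c * deltaT, then divide by heater power.
C_WATER = 4186  # J per kg per degree C; 1 litre of water is about 1 kg

def heat_minutes(litres, start_c, target_c, heater_watts):
    joules = litres * C_WATER * (target_c - start_c)
    return joules / heater_watts / 60

for watts in (1000, 2000, 3000):
    print(f"{watts} W heater: {heat_minutes(10, 10, 60, watts):.0f} min "
          "to take 10 litres from 10 C to 60 C")

# Roughly 35 / 17 / 12 minutes - which is why a 1 kW heater on 120 V feels slow
# next to a 3 kW heater on 230 V.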

It has been required by code here in the USA since about the 1960s that newly built homes or apartment buildings provide at least one 120v/20 amp circuit in the kitchen (usually for the refrigerator), and another near a window (for an air conditioner).

The 20 amp circuit in my kitchen serves both the fridge and the microwave. When both are running together you can often tell the microwave isn't getting its fair share of juice. Once the compressor on the fridge cycles off, the microwave oven is happy.

The largest air conditioner you can find here to run at 120v/20 amps is around 10,000 BTU of cooling capacity. Above that, things start going into 220v/240v territory.
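For context, BTU ratings are thermal (heat moved), not electrical input; a rough conversion sketch (the EER figure is an assumption for illustration, not a spec for any particular unit):

# 10,000 BTU/h of cooling is about 2,930 W of heat moved. Electrical draw is
# the BTU rating divided by the EER (BTU/h of cooling per watt of input).
BTU_H_TO_W = 0.293  # 1 BTU/h is roughly 0.293 W

def ac_figures(btu_per_hour, eer=10.0, volts=120):
    cooling_watts = btu_per_hour * BTU_H_TO_W   # heat moved
    input_watts = btu_per_hour / eer            # electricity drawn (EER assumed)
    return cooling_watts, input_watts, input_watts / volts

cool, draw, amps = ac_figures(10_000)
print(f"{cool:.0f} W of cooling, about {draw:.0f} W and {amps:.1f} A drawn at 120 V")

# Around 8-10 A running current, before the compressor's start-up surge.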

None of this touches on the difference in power demands between resistance loads (such as heating) and, say, the temporary spike caused by a motor starting.

My older ironer presses are rated 1400 watts. Some vintage rotary ironers were rated almost 1600 watts, again all on 120v power. The Miele B990 is only 1500 watts and doesn't go higher than about 320F temperature-wise. Far too low for heavy linen or cotton fabrics, especially if they are damp, or overly so.
 
There wasn’t really a decision to move Europe to 220-230V after WWII, rather that was already the most dominant standard in Europe before WWII. Some countries, very notably France, used 127V, 3 wire systems. Other countries, like Italy, had 127V for lighting and 220V for power, charged on two different meters. There were also legacy DC systems in some towns and cities, usually predating national grids.

Here in Ireland for example, you could happily go back in time to the 1920s, plug your modern 230V appliance in and it would work. The standard was 220V 50Hz by the mid 1920s. That had begun to emerge as the de facto preferred standard for supply.

After WWII you start to see the emergence of the precursor of the European institutions that eventually became the EU. Many of those were concerned with harmonising and making things easier for commercial activity across the European continent and that would have included founding standards bodies like CENELEC.

If that hadn’t happened you could have had multiple versions of everything from appliances to light bulbs.
 
UK / Europe 'Voltage Harmonisation'

I finally found the letter I received from my local electricity supplier in early 1995. I quote (in part)

" On 1 January 1995 a change was made to the Electricity Supply Regulations "

" For many years your electricity has been supplied within a range based on 240 Volts. In order to harmonise low voltage electricity supplies throughout Europe, from 1 January 1995 the permitted range changed and is now based on a 230 Volt supply. "

" Supply Voltage (and permitted variations) before 1 January 1995 240 Volts (225.6-254.4V) "

" Supply Voltage (and permitted variations) from 1 January 1995 230V (216.2-253.0V) "

Efforts were made (since the actual voltage variation was rarely anywhere near that wide) to maintain the supply at or near 240V. Even within the last 10 years, my supply has regularly been at around 246V.

I have just measured my supply Voltage. It is stable at 238V, which is quite average nowadays.
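Just to sanity-check the quoted bands and those readings (this is only the percentage arithmetic from the letter above):

# Tolerance bands: 240 V +/-6% before 1995, 230 V +10%/-6% after.
def band(nominal, minus_pct, plus_pct):
    return nominal * (1 - minus_pct / 100), nominal * (1 + plus_pct / 100)

print(band(240, 6, 6))    # (225.6, 254.4) - the pre-1995 range
print(band(230, 6, 10))   # (216.2, 253.0) - the post-1995 range

# Readings of 246 V and 238 V both sit comfortably inside the 230 V band,
# which is why nothing had to change on the ground.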

Hope this is of some interest in the ongoing discussion!

All best

Dave T

P.S. the 'post pictures' input box shows for only a few seconds at the bottom of the reply form, then disappears! I suspect that this may be a compatibility issue for my 'rather elderly' copy of Google Chrome, but it is preventing me from posting pictures of various adaptors... :-(

P.P.S. It disappears whether I try to click in it or not ;-)
 
Mine seems to hover around 225V - 232V, although the original spec here in Ireland was always 220V.

What I heard is that the firmly stated opinion that it's just a bureaucratic fudge to harmonise on paper isn't actually quite true. As transformers are replaced, the target voltage is 230V, so eventually you'll just see the 220 / 240V systems vanish and the tolerances reduced. There was just no obligation to unnecessarily replace old transformers; some further up the networks may actually have been adjusted though, as they have adjustable settings. Small local transformers are typically completely sealed and zero maintenance.

The spec is based on IEC and CENELEC harmonisation recommendations, not just the EU. Compliance with CENELEC specs is mandatory in the EU, but many countries far beyond it comply with them anyway, as it just doesn't make any sense to be using oddball standards anymore.
 
>> The reason for the switch on UK sockets was also to do with DC arcs on 200V systems. There was a
>> preference for switching off the power before removing the plug, to prevent arcing and thus avoid damage
>> to pins / contact surfaces etc.

When we were in New Zealand, all of the receptacles had individual switches on them. Perhaps obvious in hindsight, but quite foreign to us, the logical outcome of this was that individual appliances often didn't have any on/off switches of their own - they simply relied on the receptacle switch to turn the appliance on and off. The only ones which retained them were items that had specific ergonomic or safety reasons for a switch in another position, or which turned off as a function of another control (timer, etc). Everything else was just always on from the cord's perspective.

So if specifically importing 240V items to use in other countries, it might be wise to provision for this - either by ensuring the receptacles have switches, or that the circuit is separately switched upstream.
 
Switched socket outlets

I seem to recall from when I took my IEE Wiring Regulations exam, that the 'switch' on a socket outlet was designed to be just an 'isolator', to make the socket 'dead' when inserting/removing a plug (especially in the days before shrouded pins), and was NOT designed to repeatedly switch load current. I may be mistaken, as my memory is not what it was, but the contact area does seem very small for switching what could be a 10-Amp-plus inductive load. I do remember the contacts welding on a socket outlet switch at home, after a short circuit on the load, but I don't think that I ever replaced it.... ;-)

All best

Dave T
 
not designed to be main on-off switch

Maybe not in the UK, but it has been standard practice in Australia all my life. Plenty of simple heating appliances like non-automatic kettles, irons and heaters have no switch; you switch them on and off at the wall. (Irons have a thermostat, but who turns it down to zero before switching off at the wall? - nobody.)
 
There's no requirement for sockets here in Ireland to have switches. They're just an optional extra. They're commonly installed, but they're absolutely not required.

Appliances sold here would be more or less identical to those sold elsewhere in the EU, other than the fitted plug being different, and I've certainly encountered heating appliances without switches on the continent. You just unplug them when you're not using them, e.g. space heaters etc.

Older electric kettles, before they were automatic, were simply unplugged. I'm talking VERY old though; they were made before the 1970s.

 
