Helpful Member!  nate2003 (Mechanical) (OP)
11 Jul 06 18:13
Here is an interesting article.  Actually, this website has many interesting scientific/engineering articles.



http://www.americanheritage.com/articles/magazine/it/2000/3/2000_3_64.shtml

"I have had my results for a long time, but I do not yet know how I am to arrive at them."  Karl Friedrich Gauss

SlideRuleEra (Structural)
11 Jul 06 21:13
Good article; it brings back a specific memory. On the final exam of a college electrical power course (in the late 1960s) we had a question that came from that same 50 cps to 60 cps conversion.

The question went something like this:
"During the change from 50 cps to 60 cps what had to done to existing transformers?"

I can no longer recall the electrical technical details why, but the answer, as I recall, was "Nothing".
Having been designed for 50 cps, the transformers had "more than enough" iron for use at 60 cps. The practical result was that they operated at 60 cps just a little bit more efficiently than designed.
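
A quick back-of-the-envelope check of that answer, offered as an illustrative sketch rather than anything from the original post: the transformer EMF equation V = 4.44 f N A B_max means that, at fixed voltage and winding, peak flux density falls as 1/f, so a core sized for 50 cps runs at lower flux density on 60 cps. The winding figures below are assumed round numbers, not data from this thread.

    # Illustrative only: peak core flux density scales as 1/f at constant voltage.
    def peak_flux_density(volts, freq_hz, turns, core_area_m2):
        """B_max from the transformer EMF equation V = 4.44 * f * N * A * B_max."""
        return volts / (4.44 * freq_hz * turns * core_area_m2)

    V, N, A = 2400.0, 400, 0.02   # hypothetical 2400 V winding, 400 turns, 0.02 m^2 core
    b50 = peak_flux_density(V, 50.0, N, A)
    b60 = peak_flux_density(V, 60.0, N, A)
    print(f"B_max at 50 Hz: {b50:.2f} T, at 60 Hz: {b60:.2f} T (ratio {b60 / b50:.2f})")
    # The ratio is 50/60 = 0.83, so the 50 Hz core has iron to spare at 60 Hz,
    # consistent with the "Nothing needed to be done" answer above.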

Hopefully some of the EE's here will set me straight if I have gotten mixed up on this issue.

PS: I know that Hz is the proper terminology for cycles per second these days, but this IS the History Forum.

www.SlideRuleEra.net

Zogzog (Electrical)
18 Jul 06 14:50
Great article, Thanks!
Ussuri (Civil/Environmental)
19 Jul 06 8:35
I notice from the Wikipedia entry that the majority of Europe uses a 50 Hz supply. Why did the US change from 50 to 60 Hz? Is there a big benefit?
SlideRuleEra (Structural)
19 Jul 06 21:00
In the late 19th and early 20th centuries there was a "frequency competition", sort of like VHS vs. Beta VCRs in the 1980s or Blu-ray vs. HD DVD today.

"Motor People" wanted DC - easy to build and control speeds of large motors for things like trolleys & factories. Since reasonable voltage DC cannot be distributed long distances efficiently, they were willing to settle for low frequency AC. 25 Hz was considered acceptable.

"Lighting People" want higher frequency AC, to totally eliminate incandescent light flicker (slightly noticeable at 25 Hz). 50 Hz solved the flicker problem, and the "Motor People" would grudgingly go with it.

"Clock People" wanted 60 Hz to make the design and manufacture of the numerous devices that had timing components easier (60 min/hr, 60 sec/min, 60 Cycles/Second).

Various electric equipment manufacturers "placed their bets" on which standard would "win".

The clincher was the development of a satisfactory large 60 Hz synchronous converter (a specially designed rotating machine - sort of a "black box" for its day - that turned AC into DC). This was not an easy thing to do until solid state electronics came along.

The synchronous converter allowed the "Motor People" to get their DC by joining the "Clock People" at 60 Hz. Also, the "Lighting People" had nothing to lose (technically) by going from 50 Hz to 60 Hz.

And the rest is history.

I had a wonderful electrical engineering professor. He was in his mid-70s when I took his classes in the late 1960s. He had participated in, or had first-hand contact with people directly involved in, this type of "behind the scenes" work. He always enjoyed passing on this sort of info.

www.SlideRuleEra.net

IRstuff (Aerospace)
19 Jul 06 22:54
All sounds reasonable, except for the bit about the clocks. There were almost no electronic clocks at the time the die was cast. Most clocks were geared, so there would have been no good reason to worry about dividing the line frequency evenly by 60.

TTFN



GregLocock (Automotive)
20 Jul 06 0:08
And I doubt the regulation (or whatever you call it) of the supply frequency would have been accurate enough to use for timekeeping, at the time when the decision was made.

Cheers

Greg Locock

Please see FAQ731-376 for tips on how to make the best use of Eng-Tips.

ScottyUK (Electrical)
20 Jul 06 4:44
Greg,

Actually it was! In the olden days, power generating stations would have two clocks. One was a precision chronometer, and the other was a mains-powered type driven by a synchronous motor. Over the course of the working day the mains clock would run slightly slow when the grid was heavily laden and the frequency dropped slightly, and during the night it would run slightly fast when the frequency was increased fractionally to compensate for the slow period through the day. Over 24 hours the clocks matched each other.
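
To put rough numbers on that day/night balancing act (an illustrative sketch with assumed frequencies, not ScottyUK's actual figures): the error of a synchronous-motor clock is just the frequency deviation, integrated over time, divided by the nominal frequency.

    # Illustrative sketch: drift of a mains-driven synchronous clock when the grid
    # runs slightly off nominal.  The schedule below is invented for illustration.
    NOMINAL_HZ = 50.0
    schedule = [(49.95, 16.0),   # (Hz, hours): a touch low through the working day
                (50.10, 8.0)]    # raised fractionally overnight to compensate

    error_s = sum((f - NOMINAL_HZ) / NOMINAL_HZ * hours * 3600.0 for f, hours in schedule)
    print(f"Mains clock error after 24 h: {error_s:+.1f} s")
    # The daytime lag (-57.6 s) is cancelled by the overnight gain (+57.6 s),
    # so over the full day the mains clock agrees with the chronometer.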

Perhaps it was a cunning plan to lengthen the working day and reduce the time employees spent at home!

----------------------------------
  I don't suffer from insanity. I enjoy it...

SlideRuleEra (Structural)
20 Jul 06 9:57
Like ScottyUK says, the frequency varied (and still does), but the overall average was precise enough for everyday use. One application our professor told us about where the hour-to-hour frequency deviations were a problem was radio broadcasting (1920s style). The National Broadcasting Company (NBC) had all of their clocks on one circuit. Of course these clocks had synchronous motors, since electronic clocks did not exist - I believe the first commercial one was the Bulova "Accutron" watch, which became available in 1960.

As one radio program ended (in one studio), the next program (in a different studio) had to start at precisely the right moment so that there was no dead-air time or overlap. With all the clocks on the same circuit, no matter what happened (power interruptions within the building, etc.), the clocks remained synchronized with each other.

If the frequency was too high - shut down the dedicated clock circuit until time caught up.

If the frequency was too low - switch the clock circuit over to a special generator in the building's basement that intentionally operated at an above-normal frequency. Run the clocks at "high speed" until they caught up with time. All the while, the clocks remained in sync with each other.

www.SlideRuleEra.net

IRstuff (Aerospace)
20 Jul 06 10:37
Still unclear why 60 Hz would be preferred over 50 Hz for that. The clock motors were synchronous, and there's been no mention yet of what one would need 1/60-second resolution for, or whether that would be sufficiently different from the resolution you'd get from 1/50 second to matter.

As a shot in the dark, I'd be more tempted by the notion that the US didn't want to be running the same frequency that the Germans used, or something to that effect.

TTFN



SlideRuleEra (Structural)
20 Jul 06 10:46
IRstuff - I have to agree that the "Clock People" being a major factor in the decision sounds odd; I'll see what I can find.

But in the meantime, here are some great pictures to go along with nate2003's original post
http://www.telechron.net/features/boulder.htm

www.SlideRuleEra.net

Quantum50 (Mechanical)
20 Jul 06 10:56
ScottyUK, that's what I've come to understand as well.  As a matter of fact, I saw a program a couple of years ago that indicated the power companies still do that now for the purpose of correcting clocks among other reasons.

I wish someone from Europe with expert knowledge would comment on why Europe has remained at 50Hz and what problems come from it (clocks?) and the corrective actions.
Quantum50 (Mechanical)
20 Jul 06 11:02
Or rather, why is 50Hz preferable over 60Hz?  My motors/transformer training is very rusty, sorry.
GregLocock (Automotive)
20 Jul 06 20:52
Thanks Scotty. So, when did the Europeans settle on 50 Hz? When did the USAns settle on 60 Hz?

Cheers

Greg Locock

Please see FAQ731-376 for tips on how to make the best use of Eng-Tips.

BJC (Electrical)
21 Jul 06 1:05
I think people decided on 60 cycles because it's natural; i.e., 60 seconds in a minute, 60 minutes in an hour, 360 degrees in a circle, etc. No big mystery - they just thought in 60s.

The cable that brought the power to Los Angeles was segmented copper. That is, it was 8 or 10 strips that interlocked and spiraled around a spacer assembly. In the hot desert climate, under load it crept and sagged lower and lower. They changed it out just in time to use the copper in WW2 for bullets and shells.
In case you want to see it and buy a cool "nerd toy", the Bureau of Reclamation is selling them here:
http://www.usbr.gov/lc/hooverdam/service/memorabilia.html
waross (Electrical)
24 Jul 06 1:34
My understanding of the frequency choice is that the US originally started with 25 Hz. When Europe started to electrify, the engineers realised that there were a lot of economies to be realised by using a higher frequency, and used twice 25 Hz, or 50 Hz.
When the US and Canada decided to change to a higher frequency, they took the attitude that if 50 Hz was good, 60 Hz would be better.
I have had this understanding for so many years that I have no idea of the source of the information. It may be purely speculative.
As for the two clocks in the power house:
I understood that went out in North America many years ago, when the utilities started to use time signals from the National Observatory to synchronize the generators on the nationwide grid.
BTW, I understand that Texas is an interesting exception to the internationally interconnected power system.
respectfully
notnats (Mechanical)
24 Jul 06 9:17
Motors and transformers are smaller and lighter on 60 Hz, which is probably why it is pretty much standard on ships. I think aircraft use 300 Hz or similar for this reason.

Jeff
davidbeach (Electrical)
25 Jul 06 2:18
Aircraft use 400 Hz, not 300 Hz.  There are three independent grids in the US/Canada.  The West is all WECC, most of Texas is ERCOT, and the East is a whole conglomeration of coordinating councils.  All power transfers between the three grids take place over DC links.
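
To put rough numbers on notnats's size point and the 400 Hz figure above (an illustrative scaling argument, not taken from either post): for a given voltage and allowable peak flux density, the transformer EMF equation makes the required turns-times-core-area product inversely proportional to frequency, which is why 400 Hz aircraft equipment can be so compact.

    # Illustrative scaling only: at fixed voltage and peak flux density, the required
    # N*A (turns x core area) product of a transformer goes as 1/f.
    def relative_core_product(freq_hz, ref_hz=50.0):
        """N*A needed at freq_hz, relative to what the same rating needs at ref_hz."""
        return ref_hz / freq_hz

    for f in (50.0, 60.0, 400.0):
        print(f"{f:6.0f} Hz: core N*A is {relative_core_product(f):.2f} x the 50 Hz requirement")
    # ~0.83x at 60 Hz and ~0.13x at 400 Hz - a crude figure that ignores core loss,
    # cooling and other practical limits, but it shows the trend.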
GregLocock (Automotive)
25 Jul 06 2:50
"All power transfers between the three grids takes place over DC links."

Can you explain the reasons for that in terms that a knuckle-dragging mechie could understand? Is it just to simplify synchronization, or are the distances sufficiently large that the skin effect becomes important? How is the DC generated at one end and then transmogrified into AC at the other end?

Maybe you could just point me at an article!

Cheers

Greg Locock

Please see FAQ731-376 for tips on how to make the best use of Eng-Tips.

davidbeach (Electrical)
25 Jul 06 12:07
The three grids are not synchronized, so there is no means of making an AC connection.  The DC is created using high power rectifiers and then inverted back to AC using high power inverters.  These are now power electronics but the original high voltage DC links used mercury arc valves.  The inter-grid ties are generally short, with both ends of the DC link at the same facility.  Long DC links are used for bulk transmission over long distances with no intermediate tap points, such as the Pacific Intertie that runs from the Columbia River in Oregon to the northern outskirts of Los Angeles.
BJC (Electrical)
25 Jul 06 12:42
Greg
A DC conversion can be likened to a control valve: you can regulate the power you let through. A straight AC intertie will let power flow without check. A big load on one section of the system will suck all the power the system has.
zeusfaber (Military)
25 Jul 06 17:19
Two other reasons for avoiding long ac interconnect links:

1.  The capacitance (to earth) of a long line starts to absorb too much current.  This is a particular problem for buried (or submarine) cables.  The current may be out of phase with the voltage, so nominally wattless - but this doesn't stop it warming the cable up.  At 50 Hz, you don't have to go too many km before a buried cable absorbs its own rated current (some rough numbers follow below).

2.  To keep two networks locked in synchronism, you need to be able to transfer significant amounts of current between them.  This becomes difficult when you have large networks fighting one another over a relatively weak AC link.  The loser is generally the link.  A DC link gets round the problem by simply ignoring it (at which point it gives up and goes away).

A.
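
Rough numbers behind zeusfaber's first point, as an illustration with assumed cable figures rather than anything from the post: the charging current drawn by a cable's capacitance is roughly I = 2*pi*f*C*V per unit length (treating the cable crudely as a lumped capacitor), and it eats into the thermal rating quickly.

    import math

    # Illustrative only - the cable figures below are assumed typical values, not thread data.
    f_hz = 50.0
    c_per_km = 0.23e-6                  # F/km, assumed shunt capacitance of a 132 kV cable
    v_phase = 132e3 / math.sqrt(3)      # phase-to-earth voltage, V

    i_charging = 2 * math.pi * f_hz * c_per_km * v_phase   # A per km of cable
    rated_a = 600.0                     # assumed thermal rating, A

    print(f"Charging current: {i_charging:.1f} A per km")
    print(f"Charging current alone reaches the rating after ~{rated_a / i_charging:.0f} km")
    # Around 5-6 A/km here, so on the order of 100 km before the cable is "full" of its
    # own charging current.  A real study would use distributed line models and
    # reactive compensation; this is only the lumped-parameter picture.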
Helpful Member!(6)  JerryH2 (Electrical)
8 Nov 06 18:48
Gentlemen:

There are really two different questions voiced in this thread:

1) Why does the US (and most of the Americas) use 60 Hz while Europe (and the rest of the world) uses 50 Hz?
2) Why does the US use 110 V (now set at 120 V) while Europe uses 220 V (now set to 230 V)?

It does seem to be a conglomeration of historical reasons, including state of the art back in 1890’s, which company had a head start, and standardization. Some history:

George Westinghouse did his original engineering using 133 1/3 Hz. Westinghouse had a steam-engine-driven alternator set running at 2000 rpm (by 1886 mechanical engineers liked to have steam engines run at integral numbers of rpm), and with 8 poles the set produced 8000 cycles per minute, or 133 1/3 Hz. This was good for lighting, as there was no flicker, but it turned out to be too high for the motors developed later.
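
Those numbers fall straight out of the synchronous-machine relation f = (poles/2) x (rpm/60); a quick check, added here as an editorial illustration rather than part of the post:

    # Synchronous alternator frequency from pole count and shaft speed.
    def sync_freq_hz(poles, rpm):
        return (poles / 2) * (rpm / 60)

    print(sync_freq_hz(8, 2000))   # Westinghouse's 8-pole, 2000 rpm set -> 133.333... Hz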

The earliest experiments (1886 and 1887) used belt driven generators and tended toward high frequencies like 133 1/3 Hz. This suited illumination, which was practically all that alternating current was used for at that time. By 1889 and 1890 direct driven generators were coming on line. They were more robust but with lower rotation speeds they encouraged lower frequencies.

In the early years of ac there were many frequencies: each engineering team seemed to pick their own. Early frequencies in the US were 133 1/3, 125, 83 1/3, 66 2/3, 60, 50, 40, 30, 25 Hz. When Tesla joined Westinghouse, it was using 133 1/3 Hz. Tesla insisted upon 60 Hz because his ac induction motor was designed for 60 Hz and apparently wouldn’t work at 133 1/3 Hz.

On the Westinghouse Museum website it says that G. Westinghouse assigned his engineers Stillwell, Shallenberger, Schmid, and Scott to find a good frequency. Practical considerations of connecting alternating generators to reciprocating engines then in use demanded a lower frequency than 133 Hz.
Before the end of 1892 they chose 2 frequencies: 60 Hz for lighting and 30 Hz where power was to be converted to DC.

Why did the Tesla/Westinghouse engineering team choose 60 Hz? If it was Tesla who was the driving force, various biographies of Tesla offer different theories, ranging from Tesla "thought it was the fundamental frequency of the universe" to "… considered the natural earth had a frequency of 10 Hz and doing experiments with 8 to 20 Hz and 20 to 40 Hz and finally 40 to 100 Hz; he decided that 60 Hz was safe." It doesn't seem to have been a desire for accurate clocks, because Henry Warren didn't patent the synchronized clock until 1916, long after the frequency was chosen. Although Warren was diligent in getting utilities to hold tight specs on frequency, this didn't happen until well into the 1920s.

Back in the early 1890’s Westinghouse was involved in bidding electrical equipment for the Niagara Falls power project. However the Cataract Company (in charge of the Niagara Falls project) had already selected hydraulic turbines running at 250 rpm. So if a 16-pole generator were chosen the frequency would be 33 1/3 Hz and if a 12-pole machine were chosen then the frequency would be 25 Hz. The project consultant proposed an 8-pole generator or 16 2/3 Hz. The compromise was 25 Hz. At the time lower frequencies were easier to handle on transmission lines. Another reason is that the Steel industry liked 25 Hz because of huge slow speed induction rollers, which had a low power factor for 60 Hz and worked better at 25 Hz. Niagara Falls generated 25 Hz way into the 20th century. The website says that the Westinghouse Company later wished it had forced through 30 Hz.
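
The Niagara pole counts check out against the same f = (poles/2) x (rpm/60) relation used in the sketch above (again an editorial illustration):

    # Frequencies available from the already-chosen 250 rpm Niagara turbines.
    for poles in (16, 12, 8):
        print(f"{poles:2d} poles at 250 rpm -> {(poles / 2) * (250 / 60):.2f} Hz")
    # 16 poles gives 33.33 Hz, 12 poles the 25 Hz compromise, and the consultant's
    # 8-pole proposal 16.67 Hz, matching the figures quoted above.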

By 1910 it looked as though there would be two frequencies in North America: 25 Hz for transmission and for heavy industry that needed DC or slow-moving heavy machinery, and 60 Hz for lighting (less flicker) and general use.

There was an effort by GE to introduce 40 Hz as a compromise between 25 Hz and 60 Hz in the 1890s, but it was too late to overtake the 60 Hz and 25 Hz infrastructures already in place, although there were some 40 Hz installations. Even so, most installations in the US were done at 60 Hz after Westinghouse and GE cross-licensed their patents.

Development of high-speed turbines instead of slow reciprocating machinery and later developments of the rotary converter that worked well at 60 Hz made it easy to shift everything to 60 Hz. By 1920 most of the problems associated with 60 Hz transmission had been solved so that there was no longer any advantage of transmitting 25 Hz over 60 Hz. That seems to be why the US is 60 Hz.

Germany took the lead in Europe of developing electrical power (primarily Emil Rathenau of AEG) and AEG seems to have used 50 Hz from day one. In 1891 AEG had demonstrated power delivery over long distances using 50 Hz. I don’t know why AEG chose 50 Hz. Did the penchant for integer rpm help influence AEG for 3000 rpm and 50 Hz as opposed to 3600 rpm and 60 Hz? Did the preference for preferred numbers influence the choice of 50 Hz over 60 Hz? Did Tesla’s influence pull Westinghouse to choose 60 Hz and resultant 3600 rpm over 50 Hz and 3000 rpm? Europe was even more fragmented in the early days than the US. In 1918 in London alone there were 70 electric authorities with 50 different types of systems and 10 different frequencies and 24 different voltages. But by the 1920’s and 1930’s more and more of Europe was changing to or working with 50 Hz.

As for voltages, both Europe and the US seemed to have begun with about 100 to 110 volts DC because of Edison's success with replacing gas lights with electric lamps. Although many inventors worked on electric lights, generators and electrical systems, Edison was one of the first and was successful in putting together whole systems, not just the pieces. Edison picked 110 VDC because that was the voltage he needed to get enough light out of his bulbs to compete with common gas lamps of the time and yet not blow the filaments in his bulbs too soon.

The Berlin Electric Works (a utility owned by AEG) changed from 110 V to 220 V starting in about 1899 to enlarge the capacity of their distribution system, since the city (Berlin) was already wired with two wires. They were probably changing from DC to AC at the time as well. They paid for their customers to change their lighting and motors to 220 V and saved on the cost of copper by avoiding having to add more wiring. This spread throughout Germany and later Europe but didn't take hold in the US.
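
Rough arithmetic behind that copper saving (an illustration with assumed figures, not anything from the post): for the same delivered power and the same I^2*R loss, doubling the voltage halves the current, so the conductor cross-section can shrink to roughly a quarter.

    # Illustrative only: copper needed at 110 V vs 220 V for the same load and the same
    # absolute I^2*R loss over the same route.  The load value is an assumed round number.
    power_w = 1100.0
    i_ref = power_w / 110.0                       # current at the 110 V reference

    for volts in (110.0, 220.0):
        amps = power_w / volts
        relative_copper = (amps / i_ref) ** 2     # R may rise as 1/I^2, and area ~ 1/R
        print(f"{volts:5.0f} V: {amps:4.1f} A, relative conductor cross-section {relative_copper:.2f}")
    # 220 V needs only ~0.25x the copper of 110 V in this simple two-wire picture.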

I wonder if the residue from the bitter conflict between Edison and Westinghouse about the safety of AC vs. DC spilled over into not going above 110 volts for residential users even after Edison’s forces conceded the need for AC.

A lot of this information comes from Thomas Hughes, Networks of Power: Electrification in Western Society, 1880-1930, and Benjamin Lamme, "Technical Story of Frequencies", IEEE Transactions 37 (1918) 60. Benjamin Lamme was chief engineer for Westinghouse in the early 1900s.
waross (Electrical)
10 Nov 06 2:42
Thanks for the information Jerry. I enjoyed it.
Respectfully
KiwiMace (Mechanical)
19 Nov 06 23:09
A fairly informative read on the Tesla point of view -

Tesla: Man Out of Time, by Margaret Cheney.
It's a bit of a promo for the Tesla vs Edison and later vs Marconi priority battles, but not bad reading. This is a 1982 biography reprinted in '93 and readily available at Barnes and Noble.

Unusual/brilliant sort of man, probably lacking due credit for his achievements.  Notably he originated the 3-phase system and induction motor, had an agreement with Westinghouse "to earn $2.50 per horsepower of electricity sold" (holy crap!) then gave his patent rights to Westinghouse when asked nicely later on.  That would have been worth 12.2 trillion dollars in 2003...provided he worked the same deal with everyone.

