
The History of 50 Ω

A lot of people ask, so here is the answer to the eternal question, "How did 50 Ω get to be the standard RF transmission line impedance?" Or rather, here are a few of the stories.


Bird Electronics will send you a printed copy of their version if you ask for it.

This account is from Harmon Banning of cable maker W.L. Gore & Associates, Inc.:

There are probably lots of stories about how 50 Ω came to be. The one I am most familiar with goes like this. In the early days of microwaves, around World War II, impedances were chosen depending on the application. For maximum power handling, somewhere between 30 and 44 Ω was used. On the other hand, the lowest attenuation for an air-filled line was around 93 Ω. In those days there were no flexible cables, at least not for higher frequencies, only rigid tubes with air dielectric. Semi-rigid cable came about in the early 1950s, and real microwave flex cable roughly ten years later.

Somewhere along the way it was decided to standardize on a given impedance so that economy and convenience could be brought into the equation. In the U.S., 50 Ω was chosen as a compromise. There was a group known as JAN, for Joint Army and Navy, that took on these matters; it later became DESC, the Defense Electronic Supply Center, where the MIL specs evolved. Europe chose 60 Ω. In reality, since most of the "tubes" in the U.S. were built from existing materials (standard rods and water pipes), 51.5 Ω was quite common. It was amazing to see and use adapters/converters to go from 50 to 51.5 Ω. Eventually 50 Ω won out, and special tubing was created (or maybe the plumbers allowed their pipes to change dimension slightly).

Further along, the Europeans were forced to change because of the influence of companies such as Hewlett-Packard, which dominated the world scene. 75 Ω is the telecommunications standard because, in a dielectric-filled line, somewhere around 77 Ω gives the lowest loss (hence its use in cable TV). 93 Ω is still used for short runs such as the connection between computers and their monitors, because its low capacitance per foot reduces the loading on circuits and allows longer cable runs.
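For reference, those optimum-impedance figures fall out of a classic textbook exercise. For an air-dielectric coaxial line with outer-conductor diameter D held fixed and inner-conductor diameter d varied, conductor loss varies as (D/d + 1)/ln(D/d), breakdown-limited power as ln(D/d)/(D/d)², and breakdown voltage as ln(D/d)/(D/d). Here is a minimal sketch (the formulas are the standard ones, not taken from this account) that locates the optima numerically:

```python
import math

def z0_air(x):
    """Z0 of an air-dielectric coax, x = D/d (outer ID over inner OD)."""
    return 60.0 * math.log(x)

# Figures of merit vs. diameter ratio x, up to constant factors:
#   conductor loss     ~ (x + 1) / ln x   (minimize)
#   breakdown power    ~ ln x / x**2      (maximize)
#   breakdown voltage  ~ ln x / x         (maximize)
xs = [1.0 + 0.001 * i for i in range(1, 20000)]   # scan x = 1.001 .. ~21
loss_x    = min(xs, key=lambda x: (x + 1) / math.log(x))
power_x   = max(xs, key=lambda x: math.log(x) / x**2)
voltage_x = max(xs, key=lambda x: math.log(x) / x)

print(f"lowest loss : D/d = {loss_x:.3f}, Z0 = {z0_air(loss_x):.1f} ohms")
print(f"max power   : D/d = {power_x:.3f}, Z0 = {z0_air(power_x):.1f} ohms")
print(f"max voltage : D/d = {voltage_x:.3f}, Z0 = {z0_air(voltage_x):.1f} ohms")
# -> roughly 76.7, 30.0, and 60.0 ohms respectively
```

The loss minimum lands near 77 Ω and the power maximum at 30 Ω; one common rationalization of the 50 Ω compromise is that it sits near their geometric mean, √(30 × 77) ≈ 48 Ω. The 60 Ω voltage optimum, incidentally, matches the figure Europe reportedly chose.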

Volume 9 of the MIT Rad Lab Series goes into greater detail on this for those interested. It has been reprinted by Artech House and is available.
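The account's point about 93 Ω cable and capacitance per foot can also be checked numerically: for a lossless line, C = √εr / (c · Z0), so a higher characteristic impedance means less shunt capacitance per foot. A short sketch, with assumed typical dielectric constants (they are not figures from the account):

```python
import math

C_LIGHT = 299_792_458.0   # speed of light, m/s
M_PER_FT = 0.3048

def pf_per_foot(z0_ohms, er):
    """Capacitance per foot of a lossless line: C = sqrt(er) / (c * Z0)."""
    c_per_m = math.sqrt(er) / (C_LIGHT * z0_ohms)   # farads per meter
    return c_per_m * M_PER_FT * 1e12                # picofarads per foot

print(f"93-ohm line (er ~ 1.5, semi-air-spaced):   {pf_per_foot(93, 1.5):.1f} pF/ft")
print(f"50-ohm line (er ~ 2.25, solid polyethylene): {pf_per_foot(50, 2.25):.1f} pF/ft")
# ~13 pF/ft vs ~30 pF/ft: less shunt loading per foot, so longer runs.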

Gary Breed wrote in his High Frequency Electronics publication that one explanation, offered by way of an old ham's tale, is that "The most common story is that 50-ohm high-power coaxial lines were first made using standard sizes of copper pipe, such as 3/4 inch for the inner conductor and 2 inch for the outer conductor."
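That pipe story can be sanity-checked with the air-line formula Z0 = 60 · ln(D/d). Nominal pipe sizes are not actual diameters, so the sketch below uses typical published copper-tube dimensions (assumed here, not quoted in the article):

```python
import math

def z0_air_line(D, d):
    """Z0 of an air-dielectric coax; D = outer conductor ID, d = inner OD."""
    return 60.0 * math.log(D / d)

d = 0.875   # typical OD of nominal 3/4-inch copper tube, inches (assumed)
D = 1.985   # typical ID of nominal 2-inch Type L copper tube, inches (assumed)

print(f"Z0 = {z0_air_line(D, d):.1f} ohms")   # about 49.2 ohms, close to 50
# Taking the nominal sizes literally (D/d = 2.0/0.75) would give ~58.9 ohms,
# so the story only works out with the tubes' real dimensions.
```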

Check this out: someone referenced this page on Wikipedia.

