The AM Forum
Author Topic: What makes Heliax good?  (Read 14931 times)
K6JEK
Contributing
Member
*
Offline

Posts: 1189


RF in the shack


« on: November 25, 2011, 02:09:13 PM »

7/8" Heliax has much better attenuation numbers than even LMR-600.  Why?

Is it just the spacing? Looking at a photo, I'm not sure the distance between the center conductor and the shield is actually greater, because the center conductor is a tube that takes up a lot of space.

Is it the insulation? Both have very low density foam.

Is it just more surface area of the conductors?



* Attached image: Seven8thsHeliax.jpg (612x459)
kb3rdt
Member

Offline

Posts: 249


poop cup


« Reply #1 on: November 25, 2011, 03:40:05 PM »

The gas inside the foam is my guess; it looks like the center conductor is hollow. I don't care for coax that has a copper-coated center conductor. If you buy coax that's not solid copper, you're going to have loss there!
K1JJ
Contributing
Member
*
Offline

Posts: 8893


"Let's go kayaking, Tommy!" - Yaz


« Reply #2 on: November 25, 2011, 04:05:30 PM »

Deleted

Use an "AM Courtesy Filter" to limit transmit audio bandwidth  +-4.5 KHz, +-6.0 KHz or +-8.0 KHz when needed.  Easily done in DSP.

Wise Words : "I'm as old as I've ever been... and I'm as young as I'll ever be."

There's nothing like an old dog.
Steve - K4HX
Guest
« Reply #3 on: November 25, 2011, 04:14:53 PM »

The lowest loss coax has air dielectric. The loss factor for foam dielectric coax is determined by how much air is part of the mix, i.e., how many air pockets are in the foam. The lowest loss stuff is more air than foam.
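
A rough way to see the "more air than foam" point is the velocity factor, vf = 1/sqrt(eps_eff): the airier the foam, the closer vf gets to 1 and the smaller the dielectric contribution. The sketch below is my own illustration, not from this thread; it assumes solid polyethylene at eps_r = 2.25 and a crude linear volume-fraction mixing rule, so the numbers are only ballpark.

import math

def foam_properties(air_fraction, eps_solid=2.25):
    # Crude linear mixing rule for the effective dielectric constant of a
    # PE/air foam; real foams deviate from this, so treat as illustrative.
    eps_eff = air_fraction * 1.0 + (1.0 - air_fraction) * eps_solid
    velocity_factor = 1.0 / math.sqrt(eps_eff)
    return eps_eff, velocity_factor

for air in (0.0, 0.5, 0.8, 1.0):
    eps, vf = foam_properties(air)
    print(f"{air:.0%} air: eps_eff = {eps:.2f}, velocity factor = {vf:.2f}")

Solid PE works out to a velocity factor of about 0.67, while a mostly-air foam pushes past 0.9, which is right where the big hardline and Heliax cables sit.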
WA1GFZ
Member

Offline

Posts: 11152



« Reply #4 on: November 25, 2011, 07:01:10 PM »

That, and the center conductor is quarter-inch thin-wall tubing.
KL7OF
Member

Offline

Posts: 2313



« Reply #5 on: November 25, 2011, 07:33:58 PM »

I have Heliax that has a 3/16" aluminum center conductor that is copper plated/coated, foam dielectric and a copper shield.  I have joined pieces together by soldering a short piece of 3/16" copper tube over the center conductor, putting a piece of foam over that, and soldering a standard 5/8" copper pipe connector to join the outer shields.  The analyzer doesn't show any impedance bump.
W7TFO
WTF-OVER in 7 land Dennis
Contributing
Member
*
Offline

Posts: 2521


IN A TRIODE NO ONE CAN HEAR YOUR SCREEN


WWW
« Reply #6 on: November 25, 2011, 08:21:57 PM »

What makes Heliax good?

At least 3 things:

1. A well-engineered connector series
2. It is tough outdoors
3. A high velocity factor

73DG

Just pacing the Faraday cage...
W0BTU
Member

Offline

Posts: 230



WWW
« Reply #7 on: November 25, 2011, 09:12:14 PM »

I used to think that the type of dielectric had almost everything to do with the loss in coax. Not so.

After all, foam dielectric RG-8 has less loss than solid dielectric RG-8. However, that is largely because the center conductor has to be made smaller when we switch from foam to solid dielectric (for the same impedance), and the increase in loss comes from that decrease in the diameter of the center conductor.

Here's what WB2WIK once said on QRZ:

"Up through a few hundred MHz, the "dielectric loss" in coaxial cables is nearly "nothing" for all popular dielectrics, and is very dwarfed by the conducted loss which comes mostly from the skinny center conductor.

"The only reason air or 'mostly air' dielectric cables (including foamed PE, etc) have 'less loss' than solid dielectric cables has nothing to do with the dielectric and everything to do with the center conductor diameter.

"For a typical .405" cable like RG213/U (sized), solid PE or PTFE dielectric cable (DC for both is 2.1 ~ 2.3) will require a #13AWG center conductor to hit 50 Ohms characteristic impedance. Changing to cellular poly ("foam"), which has a lower DC, requires a #11 AWG center conductor -- bigger, fatter, lower loss. Changing to "air," to maintain 50 Ohms you can go to a #9AWG center conductor, which is bigger, fatter, and lower loss still.

"It's all in the conductor size used, not actually the dielectric itself.

"The dielectric material (and its dielectric constant) drives what gauge the center conductor needs to be.

"'Dielectric loss' using popular cable dielectrics is almost immeasurably low for all of them until you get up into the SHF range."


Furthermore, the solid inner and outer conductors in Heliax and hardline further reduce losses compared to a stranded center conductor and a woven braid shield.
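
To put rough numbers on the quote above: the standard impedance formula for coax is Z0 = (138/sqrt(eps_r)) * log10(D/d). The sketch below is my own illustration (not WB2WIK's), solving that for the center conductor diameter at 50 ohms with an assumed dielectric OD of about 0.285 inch (roughly the RG-213 class). The exact wire gauges come out a little different from the ones he quotes, but the trend is the same: the lower the dielectric constant, the fatter the center conductor has to be.

import math

def center_conductor_diameter(z0_ohms, shield_id_in, eps_r):
    # Solve Z0 = (59.96 / sqrt(eps_r)) * ln(D/d) for d; dimensions in inches.
    return shield_id_in / math.exp(z0_ohms * math.sqrt(eps_r) / 59.96)

D = 0.285  # assumed dielectric OD, roughly RG-213 class
for name, eps in (("solid PE", 2.3), ("foam PE", 1.5), ("air", 1.0)):
    d = center_conductor_diameter(50.0, D, eps)
    print(f"{name:8s} (eps_r = {eps}): center conductor about {d:.3f} in")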

73 Mike 
www.w0btu.com
K1JJ
Contributing
Member
*
Offline

Posts: 8893


"Let's go kayaking, Tommy!" - Yaz


« Reply #8 on: November 25, 2011, 09:21:24 PM »

Mike,

I appreciate the info about common coax dielectric not having much effect up until the SHF range.

I didn't realize the inner conductor on coax was that critical in size.  

I will defer to your info and have deleted my post about dielectric being the main loss factor.  

TNX, OM.

T



Use an "AM Courtesy Filter" to limit transmit audio bandwidth  +-4.5 KHz, +-6.0 KHz or +-8.0 KHz when needed.  Easily done in DSP.

Wise Words : "I'm as old as I've ever been... and I'm as young as I'll ever be."

There's nothing like an old dog.
Steve - K4HX
Guest
« Reply #9 on: November 25, 2011, 10:21:51 PM »

Good stuff. Remember, dielectric losses increase linearly with frequency and resistive losses increase as the square root of frequency.
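
A quick sketch of that rule of thumb (my own illustration; the 10 MHz split between conductor and dielectric loss below is hypothetical, not a published spec):

def scale_loss(a_cond_ref, a_diel_ref, f_ref_mhz, f_mhz):
    # Conductor (resistive) loss scales as sqrt(f); dielectric loss scales
    # linearly with f, per the rule of thumb above.  Values in dB/100 ft.
    a_cond = a_cond_ref * (f_mhz / f_ref_mhz) ** 0.5
    a_diel = a_diel_ref * (f_mhz / f_ref_mhz)
    return a_cond + a_diel, a_cond, a_diel

# Hypothetical split at 10 MHz: 0.55 dB conductor + 0.05 dB dielectric per 100 ft
for f in (10, 30, 100, 400, 1000):
    total, ac, ad = scale_loss(0.55, 0.05, 10, f)
    print(f"{f:5d} MHz: total {total:5.2f} dB (cond {ac:5.2f}, diel {ad:5.2f}) per 100 ft")

The dielectric term is negligible at HF but starts to catch up as you head toward UHF and beyond, which squares with WB2WIK's point above.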
Bill, KD0HG
Moderator
Member

Offline

Posts: 2563

304-TH - Workin' it


« Reply #10 on: November 26, 2011, 10:10:48 AM »

This discussion makes me ask a question: why was 50 ohms decided upon as a standard for transmission lines? At HF, most of the losses in coax seem to be resistive, in the center conductor. One would think a 40 ohm line with a larger center conductor would be less lossy than the same diameter 50 ohm line.

One example is the RG8-X series of coax, the same diameter as RG-59 but able to handle a lot more power at HF.

That begs the question: why was 75 ohms decided on as a de facto standard for TV VHF/UHF use, but not for RF transmission? The only major use of 75 ohms for RF transmission that I know of was the RCA FM transmitters of the 1950s, which used their own particular series of rigid line. That might have been an RCA sales scam: one *had* to use RCA transmission line with their antennas and transmitters.

Is 50 ohm line an intersection between resistive and dielectric losses, or not? If it is, why is 50 ohms used for air dielectric lines as well as plastic dielectric lines?

Ideas?
K1JJ
Contributing
Member
*
Offline

Posts: 8893


"Let's go kayaking, Tommy!" - Yaz


« Reply #11 on: November 26, 2011, 11:38:45 AM »

Good questions, Bill.

I would like to add one too:

Years ago I ran three hardlines to each tower - simply cuz I had the extra feedline and wanted to avoid switching.  

When running 6M, I asked: if I put TWO feedlines in parallel and then matched them to 25 ohms, would this reduce the loss to 1/2?  I was told that the loss would be the same, but was not offered a reason why. I assumed it was because of cumulative dielectric loss, but now I'm not sure.

I.e., what will happen to the overall loss when two coax cables are put in parallel and then matched properly? Will it change?



Steve:

"Resistive losses increase as the square root of frequency"    This is because of skin effect, no?


T

Use an "AM Courtesy Filter" to limit transmit audio bandwidth  +-4.5 KHz, +-6.0 KHz or +-8.0 KHz when needed.  Easily done in DSP.

Wise Words : "I'm as old as I've ever been... and I'm as young as I'll ever be."

There's nothing like an old dog.
Steve - K4HX
Guest
« Reply #12 on: November 26, 2011, 12:23:02 PM »

IIRC, 50 Ohms is a compromise between power handling and loss. The best power handling occurs at around 25-30 Ohms. The least loss is around 75 Ohms (thus small-signal lines like video use this). The 25 Ohm line was too expensive or unwieldy, so 50 Ohms was selected as a compromise.
WA1GFZ
Member

Offline

Posts: 11152



« Reply #13 on: November 26, 2011, 05:00:54 PM »

25 ohm cable will also have a low breakdown voltage.
I think video is 75 ohms for the lower C per foot, if I remember right.
Tom WA3KLR
Contributing
Member
*
Offline

Posts: 2120



« Reply #14 on: November 26, 2011, 05:38:25 PM »

You're right, Steve.  The only book I have that touches on these optimum numbers for coaxial cable impedance is, ironically, a book on IC design titled "The Design of CMOS Radio-Frequency Integrated Circuits" by Thomas H. Lee.  Pages 141 to 143 cover this topic and go through the equations to find the optimum air-dielectric coax impedances for best power handling and lowest loss/optimum Q.

For power handling, the number came out to 30 Ohms.  For lowest loss, the number came out to 77 Ohms.  The TV engineers rounded this number off to 75 Ohms.

Again rounding off the geometric mean of these two values (48 Ohms) gives us the 50 Ohms compromise.
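
For anyone who wants to reproduce those numbers without the book, here is a rough numerical sketch (mine, not Lee's). It uses the standard air-dielectric proportionalities for a fixed outer diameter: conductor loss goes as (D/d + 1)/ln(D/d), and breakdown-limited peak power goes as ln(D/d)/(D/d)^2. These are textbook forms, so treat it as an illustration only.

import math

def z0(x):
    # Air dielectric: Z0 = 59.96 * ln(D/d), with x = D/d
    return 59.96 * math.log(x)

xs = [1.01 + 0.001 * i for i in range(9000)]  # scan D/d from 1.01 to about 10

# Lowest conductor loss: minimize (x + 1)/ln(x)
best_loss = min(xs, key=lambda x: (x + 1) / math.log(x))
# Highest breakdown-limited peak power: maximize ln(x)/x^2
best_power = max(xs, key=lambda x: math.log(x) / x**2)

print(f"lowest loss    : D/d = {best_loss:.2f}, Z0 = {z0(best_loss):.1f} ohms")
print(f"max peak power : D/d = {best_power:.2f}, Z0 = {z0(best_power):.1f} ohms")
print(f"geometric mean : {math.sqrt(z0(best_loss) * z0(best_power)):.1f} ohms")

It lands on about 77 ohms for lowest loss, 30 ohms for peak power, and a geometric mean of about 48 ohms, which is the 50 ohm compromise Tom describes.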

73 de Tom WA3KLR  AMI # 77   Amplitude Modulation - a force Now and for the Future!
W0BTU
Member

Offline

Posts: 230



WWW
« Reply #15 on: November 26, 2011, 05:46:43 PM »

Mike, I appreciate the info about common coax dielectric not having much effect up until the SHF range. ...

You can thank Steve Katz, WB2WIK, for that one. :-)

As for why 50 ohms is a standard, the last story I heard went something like this: During WWII, the developers of radar technology went to their local hardware store, bought some handy sizes of pipe for the inner and outer conductors, and it just happened to be ~50 ohms.

92Ω coax was used for AM car radio whip antennas: because of the large impedance mismatch, the most important consideration was low capacitance per foot to minimize attenuation.

There's an interesting thread with lots of other explanations as to the reasons for the different coax impedances at http://forums.qrz.com/archive/index.php/t-244930.html, "Why 50Ω, Just WHO made this decision?".

73 Mike 
www.w0btu.com
W0BTU
Member

Offline

Posts: 230



WWW
« Reply #16 on: November 26, 2011, 06:10:47 PM »

IIRC, 50 Ohms is a compromise between power handling and loss. The best power handling occurs at around 25-30 Ohms. The least loss is around 75 Ohms (thus small-signal lines like video use this). ...

... For lowest loss, the number came out to 77 Ohms.  ...

Why would the lowest loss not also equate to maximum power handling capability? After all, lowest loss = lowest heating of the coax.

75 ohms would probably have lower capacitance per foot, but that would really only make a difference if there were an impedance mismatch.

73 Mike 
www.w0btu.com
Steve - K4HX
Guest
« Reply #17 on: November 26, 2011, 06:50:57 PM »

Power handling is determined by conductor size. You can use larger conductors at the lower impedance.

KA2DZT
Member

Offline

Posts: 2192


« Reply #18 on: November 26, 2011, 07:10:29 PM »


That begs the question: why was 75 ohms decided on as a de facto standard for TV VHF/UHF use, but not for RF transmission?

Early TV antennas used folded dipoles, which are 300 ohms.  So when coax was used on the antenna, a simple 4:1 balun resulted in 75 ohm coax.  That's why.

I think there were some other factors as to why 300 ohm antennas were used for TV reception.  IIRC, 300 ohms made a better match to the signal's natural impedance in the air, or some such nonsense, if signals even have something called natural impedance.  I seem to remember reading about that, or maybe it was something I dreamed about.  In fact, I think that impedance is actually 377 ohms.

I used to dream about antennas all the time, as I installed them for about 45 years.  I also used to have many nightmares about antenna work, but I won't go into that now.

Fred
W0BTU
Member

Offline

Posts: 230



WWW
« Reply #19 on: November 26, 2011, 07:15:55 PM »

Power handling is determined by conductor size. You can use larger conductors at the lower impedance.

I think there's something we're missing here.

As we increase the OD of the center conductor (for a given shield ID), the impedance goes down. That's a given.

And larger center conductors have less loss at a given current. Also a given.

However, as we increase the size of the center conductor, the dielectric gets thinner.  And when that happens, we lower the voltage rating of the coax. (25Ω coax has almost no dielectric.)

But for CATV applications, and at amateur power levels, is the voltage breakdown rating a factor? I don't think so.

So why should 75Ω coax, with a smaller center conductor than 50Ω coax, have lower loss than 50Ω coax, when the I²R losses of the 75Ω cable's center conductor are higher?

I haven't done the math, but perhaps the reason is the same reason that 450Ω ladder line has lower losses: for a given power level, the current is lower as we increase the impedance.
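
I ran a small sketch of that math (my own, using the standard conductor-loss proportionality alpha ~ Rs/(2*Z0) * (1/d + 1/D), with air dielectric and a fixed shield diameter assumed; dielectric loss ignored). It says the conductor-loss minimum really does sit near 77 ohms: the thinner 75-ohm center conductor costs a little resistance, but dividing by the larger Z0 more than makes up for it, until the conductor gets too skinny at still higher impedances.

import math

def rel_conductor_loss(z0, eps_r=1.0, D=1.0):
    # Relative conductor loss per unit length for a fixed shield diameter D.
    # alpha ~ Rs/(2*Z0) * (1/d + 1/D), with Z0 = (59.96/sqrt(eps_r)) * ln(D/d).
    # Only useful for comparing impedances, not for absolute dB figures.
    d = D / math.exp(z0 * math.sqrt(eps_r) / 59.96)
    return (1.0 / d + 1.0 / D) / (2.0 * z0)

for z in (25, 50, 75, 77, 100):
    print(f"Z0 = {z:3d} ohms: relative conductor loss {rel_conductor_loss(z):.4f}")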

73 Mike 
www.w0btu.com
Steve - K4HX
Guest
« Reply #20 on: November 26, 2011, 07:19:37 PM »

From Lampen's article in Radio World some years ago:

DIFFICULT RATIO

  One other question that often comes up: Why 50 ohms and 75
  ohms, or any other impedance, for that matter?

  These impedances were not chosen by accident. It was known in
  the 1920s that cables of different impedance worked better for
  one application than the other. For instance, it was
  determined, through experimentation, that the best power rating
  was around 30 ohms.

  Because the impedance of any coax cable is the ratio of the
  sizes of the center conductor and the distance to the braid
  (and the quality, or "dielectric constant" of the plastic in
  between), you might wonder why we don't have 30-ohm coaxes.

  To be sure, there are customers out there who would buy as much
  30-ohm coax as we could make! The problem is that 30 ohms
  represents a ratio very difficult to make, so much so that it
  is quite likely that most of what you would make, you would
  throw away. Only a small percentage would be usable.

  The people who would die for 30-ohm coax are the really
  high-power people. Those are customers such as nuclear
  physicists (those with some kind of atom smasher) or medical
  scanners, such as X-ray, CAT scanners or NMRI machines. They
  all use high power and would love to have 30-ohm coax. It would
  deliver their voltage with even less loss (and higher
  efficiency).

  Those customers who wanted low-signal attenuation found that
  the ideal impedance was 77 ohms. But this was an odd number in
  terms of wire sizes. If you "fudged" just a bit to 75 ohms,
  then standard wire sizes and dimensions could be used. This was
  why all those low-power, low-voltage signal-carrying cables
  (baseband video, CATV/broadband, antenna lead-in) were all
  75-ohm.

  And then there are customers who want to deliver high voltage;
  the ideal impedance for them is around 60 ohms. This is an
  eminently "makeable" cable, but it never really got started,
  mainly because it was soon realized that most high-voltage
  customers were often high-power customers too. Therefore, there
  really needed to be a compromise between voltage and current;
  that compromise was 50 ohms.

  We'll have more on the strange story of 50-ohm coax, with a
  trip into the world of wireless microphones, in the next
  installment. Let me know if you have any questions or comments.

  Steve Lampen is a senior audio video specialist for Belden Wire
  & Cable Co. in San Francisco. His book, "Wire, Cable, and Fiber
  Optics for Video and Audio Engineers," is published by
  McGraw-Hill. Contact
Steve - K4HX
Guest
« Reply #21 on: November 26, 2011, 07:31:12 PM »

Some more links:

http://radioworld.com/article/bell-labs-and-coaxial-cable/17425

http://radioworld.com/article/is-there-an-ideal-impedance/17424

http://www.wired.com/thisdayintech/2009/12/1208coaxial-cable-patent/
R. Fry SWL
Member

Offline

Posts: 114

Broadcast Systems Engineer (retired)


WWW
« Reply #22 on: November 26, 2011, 07:58:11 PM »

Quote
Why would the lowest loss not also equate to maximum power handling capability? After all, lowest loss = lowest heating of the coax.

The power rating of coaxial line is related to the amount of r-f current that it can carry without exceeding the permissible temperature rise along the line, including the effects of load SWR and the environment (local ambient air temperature, solar heating, and the thermal characteristics of the insulating media within the line itself).

For example, 75-ohm coax requires less r-f current, and hence has lower loss to convey the same amount of power for a given length and frequency, but is not capable of carrying the same maximum power as its 50-ohm version.

Andrew LDF4-75A, 1/2" OD, 75-ohm Heliax is rated for 0.273 dB loss per 100 feet and 3.8 kW maximum average power at 20 MHz into a load SWR of 1:1.

Andrew LDF4-50A, 1/2" OD, 50-ohm Heliax is rated for 0.3 dB loss per 100 feet and 7.75 kW maximum average power for the same conditions.
//
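
To put numbers on the current difference, here is a short sketch using the LDF4 figures quoted above; the 1 kW drive level is just an arbitrary example.

import math

def line_current(power_w, z0):
    # RMS current into a matched line carrying a given power
    return math.sqrt(power_w / z0)

def power_out(power_in_w, loss_db_per_100ft, length_ft):
    # Power delivered through the stated matched-line attenuation
    return power_in_w * 10 ** (-loss_db_per_100ft * length_ft / 100.0 / 10.0)

P = 1000.0  # watts, example level
for name, z0, loss in (("LDF4-50A", 50, 0.300), ("LDF4-75A", 75, 0.273)):
    i = line_current(P, z0)
    p_out = power_out(P, loss, 100)
    print(f"{name}: {i:.2f} A at {P:.0f} W in, about {p_out:.0f} W out of 100 ft at 20 MHz")

Roughly 4.5 A versus 3.7 A for the same kilowatt, which is why the 75-ohm line shows slightly lower loss even though its maximum power rating is lower.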
W0BTU
Member

Offline

Posts: 230



WWW
« Reply #23 on: November 26, 2011, 08:00:45 PM »

Some more links: ...

Thanks, Steve.

One of those contained this:
"In fact, the first transmission-line coaxial cables were made by taking small copper pipe and putting it inside large copper pipe. If you do this with standard sizes of copper pipe, you will get impedances like 51.5 ohms or 52 ohms."

Interesting. :-)

73 Mike 
www.w0btu.com
R. Fry SWL
Member

Offline

Posts: 114

Broadcast Systems Engineer (retired)


WWW
« Reply #24 on: November 26, 2011, 08:04:28 PM »

Quote
As for why 50 ohms is a standard, the last story I heard went something like this: During WWII, the developers of radar technology went to their local hardware store, bought some handy sizes of pipe for the inner and outer conductors, and it just happened to be ~50 ohms.
_______________

Just to note that early VHF and some high-power MW AM broadcast stations used 51.5 ohm rigid coaxial line, which was a standard of the RMA (Radio Manufacturers Association) in those early days.  Rumor was that this impedance was the result of the standard dimensions of pipes found in hardware stores, also, but who knows?

Later the industry moved to a 50-ohm standard whose mechanical dimensions were defined by the EIA (Electronic Industries Association).  This was valuable, as it removed the variability in the wall thickness of the inner conductors of rigid coaxial lines, the design/hole pattern of the flanges for their outer conductors, and the cutout for the anchor insulator used at each flange, allowing for reduced impedance mismatches where line sections joined.

Previously, all of this had caused problems with mechanical compatibility and limited the system performance achievable when a given installation used components supplied by various manufacturers.
//