Shure Whiteboard Session: Understanding Cables for RF

Welcome back and happy New Year! As we head into 2016, we’re back with another whiteboard session. This week’s topic will answer common questions about cable components for your RF setup.

RF cables are an often overlooked aspect of wireless systems. You can have everything else set up perfectly: you’ve got the right kit; you’ve coordinated frequencies; your antennas are placed correctly, but when all is said and done, it’s all too easy to shoot yourself in the foot by selecting the wrong antenna cables.

Shure Whiteboard Sessions: RF Cable Basics

Why Are RF Cables Required?

RF antenna cables are a crucial piece of kit when:

A: Your wireless receivers are too far away from the performance area to achieve correct antenna placement, meaning you’ll need to cover some distance with cables.

B: Your receivers are installed in a metal rack and you want to avoid poor performance by removing the antennas from inside the metal housing.

Considering Signal Loss

RF cables should be treated differently from standard audio cables. Balanced audio cables, for instance, can operate successfully over relatively long runs. With RF cables, however, signal loss is a much greater issue.

There are three core elements when considering signal loss: 1) The build quality of the cable, 2) The frequency at which you’re transmitting, and 3) The length of your cable run.
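To see how these factors combine, note that cable manufacturers publish attenuation in dB per 100 ft at a given frequency, and total loss scales linearly with run length. The sketch below uses illustrative figures at roughly 600MHz consistent with the worked example later in this article; always check the manufacturer's datasheet for real values.

```python
# Illustrative attenuation figures (dB per 100 ft at ~600 MHz);
# real values come from the cable manufacturer's datasheet.
LOSS_DB_PER_100FT = {
    "RG-58": 5.2,   # thin, flexible, relatively lossy
    "RG-213": 1.9,  # thicker, much lower loss
}

def cable_loss_db(cable: str, length_ft: float) -> float:
    """Total attenuation in dB: loss scales linearly with length."""
    return LOSS_DB_PER_100FT[cable] * length_ft / 100.0

print(cable_loss_db("RG-58", 100))   # 5.2 dB over 100 ft
print(cable_loss_db("RG-213", 50))   # 0.95 dB over 50 ft
```

Halving the run length halves the loss in dB, which is why keeping antenna cable runs short matters as much as choosing the right cable.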

Additionally, while all the components of your wireless system should ideally have the same impedance, your RF cables are the component most likely to fluctuate. Always use high-quality cables to reduce risk.

What Cables Should You Use?

Standard RF cables are essentially coaxial cable with BNC connectors at each end. These cables are very similar to those used by the video industry, with one key difference: their characteristic impedance. Video coaxial cable is typically 75 Ohm, whereas wireless system operators use 50 Ohm coaxial. (The ohm, Ω, is the unit of impedance – the cable's opposition to the signal – and the cable's characteristic impedance should match the rest of the system.) If you were to mistakenly use a 75Ω video coaxial cable in a 50Ω system, part of the signal would be reflected at every mismatched connection and signal transfer would suffer, so please do take extra care.
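For the curious, the cost of a 75Ω/50Ω mismatch can be quantified with standard transmission-line formulas (textbook math, not Shure figures); a minimal sketch, assuming a purely resistive mismatch at the junction:

```python
import math

def reflection_coefficient(z_load: float, z0: float) -> float:
    """Fraction of the voltage reflected at an impedance discontinuity."""
    return abs(z_load - z0) / (z_load + z0)

gamma = reflection_coefficient(75, 50)               # 0.2
vswr = (1 + gamma) / (1 - gamma)                     # 1.5, i.e. a 1.5:1 VSWR
mismatch_loss_db = -10 * math.log10(1 - gamma ** 2)  # ~0.18 dB per junction

print(gamma, round(vswr, 2), round(mismatch_loss_db, 2))
```

As the comment thread below notes, the loss per junction is small on its own, but reflections are still bad RF practice; 50Ω cable throughout remains the safe choice.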

Let us now consider the chart below, which demonstrates the varying signal loss in two different 50Ω RF cables.


For the sake of example, let’s say our radio microphone puts out 10 milliwatts (mW) of RF power and operates at 600MHz. Let’s also assume there’s no path loss (i.e. no signal loss from the transmitter to the antenna). In this idealized example, our antenna sees all 10 mW of power, which lets us demonstrate the signal loss of two common RF cables.

Knowing that 3dB of signal loss cuts the power roughly in half, we can use the two examples (RG-58 and RG-213) to approximate the loss through each cable. Over this run, RG-58 would take the signal strength down from 10 mW to around 3 mW (about 5.2 dB), which is quite a lot of loss. RG-213, on the other hand, loses only 1.9 dB, leaving roughly 6.5 mW at the output.
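The arithmetic above follows from the standard decibel relation P_out = P_in × 10^(−loss/10); a quick sketch:

```python
def output_power_mw(input_mw: float, loss_db: float) -> float:
    """Power remaining after loss_db of attenuation."""
    return input_mw * 10 ** (-loss_db / 10)

print(round(output_power_mw(10, 3.0), 1))  # 5.0 -> 3 dB is roughly half power
print(round(output_power_mw(10, 5.2), 1))  # 3.0 -> the RG-58 case
print(round(output_power_mw(10, 1.9), 1))  # 6.5 -> the RG-213 case
```

Because dB are logarithmic, every additional 3 dB of cable loss halves the power again, so long lossy runs add up quickly.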

In an ideal world, we would have no signal loss across our wireless setup, so it’s important to compensate for any loss using a signal booster or an active antenna. See the screenshot below for an example of how active antennas can boost the signal significantly and compensate for loss.


Learn More

To learn more about RF cables and other wireless best practices, consider attending one of our Wireless Mastered or Wireless Workbench training sessions. To ensure you don’t miss our next whiteboard session video, please subscribe or simply enter your details below.


Marc Henshall

Marc forms part of our Pro Audio team at Shure UK and specialises in Digital Marketing. He also holds a BSc First Class Hons Degree in Music Technology. When not at work he enjoys playing the guitar, producing music, and dabbling in DIY (preferably with a good craft beer or two).




  • Noel User says:

    Have you ever tried quad shielded RG-59 coaxial cable?
    I use it for my UHF scanner antenna [800MHz] and there is no difference between it and 50 ohm coax except it costs a lot less.
    Want to bet your wireless microphone and/or antenna actually has a 50 ohm impedance?

  • Peter Alcock says:

    Do the losses in a case like this (using 75 Ohm coax) vary with frequency?

    Also, I used a couple of TOA aerials for an installation recently, and they specifically said to use 75 Ohm cable, so what’s going on here? They worked fine, by the way.

    Is the characteristic impedance of the cable determined by the aerial design or the receiver design? I’m confused!

    • Tom Colman says:

      Hi Peter,

      Yes, you can use 75 Ohm cable for short jumpers, but the loss over long distances will be greater than with 50 Ohm cable. And yes, losses will vary with frequency, but the cable specification should tell you what they are.

      As for the TOA cables, you’d need to check with TOA why they specifically recommend 75 Ohm. I’m not surprised they worked though, as per my comment above.

      The cable impedance is determined by the impedance of all the components in the system. The below text is taken from this Shure Educational Guide:

      Ideally, for minimum signal loss in antenna systems, all components should have the same impedance: that is the antennas, cables, connectors and the inputs of the receivers. In practice, the actual losses due to impedance mismatches in wireless receiver antenna systems are negligible compared to the losses due to antenna cable length.


      • Steve Caldwell says:

        Yes, it is definitely not best practice to use the incorrect impedance cable. The impedance (Z) of coax is designed to maximize the power coupling between the source and the load. 75 Ohm coax was designed because it had lower loss for receive-only equipment. 50 Ohm coax was then created as a compromise between the common 75 Ohm used for receivers and the 30-odd ohms that a natural half-wave antenna likes to present at its feed point. 50 Ohm cable then became the standard for transmitting equipment – good for low loss, and good for maximum power transfer to an antenna. Most antennas that are ’50 ohm’ are only so because of a design change to create that impedance.

        Most implications of using 75 Ohm cable in a 50 Ohm system can only be measured using a VNA or other sensitive equipment. The mismatch manifests itself as a reflection at the join between the two impedances, which in turn is a loss in power transfer.

        Bad RF practice all the same.


        • Steve Caldwell says:

          Oh, I should add that 75 Ohm cable has lower loss than its similarly sized 50 Ohm counterpart, regardless of frequency. It’s only when they are used incorrectly that the losses increase.


  • Steve Caldwell says:

    SocratesWept is correct. The characteristic impedance of a piece of coax is designed to interface with the source and load impedance of the items connected to it. The theory is, if you were to measure the resistance of an infinitely long piece of coax, you would read 50 or 75 ohm. As he said, you can actually use a piece of 75ohm coax in a pinch, and the losses can be calculated as a Standing Wave Ratio (VSWR) of 75/50 or 1.5:1. This mismatch does of course occur at both ends of the feedline, as a source Z mismatch, and a load Z mismatch.

  • SocratesWept says:

    75R and 50R are not the resistance of the cable. Measure it with a DVM and see. 😉
    They are the impedance of the cable. It is quite possible to use 75R cable for short runs in 50R systems. The problem is not simple resistive losses; it’s reflections and power loss due to impedance mismatch.
