eHam

eHam Forums => Elmers => Topic started by: N3JBH on September 19, 2008, 07:23:30 AM



Title: Adjustable Tuned Inputs and Cable length
Post by: N3JBH on September 19, 2008, 07:23:30 AM
Ok, I would love to know why comments like this are made. It appears to me that some folks claim that the feed line length from your transceiver to your amplifier makes a difference.  Here are two examples.  Example one was taken from http://www.somis.org.

"The radio’s output filter reactance and the amplifier’s tuned input reactance do not interact, and then it should make no difference what length of coax is used between the radio and the amplifier. (But it definitely does.)"

The second example was taken from the Collins 30L1 manual, which states that a cable 20 feet 5 inches long is included, and that this length provides for slightly lower system distortion; however, other lengths can be used for convenience.

Finally, I have read here many times, from a gentleman for whom I have huge respect, that this is all silliness. But my question is: what merit, if any, does using a properly measured feed line between the transmitter and amp serve?  And what would the ideal length be, given that we have only one cable feeding a multi-band HF amp?


Title: Adjustable Tuned Inputs and Cable length
Post by: KB1GTX on September 19, 2008, 08:04:00 AM
It's my understanding there should be a 180 degree phase shift, caused by a half wave of coax between the amp and radio, which is supposed to help prevent oscillations.


Title: Adjustable Tuned Inputs and Cable length
Post by: KE3HO on September 19, 2008, 09:37:39 AM
<<<  "It's my understanding there should be a 180 degree phase shift, caused by a half wave of coax between the amp and radio, which is supposed to help prevent oscillations." >>>

Oh my, sigh. This will be an interesting thread.

The amplifier amplifies whatever is presented at its input. Let's assume, just for a minute, that a half-wavelength piece of coax gives a 180 degree phase shift, as you say. Just exactly how is this going to prevent oscillations? How is the tube supposed to "know" that your rig is 180 degrees ahead of what it is amplifying?

If the input to the amplifier is tuned to near 50 + j0 ohms, then the length of the coax (assuming 50 ohm coax) between the rig and the amp makes no difference.

73 - Jim


Title: Adjustable Tuned Inputs and Cable length
Post by: N5LRZ on September 19, 2008, 10:01:31 AM
A half-wavelength section of coax will present the same impedance at its input as the load on its output, regardless of the characteristic impedance of the coax itself.

Thus you can load a 50 ohm antenna with 72 ohm coax and have the radio side show 50 ohms at the radio.  HOWEVER, this is only at one single frequency.

The farther away you get from this sweet frequency, the more the impedance transformation of the coax comes into play.  The standing waves begin to kick in and you are back to reality.

This is great if you transmit only on one frequency and you have a perfect 50 ohm antenna.  BUT, who has ever had a perfect 50 ohm antenna and transmitted only on ONE SINGLE frequency?

Basically, the half-wave trick in actual practice/real-world function is a bunch of BS, given the realities of the real world and real-world practices.
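The half-wave repeat described above is easy to check numerically with the standard lossless-line impedance formula. This Python sketch uses the 72-ohm / 50-ohm values from the post; the 0.6-wavelength case is a hypothetical "off-frequency" length added for illustration:

```python
import math

def line_input_impedance(z_load, z0, length_wavelengths):
    """Input impedance of a lossless line of characteristic impedance z0,
    terminated in z_load (standard transmission-line equation)."""
    t = math.tan(2 * math.pi * length_wavelengths)  # tan(beta * l)
    return z0 * (z_load + 1j * z0 * t) / (z0 + 1j * z_load * t)

# Exactly a half wave: the 72-ohm line repeats the 50-ohm load.
z_half = line_input_impedance(50, 72, 0.5)   # essentially 50 + j0

# The same physical cable at a different frequency is no longer a half
# wave, and the mismatched line transforms the load noticeably.
z_off = line_input_impedance(50, 72, 0.6)    # complex, reactive
```

At exactly 0.5 wavelength the result is 50 ohms despite the 72-ohm line; at 0.6 wavelength the same cable presents a complex impedance, which is the "standing waves kick in" effect the post describes.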


Title: Adjustable Tuned Inputs and Cable length
Post by: W5DXP on September 19, 2008, 10:27:34 AM
I once had a small 2m Heathkit amplifier driven by an HT. There was a definite oscillation problem with certain lengths of coax between the HT and the amp. Consider that the piece of coax acts like a stub when the HT is in receive mode, and could cause a phase shift that results in positive feedback to the amp input.
--
73, Cecil, w5dxp.com


Title: Adjustable Tuned Inputs and Cable length
Post by: WA3SKN on September 19, 2008, 11:09:50 AM
If line length makes a difference, then you are either not working with the right levels or the right impedance.  A multiband radio and multiband amplifier would make this solution moot!  You would have to design for multiband operations.
73s.

-Mike.


Title: Adjustable Tuned Inputs and Cable length
Post by: NA3CW on September 19, 2008, 12:52:09 PM
Many years ago I worked for a manufacturer of linear amplifiers who shall remain nameless.  A coax cable's length makes no difference IF there is a good resistive match at all frequencies of interest.  In case of an amplifier's tuned inputs, it's most definitely not a resistive match at all frequencies.  We had "interesting" things happen on 10 and 15 meters with certain lengths of cable.  For example, a short length of cable, imperfectly matched, can present added capacitance to the tuned circuit input of the amplifier, thus detuning it.  In our case it would cause amplifier efficiency to suffer, presumably because of waveform distortion.  The other effects have faded in memory due to the passing of years.  

My two cents.

73,
Chuck


Title: Adjustable Tuned Inputs and Cable length
Post by: N6AJR on September 19, 2008, 01:52:11 PM
I am trying to remember the reason for the exact length on the Collins amp setup.  I have heard it before.  Perhaps it creates enough DC resistance in the line to load the output properly. I really don't remember, but I do know this was mentioned.. any Collins experts out there???


Title: Adjustable Tuned Inputs and Cable length
Post by: N3JBH on September 19, 2008, 02:52:51 PM
Hey big brother, what makes a true friend ??? :)


Title: Adjustable Tuned Inputs and Cable length
Post by: N6AJR on September 19, 2008, 04:14:36 PM
What a true friend is :

Someone who will wipe your derriere when you have broken both arms and have them in casts..

now that's a true friend..

later little brother


Title: Adjustable Tuned Inputs and Cable length
Post by: WB2WIK on September 19, 2008, 04:54:04 PM
Nah.  A friend is someone who will help you move.

A true friend is someone who will help you move the body.

But re line lengths:

Unless the input of the amplifier is swamped by a lossy resistive network having a terminal resistance of 50 Ohms, all amplifier inputs are reactive.  Even those with a tuned input can only provide a good match at one frequency, and at one cathode impedance (for GG tube amps) which will depend upon cathode current and, in turn, drive level.  Nothing's perfect.

Solid state amps would be even worse if not for the fact that the high powered ones use FETs which have so much gain the designers throw a lot of that away with input attenuators built into the amps.  Those attenuators "lose" power but also improve matching.

VHF solid state amps are terrible about this: They only provide a reasonable match over a narrow frequency range and power level, and because most of them are bipolar and don't have gain to spare, they don't use input attenuators.  As such, their input Z varies all over the place and is rarely a good match to 50 Ohms.

In such a case, the patch cable between the exciter and the amp is part of a critically tuned matching network, and of course its length is important.  With my SS bipolar amps on 50-144-222-440 MHz, I can "make" or "break" each installation by changing the patch cable length, in some cases by only an inch or two.

Tetrode amps with grids driven are a lot more forgiving because they have so much excess gain, it's easy to "throw away" 6-10 dB of the available gain with a great resistive pad on the input.  A 6 dB pad on the input of an amp means the SWR can never be above 2:1, even if the circuit that follows is an open circuit or a short circuit.   That's nirvana when it comes to keeping things very stable and making all the hardware happy.
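The pad arithmetic behind that claim can be sketched in a few lines of Python (the 6 dB value is from the post; the 10 dB case is added only for comparison). The reflected wave traverses the pad twice, so the input return loss is at least twice the pad's attenuation no matter what is connected behind it:

```python
def worst_case_swr(pad_db):
    """Worst-case SWR looking into a matched resistive pad whose far end
    is totally mismatched (open or short): the reflection is attenuated
    going in and coming back, so return loss >= 2 * pad_db."""
    gamma = 10 ** (-2 * pad_db / 20)   # worst-case reflection magnitude
    return (1 + gamma) / (1 - gamma)

print(round(worst_case_swr(6), 2))    # 1.67 -- comfortably under 2:1
print(round(worst_case_swr(10), 2))   # 1.22
```

A 6 dB pad actually bounds the SWR at about 1.67:1, even a bit better than the 2:1 figure quoted.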

WB2WIK/6


Title: Adjustable Tuned Inputs and Cable length
Post by: WB6BYU on September 19, 2008, 05:22:50 PM
So to summarize:

If the input to the amplifier is really 50 ohms resistive
at all frequencies of interest, it shouldn't matter what
length of coax cable you use.

If it isn't 50 ohms resistive, then the length of the
cable will affect the load that the exciter sees.  I suspect
that the Collins amplifier had an odd input impedance
on some bands that some rigs had trouble driving.  (Many
tube rigs could match loads higher than 50 ohms but ran
out of loading capacitance much below that on some bands.)
In that case, the manufacturer selected a length of coax
that provided impedances that were easier to match.  They
may even have tuned the input stage to present a resonant
load to the rig given this specific cable length.  Using
other length cables would still work, but required the
exciter to match other impedances which might not be as
efficient.

The important point here is that this is a specific
situation with a load impedance that is NOT 50 ohms.
You can't get a 50 ohm input impedance connecting 50 ohm
coax to such a load, but you can transform, say, 25 ohms
to 100 ohms, which many rigs are happier to drive.  Any
such coax has to be a compromise covering all bands,
since the length won't consistently be a quarter wave
or some multiple thereof over all the bands that the
amp operates.
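The 25-to-100-ohm transformation mentioned above is the classic quarter-wave result, Zin = Z0^2 / ZL. A minimal sketch with the numbers from the post, assuming 50-ohm coax:

```python
def quarter_wave_input(z0, z_load):
    """Impedance at the input of a quarter-wave line: Zin = Z0**2 / ZL."""
    return z0 ** 2 / z_load

# A quarter wave of 50-ohm coax turns a 25-ohm amplifier input into
# 100 ohms at the exciter -- a load many tube rigs find easier to match.
print(quarter_wave_input(50, 25))   # 100.0
```

Note the transformation works both ways: the same line presents 25 ohms to the rig if the amp input happens to be 100 ohms.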

So there may be some situations where a particular length
of coax makes it easier to match a load, but you can't
extend that to a general case.


Title: Adjustable Tuned Inputs and Cable length
Post by: G3RZP on September 20, 2008, 01:35:35 AM
The length of coax should be 90 degrees or a multiple thereof. This is so that as the input impedance of the amplifier varies over the RF cycle (as it will!) the load presented to the driver remains resistive, and so the introduction of phase distortion is avoided. Even Class A amplifiers have some variation in input capacitance caused by changes in electron density over the RF cycle.

The choice of an even or odd multiple of 90 degrees depends upon the total phase shift required between the driver and the amplifier. A tetrode driver has a high source impedance, and a 90 degree phase shift makes this look like a lower impedance drive source with better regulation.

This is a summary from Pappenfus, Bruene and Schoenike, 'Single Sideband Principles and Circuits'. The book is aimed mainly at tube equipment, but the principles are the same for solid state. It's a book well worth getting.


Title: Adjustable Tuned Inputs and Cable length
Post by: W8JI on September 20, 2008, 04:05:50 AM
I can answer that...

<< Example one was taken from http://www.somis.org.

"The radio’s output filter reactance and the amplifier’s tuned input reactance do not interact, and then it should make no difference what length of coax is used between the radio and the amplifier. (But it definitely does.)" >>>>>

The answer is, the author does not understand transmission lines or amplifier input circuits.

If you look in ANY transmission line textbook you will see the capacitance of the line has NOTHING to do with the line SWR or the tuning of the load, except as it determines the transmission line's surge impedance.

Any one of us who knows the basics of transmission lines knows that the load exclusively determines SWR, and if we match the line to the load, it is matched for ANY line length. If it has standing waves, it has standing waves for ANY line length, and they are always the same level except for loss along the line.

Never anywhere in any real peer reviewed textbook do you see the line capacitance being "part of the input matching".
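That length-independence is easy to verify numerically for a lossless line. This sketch uses an arbitrary hypothetical mismatched load of 100 + j25 ohms: the input impedance moves around with length, but the SWR it implies never does.

```python
import math

Z0 = 50
Z_LOAD = 100 + 25j   # hypothetical mismatched load, for illustration

def z_in(length_wavelengths):
    """Input impedance of a lossless 50-ohm line of the given length."""
    t = math.tan(2 * math.pi * length_wavelengths)
    return Z0 * (Z_LOAD + 1j * Z0 * t) / (Z0 + 1j * Z_LOAD * t)

def swr(z):
    """SWR implied by impedance z in a 50-ohm system."""
    gamma = abs((z - Z0) / (z + Z0))
    return (1 + gamma) / (1 - gamma)

# Zin differs at every length, but the SWR is identical for all of them:
values = [swr(z_in(l)) for l in (0.1, 0.23, 0.5, 1.37)]
```

All four values come out the same (about 2.16:1) because, with no line loss, the load alone sets the standing-wave ratio.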

Now here's one you really will love....

<<The second example was taken from the Collins 30L1 Manual States that a cable that is 20 feet 5 inches long is included and that this length provides for slightly lower system distortion how ever other length’s can be used for convenience>>

All that 180 degrees stuff and so on is just some weird idea about distortion and load lines. If you actually measure the electrical length of the line and the networks in the Collins manuals where they make that claim, they very clearly are telling a big tall tale. The electrical degrees of the pi network in the exciter, of the line, and of the phase shift in the input networks can only total the claimed number of electrical degrees on one or two frequencies.

That's an absolute fact, and any of you can prove it if you take the time.

Now I don't know why someone put that in the Collins manuals, but I suspect they want a certain cable length in some amps because the amps are unstable. If you take a 30L1 amplifier for example and put it on ten or fifteen meters and unterminate the input while it is keyed, it will oscillate under certain conditions of tuning and loading.  

If you add enough input cable attenuation you can terminate the input port and the uncontrolled unwanted feedback is much less likely to make the amp unstable.

My guess, and I have no idea for sure, is that some engineer decided not to properly neutralize the amplifier and simply told customers to use a certain length of cable. If that length is not used, the amplifier input is not properly terminated against unwanted regenerative feedback. It would be pretty rough for someone to say "we didn't neutralize this amplifier, so on the higher bands you might get into stability problems if you use a short cable into a high-Q network on an exciter".

Better to make a shortfall a "feature".

One thing we know for sure, most other manufacturers including Heathkit neutralized their 4x811A amplifiers. So did Gonset, and so does Ameritron. You can use any cable length you want with them, and they are stable and the IMD does not change.

Collins is the only one that didn't neutralize these tubes, and the 30L1 is the only amp that actually does change IMD in a two-tone test as you vary input cable lengths on 20 through ten meters.


I can't say why they put that in the manual, but I can say one thing. If you neutralize the 30L1 or the 30S1 amps and then run a two-tone IMD test, the IMD no longer changes on the higher bands with input cable length. They are also unconditionally stable under any case of input termination if you do that.

You be the judge.

73 Tom


Title: Adjustable Tuned Inputs and Cable length
Post by: W8JI on September 20, 2008, 06:54:11 AM
<< The length of coax should be 90 degrees or a multiple thereof. >>

Nonsense. The tank adds phase delay as does the input circuit. That delay varies with tuning and other adjustments, and it even varies with dynamic impedance variations in the PA stages. It varies with frequency too.

If you actually look at a 30L1 and KWM-2 combo for phase delay, you'll see it isn't the phase delay they claim (not even close) on any band, let alone on all bands. Not only that, the Q of the tank circuits prevents any variation in impedance over the RF cycle.

What you will find is the IMD varies with tuning and loading and line length an abnormal amount, being most critical with short lines. The problem is the 30L1 has terrible stability above 20 meters. It makes a darned good oscillator on 28 MHz.

When I ran IM tests on my Collins gear I found it was very critical for feedline loss and feedline length up near 15 meters. The problem, however, was the stability of the amplifier. When I added a neutralizing circuit in my 30L1, just like Ameritron, Heath, and Gonset used in their 811 amps, the problem of critical feedline length vanished.

A big attenuator at the amp did much the same, but a hi-Q tuned circuit actually made a good IM tweaker.


>> This is so that as the input impedance of the amplifier varies over the RF cycle (as it will!) the load presented to the driver remains resistive, and so the introduction of phase distortion is avoided. >>

Sorry, but the input circuit and tank circuit or low pass filter acts as a flywheel, and any fractional-cycle variations are long gone with even a very modest Q. It takes many dozens of RF cycles (or more) for the envelope to change in most pieces of gear we have, because of the bandwidth limits of filters and tank circuits. It certainly cannot change in a fraction of a cycle.

Just to allow an impedance variation in a fraction of a cycle on 7 MHz the bandwidth of the output filter, feedline, and input circuit would have to be well over 14 MHz wide. Any radio we can use on the air and not have the FBI, CIA, FCC, neighbors, and hundreds of angry Hams knocking at our door would by necessity NOT be able to transfer a fractional cycle variation to or from the load.
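The bandwidth argument is simple arithmetic. In this sketch the 7 MHz figure is from the post and the Q values are illustrative: a tuned circuit's 3 dB bandwidth is roughly f0/Q, so even a Q of 1 falls far short of the 14-MHz-plus needed to pass fractional-cycle impedance swings.

```python
F0 = 7.0e6   # 40-meter operating frequency from the example above

def bandwidth_hz(q):
    """Approximate 3 dB bandwidth of a single tuned circuit: f0 / Q."""
    return F0 / q

for q in (1, 2, 10):
    print(f"Q={q}: BW = {bandwidth_hz(q) / 1e6:.2f} MHz")
# Q=1 gives 7 MHz, Q=2 gives 3.5 MHz -- both well under 14 MHz.
```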

<<Even Class A amplifiers have some variation in input capacitance caused by changes in electron density over the RF cycle.>>

Meaningless. We don't use direct coupled oscilloscope style amplifiers in our signal processing stages. Even a network with a Q of 1 or 2 would severely restrict impedance variations.

If you have an engineering problem you don't want to fix, you make it an engineering feature.


73 Tom


Title: Adjustable Tuned Inputs and Cable length
Post by: G3RZP on September 20, 2008, 08:40:07 AM
Actually, I meant to say that the length of the coax plus the phase shift in the various tank circuits should be a multiple of 90 degrees, which is a very different thing altogether. Sorry about that.

There is an impedance change with input level, especially in classes with grid current. Reducing the source impedance helps this. This can be done with NFB, but that can lead to extra distortion in the driver stage, which the use of suitable cable plus tuned circuit phase shifts, combined with correctly applied NFB (over not more than two stages if you want it unconditionally stable), can improve. See the IEE HF Communication Conference document of 1963. For a GG stage, it's important to have a low impedance source at the second harmonic, so a pi network should be used rather than an L-C-L tee. As said previously, even Class A stages suffer from this phenomenon, although at low levels it's not too noticeable, and at high levels Class A isn't generally used. As gain varies slightly with signal level, you find it in MOS devices, where the Miller effect leads to a small change in input impedance as gain varies.

As far as the Collins amps are concerned, I doubt they have a very low input SWR. This may lead to there being a magic length of cable, either to prevent oscillation or to provide an SWR that the driver could work into on all bands - or both. I once had an Atlas 210 to review. On 15m, it needed a special length of coax (although they didn't say so) between the tuner and the rig if it wasn't to oscillate. The impedance transfer from the tuner via the filters was such that the PA load impedance fell into a spot on the Smith Chart where the devices were unstable.

In reality, the length of cable should be sufficient to reach from the driver output socket to the amplifier input socket with some slack....


Title: Adjustable Tuned Inputs and Cable length
Post by: W8JI on September 20, 2008, 05:17:13 PM
By the way, all amplifier inputs are NOT reactive. All the amplifier inputs that WB2WIK has might be reactive, but it is certainly possible to have a 1:1 SWR with negligible reactance on any amplifier if it:

1.) Is stable

2.) Has a properly designed input system

Granted, there are a lot of poorly matched amps out there. Saying all amplifiers have reactance is wrong however. There is no reason they have to have mismatch on the input other than design problems.

Now grounded grid amps, because of the inherent very high levels of negative feedback, do show input impedance changes with tuning and loading unless the tube has a very high ratio of anode to cathode impedance.

Amps that draw grid current also have dynamic variations in grid to cathode impedance, but a good design swamps that out so the exciter doesn't see it.

73 Tom

 


Title: Adjustable Tuned Inputs and Cable length
Post by: N3JBH on September 20, 2008, 07:06:22 PM
Thanks Tom, I knew you would make sense out of all this stuff for me. You know you were the guy I said I had such high respect for, and I treasure all the information I learn from you. Jeff