On Tue, 2009-01-27 at 21:15, [hidden email] wrote:
...
> Interesting thing is the current draw at each power level was
> not what I expected. I expected the higher voltage to lower
> the current at each power level. It did, but by a minuscule amount.
> At 10 watts the difference was 0.03 amps. At 100 watts the
> difference was 0.02 amps. I expected a lot more.
>
> The endpoint voltage for the long cable was 12.7 V.
> For the short cable it was 13.9 volts. I kind of expected
> the 100-watt amperage to decrease in the ratio of these two
> voltages. Doing the math, the expected decrease in current draw
> would have been 1.6 amps versus the 20 milliamps actually measured.
>
> Can someone explain this anomaly?
> Brian/K3KO
Interesting.
I believe the power supply voltage affects the maximum possible power
but has very little effect on the current at any given power level.
It makes sense if you think about it. Because there is a fixed
impedance transformation ratio between the PA transistors and the
antenna, the current in the transistors tends to be directly
proportional to the current in the antenna. And the antenna current is
set entirely by the power delivered into that fixed load (I =
sqrt(P/R)), so at a given power level the transistor current is
essentially the same no matter what the supply voltage is. The supply
voltage only limits the maximum voltage swing, and therefore the
maximum power.
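Here's a rough back-of-the-envelope sketch of that, assuming an
idealized class-B push-pull stage and a made-up effective load (the
transformation ratio and load values are illustrative guesses, not
actual K3 design numbers):

# Idealized class-B PA with a FIXED impedance transformation to the
# antenna.  All component values are illustrative assumptions.
from math import pi, sqrt

R_ANT = 50.0            # antenna load, ohms
RATIO = 62.5            # assumed fixed impedance transformation ratio
R_L = R_ANT / RATIO     # effective load seen by the transistors (0.8 ohm)

def dc_current(p_out):
    # Peak RF current into the fixed load is sqrt(2*P/R_L); an ideal
    # class-B push-pull pair draws (2/pi) of that peak from the supply.
    return (2.0 / pi) * sqrt(2.0 * p_out / R_L)

def max_power(v_supply):
    # Peak voltage swing is limited to roughly the supply voltage.
    return v_supply ** 2 / (2.0 * R_L)

for v in (12.7, 13.9):
    print(f"Vsupply={v:4.1f} V  I at 100 W = {dc_current(100.0):5.2f} A"
          f"  Pmax = {max_power(v):5.1f} W")

# Note that dc_current() never uses the supply voltage: the higher
# supply raises Pmax (~101 W vs ~121 W here) but not the current drawn
# at 100 W.

In a real PA the efficiency and bias current do shift a little with
supply voltage, which is presumably where Brian's small 0.02-0.03 amp
differences come from.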
In a tube-type amplifier with a PI matching network, you can re-adjust
the impedance matching for maximum efficiency: the optimum load
resistance rises roughly as V^2/(2P). So for the same power level,
a higher PS voltage results in lower current.
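Continuing the same idealized sketch, if the matching is re-tuned so
the tube always sees its optimum load of about V^2/(2*P), the peak RF
current becomes 2*P/V, and the supply current now scales inversely
with the supply voltage, which is the behavior Brian expected:

from math import pi

def dc_current_retuned(p_out, v_supply):
    # With the load re-tuned to R_L = v^2/(2*P), the peak RF current
    # is I_pk = v/R_L = 2*P/v, so current falls as the voltage rises.
    # (Same push-pull class-B conduction factor as above, for a fair
    # comparison; a single-ended tube stage would differ by a constant.)
    return (2.0 / pi) * (2.0 * p_out / v_supply)

for v in (12.7, 13.9):
    print(f"Vsupply={v:4.1f} V  "
          f"I at 100 W = {dc_current_retuned(100.0, v):5.2f} A")

# Prints ~10.03 A vs ~9.16 A: the ~9% drop tracks the 12.7/13.9
# voltage ratio, unlike the fixed-transformation case above.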
Al N1AL