How much current can you get from the line transformer in your power supply? That's a really good question. Good luck on finding a good answer. The short answer, of course, is "it depends."

No surprise, the first thing it depends on is what the transformer is rated for. If you buy a 12 volt, 2 amp transformer and then try to get five amps from it, bad things are gonna happen. If you connect your transformer directly to a resistive load, you should have no problem getting the rated current. How often do we do that? More likely we are going to connect it in some configuration of rectifier and filter for a power supply. Now things get nasty. It is almost certain the filter is going to be a capacitor-input type. Now you are drawing power from the transformer in large pulses to charge the cap. Linearity is long gone.

If you look around you will likely find some rules of thumb based on the rectifier type: half wave, full wave, and full wave bridge. That's a start, but I have a strong distrust of rules of thumb, especially if I don't know where they came from. But, as a rule of thumb to the rules of thumb, figure on about the rated current for full wave and half the rated current for the half wave and full wave bridge. But maybe we can do a little better.

There are other factors, like peak current and magnetic flux density in the transformer core, but typically the limiting factor will be heat generated by I²R losses in the windings and by core losses. If we knew all the parameters of the transformer construction we could run through some really nasty math and figure out theoretically what the transformer could handle in our circuit. Almost certainly we don't know more than the volts and amps the transformer is rated for, so even if we wanted to do it that isn't really an option. Besides, I REALLY don't want to run through that math. Of course, then we would still have to test it!

So, maybe we just build it and test it and skip all the math? Well, almost all the math. We still need a bit of arithmetic, but none of the calculus and differential equations needed to do it "right." I had often thought that what we need to do is measure the temperature rise of the transformer. But that's not so easy. It is the core we need to measure, and it isn't readily accessible. If part of it is exposed, it is subject to rapid cooling and won't give an accurate indication. I didn't pursue it further.

Then, lo and behold, I was reading through an old issue of Radio Electronics Annual on the World Radio History website. In the RE Annual 1985 issue I came across an article by Mannie Horowitz describing how to measure the temperature rise by measuring the resistance of the windings. It had occurred to me this might work, but I had never followed through. The article starts on pg 75 of the magazine, pg 71 of the PDF. It is the final installment of a series on how to design power supplies. The general procedure is quite simple. All temperatures MUST be Celsius! I will elaborate after:

1. Measure ambient temperature as Ta.
2. Measure resistance of winding as Rcold.
3. Run transformer in circuit under use for 8 hours.
4. DISCONNECT ALL POWER and immediately measure resistance of winding as Rhot.
5. Calculate deltaT = 254 * ((Rhot - Rcold) / Rcold).
6. Add deltaT to Ta to get Tcore, the internal temp of the transformer.
7. If Tcore >= 90 you should be concerned. If Tcore >= 105 the transformer is overheating and not suitable.

That's all there is to the basic procedure. I reworded it slightly from the article, but it is essentially the same.
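The whole procedure boils down to one line of arithmetic, so here is a minimal sketch in Python. The function name and the example readings are mine, not from the article; only the formula and the 90/105 limits come from it.

```python
# Sketch of the resistance-rise method from the Horowitz article.
# The constant 254 is 1/0.00393, the inverse of copper's temperature
# coefficient (explained further below). All temperatures in Celsius.
K_COPPER = 254

def core_temp(t_ambient_c, r_cold_ohms, r_hot_ohms):
    """Estimate internal transformer temperature from winding
    resistance measured cold and immediately after shutdown."""
    delta_t = K_COPPER * (r_hot_ohms - r_cold_ohms) / r_cold_ohms
    return t_ambient_c + delta_t

# Hypothetical readings: 20 C shop, 10.0 ohms cold, 12.2 ohms hot.
# deltaT = 254 * 0.22 ~= 55.9, so the core is roughly 75.9 C --
# comfortably below the 90 C "be concerned" level.
print(round(core_temp(20.0, 10.0, 12.2), 1))
```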

Now, let's elaborate some. First, as mentioned, all temperature measurements must be in Celsius. The constant 254 as well as the 90 and 105 would need to change to use any other temperature scales.

The constant 254 is the inverse of the temperature coefficient of copper, 0.00393 per degree Celsius, or 0.393%. The resistance of copper goes up 0.393% per degree Celsius. By taking the difference between the hot and cold resistances, then dividing by the cold resistance, we get the fractional change in resistance from cold to hot. Dividing that fraction by 0.00393, or equivalently multiplying by 1/0.00393 = 254, gives the temperature rise. Adding the rise in temp to the ambient temp gives us the internal core temp. Simple!

But we need to be careful. We need to ensure the core temp starts at the temp we record as ambient. To do that, make sure the transformer sits in that temperature environment at a stable temp for several hours. Eight hours at a constant room temperature should be enough for most any transformer.

Another thing to keep in mind is that the change in temp will be the same no matter what the ambient temp is. So if you do this test in an air conditioned shop with an ambient temp of 20C and the resulting core temp is 88C (delta T 68), then put the finished supply in your attic where it routinely exceeds 40C, the core temp will STILL go up 68C to more than 108C, which is very bad. So make accommodations for the environment the device will end up living in later.
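The attic scenario above works out like this, using the numbers from the paragraph:

```python
# Delta T is set by the transformer and its load, not by the room.
# Bench measurement: 88 C core at 20 C ambient, so a 68 C rise.
delta_t = 88.0 - 20.0

# Project that same rise onto the attic's 40 C ambient.
attic_ambient = 40.0
attic_core = attic_ambient + delta_t
print(attic_core)  # 108.0 -- past the 105 C limit, not suitable there
```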

And of course, BE SAFE! Be absolutely certain the transformer is unplugged from the wall mains when measuring it in any way. It is also important to disconnect it from the power supply circuitry when measuring the resistance of the windings. The circuitry will most likely change your readings. Quick disconnects are a good idea.

under construction...