Yes, that is almost true, apart from a very small copper loss in the primary winding that carries the small magnetising current.
The core loss (iron loss) depends on the applied voltage. This loss is measured by the open-circuit test, carried out at the working voltage.
Iron losses (Pi) are independent of load and occur due to the pulsation of flux in the core. They include both hysteresis loss and eddy-current loss, and are the same at all loads.
It's easily done if you can measure the power drawn at the normal working voltage but with no load on the transformer (open-circuit secondary). All the power is core loss with the exception of a (very) small amount of resistive loss in the primary winding.
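The split described above can be sketched in a few lines. This is an illustrative calculation only, with made-up readings; the point is simply that the open-circuit wattmeter reading minus the tiny primary I²R term gives the core loss.

```python
# Illustrative open-circuit test calculation (all numbers are made up).
# With the secondary open, the wattmeter reading is almost entirely core loss;
# the only other component is the small I0^2 * R1 loss in the primary winding.

I0 = 0.5          # measured no-load (magnetising) current (A)
P_oc = 60.0       # wattmeter reading on open circuit (W)
R1 = 0.8          # primary winding resistance (ohm)

P_cu_primary = I0**2 * R1        # tiny resistive loss in the primary
P_core = P_oc - P_cu_primary     # the remainder is core (iron) loss

print(f"primary copper loss: {P_cu_primary:.2f} W")   # 0.20 W
print(f"core loss:           {P_core:.2f} W")         # 59.80 W
```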
Iron loss (which includes the core loss) is practically the same at all loads; the value of the copper loss is found from the short-circuit test.
Yes, transformer losses will be the same for any linear load with the same VA. However, if the load is nonlinear, such as a rectifier, the load current waveform will be distorted and the losses will be higher than with an undistorted sinusoidal load current of the same VA.
No-load current is excitation current, and is usually specified as a certain percent of the base kVA rating at a specific voltage (often tested at 90, 100, and 110% of rated voltage). This can be found in the test report for the specific transformer in question, or should be supplied by the manufacturer. Note it is specified as a percent of base, so if energized from the high side, the current will be less than from the low side - the transformer's excitation takes the same amount of power regardless of energizing voltage. You could test this roughly yourself by applying rated voltage to the low side and measuring the excitation current. The full-load current for a single-phase 11 kVA transformer would be: 11 kVA / (L-N voltage).
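The final formula above is a one-liner; here it is with an assumed line-to-neutral voltage of 230 V (the actual voltage depends on the transformer in question).

```python
# Full-load current of a single-phase 11 kVA transformer.
# The 230 V line-to-neutral voltage is an assumption for illustration.

S = 11_000.0   # rating in VA
V_ln = 230.0   # assumed line-to-neutral voltage (V)

I_full_load = S / V_ln
print(f"full-load current: {I_full_load:.1f} A")   # 47.8 A
```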
If no other parameter changes, there will be only a slight increase in core loss. However, if the frequency is increased 10 times, the transformer will be able to accept a 10-times-higher voltage under the same load conditions without saturating. This effectively increases the transformer's power capacity, and the core loss will also increase to a similar degree.
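One way to see the frequency scaling is with the classic Steinmetz split of core loss into a hysteresis term (roughly proportional to f·B^1.6) and an eddy-current term (roughly proportional to f²·B²). The coefficients below are illustrative assumptions, not data for any real core; if voltage rises in step with frequency, the peak flux density B stays constant and both terms grow.

```python
# Rough Steinmetz-style core-loss model; kh and ke are illustrative constants.

def core_loss(f, B, kh=1.0, ke=0.01):
    """Hysteresis term (~ f * B^1.6) plus eddy-current term (~ f^2 * B^2)."""
    return kh * f * B**1.6 + ke * f**2 * B**2

f1, B = 50.0, 1.2
f2 = 10 * f1     # frequency (and applied voltage) raised 10x, so B is unchanged

p1 = core_loss(f1, B)
p2 = core_loss(f2, B)
print(f"core loss grows by roughly {p2 / p1:.1f}x")
```

With these assumed coefficients the eddy term dominates at the higher frequency, so the total grows faster than the 10x power-capacity increase; the exact factor depends entirely on the hysteresis/eddy split of the particular core.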
Yes, because the transformer heating (power losses) depends on the load current and the load voltage. The voltage can be assumed to stay more or less constant, so the iron loss is also constant. The copper loss depends on the square of the load current, so it is the VA of the load that determines the power loss and any heating.
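The loss model described above (constant iron loss plus load-squared copper loss) can be sketched as follows; the two loss figures are made-up illustrative values.

```python
# Usual transformer loss model: iron loss is constant, copper loss grows
# with the square of the per-unit load. Numbers are illustrative only.

P_iron = 100.0      # constant core (iron) loss (W)
P_cu_full = 400.0   # copper loss at full rated current (W)

def total_loss(load_fraction):
    """Total loss at a given fraction of rated VA."""
    return P_iron + load_fraction**2 * P_cu_full

for x in (0.25, 0.5, 1.0):
    print(f"{x:4.2f} pu load -> {total_loss(x):6.1f} W total loss")
```

At no load only the iron loss remains (100 W here), while at full load the copper loss dominates (100 + 400 = 500 W).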
The voltage of a transformer should be a sine wave but if the transformer is overloaded with excess voltage there could be nonlinear effects in the magnetic core that cause harmonics (i.e. departure from a sine wave) in the voltage. The current is determined by the load. If the load is resistive the current and voltage have the same waveform (by Ohm's law) but if the load is nonlinear, a diode rectifier for example, the current will depart from being a sine wave.
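Since copper loss depends on the true RMS of the winding current, harmonic content in a nonlinear load current raises the loss above what the fundamental alone would cause. The harmonic amplitudes below are invented for illustration, not taken from any real rectifier.

```python
# Sketch: RMS (and hence I^2*R copper loss) of a current made up of a
# fundamental plus harmonics, versus a pure sine of the same fundamental.
import math

def rms_from_harmonics(peak_amplitudes):
    """RMS of a waveform given the peak amplitudes of its harmonic components."""
    return math.sqrt(sum(a**2 / 2 for a in peak_amplitudes))

I_sine = rms_from_harmonics([10.0])            # pure sine, 10 A peak
I_dist = rms_from_harmonics([10.0, 3.0, 1.5])  # same fundamental + harmonics

R = 0.1  # illustrative winding resistance (ohm)
print(f"copper loss, pure sine: {I_sine**2 * R:.2f} W")   # 5.00 W
print(f"copper loss, distorted: {I_dist**2 * R:.2f} W")   # 5.56 W
```

In a real transformer the penalty is worse than this simple I²R picture suggests, because stray and eddy losses also increase with harmonic frequency.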
It doesn't, really. The power loss occurs via the resistance of the copper windings and eddy currents in the magnetic core. More details are given in the earlier answer to the same question.
Same
It remains the same.
The current flowing through a transformer's secondary is the current drawn by the load, so it will be exactly the same as the current flowing through your induction motor - assuming that is the load. I don't really understand the point of your question!