As a milliampere (milliamp, or just mA) is 1/1000th of an ampere, we can convert mA to amps by just dividing by 1000. Another way is to take the current in mA and move the decimal point three places to the left, which accomplishes the same division by 1000. Here's the scoop: 275 mA / 1000 = 0.275 amps. Note that the decimal in 275 is to the right of the 5, so it can be written as 275.0 (with a 0 added to show where the decimal is). Moving the decimal three places to the left gets us to .275 amps, though we usually hang a 0 in front of the decimal: 0.275 amps. To convert amps to milliamps, just multiply by 1000, or move the decimal three places to the right. Just the opposite of what we did here to convert the other way.
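Here's a minimal Python sketch of both directions of the conversion; the function names are just illustrative, not from any library:

```python
def milliamps_to_amps(ma):
    """mA -> A: divide by 1000 (move the decimal three places left)."""
    return ma / 1000.0

def amps_to_milliamps(amps):
    """A -> mA: multiply by 1000 (move the decimal three places right)."""
    return amps * 1000.0

print(milliamps_to_amps(275))    # 0.275
print(amps_to_milliamps(0.275))  # 275.0
```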
For three-phase current the formula used is: Power = 1.732 × V × I × power factor, where Power is in watts, 1.732 is the square root of 3 (for three-phase current), V is voltage, and I is current.
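As a sketch, here is the same formula in Python; the names and example numbers are illustrative, and a balanced three-phase load is assumed:

```python
import math

def three_phase_power(volts, amps, power_factor):
    """P = sqrt(3) * V * I * PF, in watts, for a balanced three-phase load."""
    return math.sqrt(3) * volts * amps * power_factor

# Illustrative example: 400 V line-to-line, 10 A, power factor 0.8
print(round(three_phase_power(400, 10, 0.8)))  # about 5543 W
```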
28.57 Amps.
1000 milliamps = 1 amp.
You don't. The units measure different things.
The prefix 'micro' means one millionth, so there are one million microamps in one amp.
The 'm' means milli, or thousandths. To convert amps to milliamps, multiply the number of amperes by 1000.
You have to know the power loading and phase angle (or power factor) between each pair of the phases; otherwise you could be making serious errors.
For the same power (watts), you need to run twice as many amps at 220 V as at 440 V. For the same load (fixed impedance), it'll pull half the amps at 220 V that it did at 440 V.
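A quick sketch of the constant-power case in Python (single-phase at unity power factor is assumed, and the 4400 W load is just an illustrative number):

```python
def current_for_power(watts, volts):
    """I = P / V: current drawn at a given voltage for a fixed power."""
    return watts / volts

power = 4400  # watts, illustrative
print(current_for_power(power, 440))  # 10.0 A at 440 V
print(current_for_power(power, 220))  # 20.0 A at 220 V -- twice the amps
```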
Multiply by 1000 - 4.5 × 1000 = 4500 mA
There are 0.42 amps in 420 mA. Equation: 420 / 1000 = 0.42 amps
4.7 mA × (1 A / 1000 mA): the mA cancel, leaving 4.7 divided by 1000. Voila: 4.7 mA = 0.0047 A
3500 mA
4.3 amps
0.11 A = 110 mA
0.13 A = 130 mA
mA stands for milliamp. The prefix 'milli' is equivalent to 0.001, so 1 amp would be 1000 milliamps and 20 milliamps would be 0.02 amps.
2500 mA
No.
Amps is amps, be it DC or AC.
0.5 amps equals 500 mA, which is much larger than .400 mA. If you meant to compare 0.5 amps and 400 mA, then again, 0.5 amps equals 500 mA, which is larger than 400 mA by 100 mA.