This answer was found concerning computer power requirements, but the equation used at the end will produce the same results for whichever type of electronics you're trying to gauge the watts on.
* Meets ENERGY STAR requirements
* Line voltage: 100-125V AC or 200-240V AC
* Frequency: 50Hz to 60Hz, single phase
* Maximum current: 6.5A or 7.5A for the low-voltage range (100-125V AC); 3.5A for the high-voltage range (200-240V AC)
The maximum current draw occurs at startup or when the fans and processor are at 100%. I can't remember the ENERGY STAR regs, but at idle it is somewhere down around 20W. Add the consumption of the monitor to this; an LCD screen will be way less than a large CRT.
Sustained use would be somewhere in the middle; the guess of 350W is pretty good for active use.
(Note: watts = volts x amps, so 6.5A at 120V = 780 watts -- that's at peak. Comparatively, that's like eight 100W light bulbs, half a space heater, or a microwave.)
Generally speaking, a couple of amps will power most televisions. The actual current draw varies with the size, type, and age of the set; an older tube-type set (one with vacuum tubes beyond just the picture tube) will certainly draw more. The manufacturer's information label should give the wattage rating of the appliance, and simply dividing that wattage by the line (or mains) voltage tells you roughly how much current the unit is rated to draw (see the sketch below).
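To make that division concrete, here is a minimal Python sketch; the 200W nameplate figure and 120V line voltage are assumed example values, not specs from any particular set.

```python
def rated_current(watts: float, volts: float) -> float:
    """Estimate rated current draw from nameplate wattage and line voltage (I = P / V)."""
    return watts / volts

# Assumed example: a 200 W nameplate rating on a 120 V mains supply.
print(f"{rated_current(200, 120):.2f} A")  # ~1.67 A
```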
There is no one answer to this. It depends on the actual design of the TV. The TV should indicate somewhere (probably on the back) how many amps it will pull.
If that information is not there, you could estimate the effective resistance of the TV by hooking a multimeter to the two prongs of the unplugged power cable and seeing how many ohms it reads.
To then calculate the amps, assuming the TV is meant to be plugged into standard 120V service, take the voltage (120V) and divide it by the resistance you measured. Keep in mind this is only a rough estimate: the cold resistance you measure at the plug is not the same as the set's effective resistance once it is powered up and running (see the sketch below).
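A minimal Python sketch of that Ohm's-law estimate, assuming 120V service; the 90-ohm reading is a made-up example value.

```python
def ohms_law_current(volts: float, ohms: float) -> float:
    """Estimate current from supply voltage and measured resistance (I = V / R)."""
    return volts / ohms

# Assumed example: a 90-ohm reading across the unplugged power cable, 120 V service.
print(f"{ohms_law_current(120, 90):.2f} A")  # ~1.33 A
```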
A TV, or other electrical device, doesn't use "watts per hour"; it uses "watts", which is already a unit of power (energy / time). Energy consumed over time is what your utility bills you for, in watt-hours or kilowatt-hours. The amount of power used varies widely, depending on size and technology; for a specific TV set, look on the back for the electrical specifications.
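To make the power-versus-energy distinction concrete, here is a minimal Python sketch; the 150W draw and 5 hours of use are assumed example figures.

```python
def energy_kwh(watts: float, hours: float) -> float:
    """Energy consumed in kilowatt-hours: power (W) x time (h) / 1000."""
    return watts * hours / 1000

# Assumed example: a 150 W set running 5 hours a day.
print(f"{energy_kwh(150, 5):.2f} kWh per day")  # 0.75 kWh
```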
There is a place on your TV that tells you the amperage; just look on the back or in your owner's manual.
It depends on the type and age of the TV set, but a general answer is between 100 and 300 watts.
draw 0.104 amps
40 amps
100 amps
8.33 amps
it draws 210 amps
I assume you are talking about the newest Samsung 65", the one with the curved screen. That draws 276 watts, which, using watts = volts x amps and assuming your voltage is 115V, gives you 2.4 amps.
Amps like... amplifiers? It depends on how many speakers you have. Or amps like... current draw? Again, it depends on your power needs, your power amps, etc.
1100 watts, or about ten amps, plus another 3 to 4 amps for the turntable, light, and fan.
The amps for an oven are governed by its total wattage and the voltage of the supply feeding it.
It is drawing 0.06 amps.
It depends on how many amps each TV draws. The continuous load should be no more than 80% of the breaker rating, or 12 amps on a 15-amp breaker. If an average TV draws 2.5 amps, that works out to 4 TVs. Look for the rating plate on each TV and just add the currents up (see the sketch below).
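A minimal Python sketch of that headroom calculation, assuming the 15-amp breaker and 2.5-amp-per-TV figures above:

```python
import math

def max_tvs(breaker_amps: float, tv_amps: float, continuous_factor: float = 0.80) -> int:
    """Number of TVs that fit under the continuous-load limit (80% of breaker rating)."""
    usable_amps = breaker_amps * continuous_factor  # e.g. 15 A * 0.80 = 12 A
    return math.floor(usable_amps / tv_amps)

# Assumed example: 15 A breaker, 2.5 A per TV.
print(max_tvs(15, 2.5))  # 4
```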