Television images are created from a stream of still images. In the US, the stream is 60 fields each second. As two fields are needed to make a complete frame or image, the frame rate is 30 per second. In Europe, the rate is slightly slower: 50 fields and 25 frames per second. Because 30 or 25 frames per second is fairly slow, there can be noticeable flicker, and movement can look jerky as well. This is more of a problem in Europe than in the US because of Europe's slower rate. The frame rates were chosen years ago to match the mains supply frequency in the US and UK. Although advances in electronics mean this is no longer a concern, the frame rates have stuck. Even with the move to high definition, the rates remain the same: 60Hz in the US and 50Hz in Europe. All other countries, by the way, follow one of these two frame rates.
The 120Hz or 100Hz displays aim to overcome the flickering and jerky movement by inserting a new field between each of the fields received by the television. The theory is that with 100 or 120 fields being displayed each second, the human eye cannot discern the discrete images and sees only a smooth picture. For static and slow-moving images, the theory works well, but as with all engineering solutions, this one isn't perfect in all cases...
Each field that is inserted needs to be made up. It cannot be a copy of the previous field, because that would result in two identical fields, returning the output to 60 or 50Hz. So the new, intermediate field has to be calculated by taking the previous and following fields and working out what the image would have been, had an original field been captured at that time. The process is called temporal interpolation, and it needs a huge amount of processing power. Sometimes the calculations that generate the new field get confused, and the result is a disturbing judder in parts of the image. Broadcasters who have to use similar techniques spend a great deal of money to get the best conversions, and domestic televisions will never have the same level of hardware installed.
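The interpolation idea can be sketched in a few lines of Python. This is a minimal illustration, not how a real TV does it: the function names are invented for this example, and a naive per-pixel average stands in for the motion-estimated interpolation a real signal processor performs.

```python
def interpolate_frame(prev_frame, next_frame):
    # Naive temporal interpolation: each synthesised pixel is the average
    # of the same pixel in the previous and following frames. Real TV
    # interpolators add motion estimation so moving objects land in the
    # right intermediate position instead of ghosting.
    return [
        [(p + n) // 2 for p, n in zip(prev_row, next_row)]
        for prev_row, next_row in zip(prev_frame, next_frame)
    ]

def double_rate(frames):
    # Turn a 60Hz sequence into a 120Hz one by inserting one synthesised
    # frame between every pair of original frames.
    doubled = []
    for a, b in zip(frames, frames[1:]):
        doubled.append(a)
        doubled.append(interpolate_frame(a, b))
    doubled.append(frames[-1])
    return doubled

# Two tiny 2x2 greyscale frames: a dark one followed by a bright one.
dark = [[0, 0], [0, 0]]
bright = [[200, 200], [200, 200]]
result = double_rate([dark, bright])
print(len(result))         # 3 frames: dark, synthesised middle, bright
print(result[1][0][0])     # 100: halfway between 0 and 200
```

When a moving object is involved, this simple averaging produces exactly the kind of ghosting artefact described above, which is why real interpolators are so much more complex.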
These errors are rarely visible, but before spending a lot of extra cash on a 120Hz television, spend some time looking at the image. Look for fast pans across football supporters as the camera follows the ball, for example. Also look at fast scrolling text. These are the images that may cause problems. If you are happy with the image you see and it looks smoother than the 60Hz equivalent model, then it's a good buy. If it doesn't look smoother, you should ask yourself if the extra cost of a 120Hz television is worthwhile.
That said, there are many 100 and 120Hz televisions that do a good job. Just make sure you can see the improvement before you splash the cash.
120Hz is required for 3D viewing; 60Hz is enough for normal HD viewing.
240Hz produces smoother horizontal motion on the screen.
60Hz works fine with Kinect.
Actually, plasma screens run at a total of 600Hz, divided into 10 subfields of 60Hz each. In practice, 10 subfields at 60Hz reduce motion blur almost to zero and look a lot cleaner than a single field at 120Hz, 240Hz, etc.
There is no such thing as a '120Hz' HDMI cable. Marketing hype promotes cables for many features that make no difference to picture quality or compatibility. The only features that require special cables are 3D and Ethernet over HDMI, both of which are covered by HDMI 1.4 spec cables. The signal sent to the television is going to be 24 or 60 frames per second (24Hz or 60Hz). It is the TV itself that creates the 120Hz or 240Hz refresh rate to produce smoother motion.
A 120Hz television takes a standard incoming 60Hz signal and adds images between each of the incoming images. The idea is that the faster refresh rate makes the viewing experience smoother and more lifelike. To carry out the conversion, a signal processor called an interpolator examines the incoming frames and calculates what the missing image would look like if it had been captured by the original camera. It's a complex process, and one that doesn't always work correctly; certain images can catch out most frame interpolators. The only way to answer this question is to see a 120Hz display and compare it against the same screen running at 60Hz. Look for fast motion and fine detail when making the comparison. In some cases there will be no difference. In others, the image may appear smoother, but it may also show some jerky movement. The final decision on image improvement must ultimately come down to the person watching the screen. Note: North America uses 60Hz and 120Hz; the European equivalents are 50Hz and 100Hz.
The difference between 60Hz and 120Hz is that a 60Hz wave swings through its positive and negative half-cycles 60 times in one second, giving a period of 16.667 ms, while a 120Hz wave swings through its half-cycles 120 times per second, giving a period of 8.333 ms.
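The period figures above follow directly from the reciprocal relationship between frequency and period. A quick check (the function name here is just for illustration):

```python
def period_ms(frequency_hz):
    # The period of a wave is the reciprocal of its frequency;
    # multiply by 1000 to express it in milliseconds.
    return 1000.0 / frequency_hz

print(round(period_ms(60), 3))   # 16.667 ms per cycle at 60Hz
print(round(period_ms(120), 3))  # 8.333 ms per cycle at 120Hz
```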
60Hz works fine, I assume. I've got the same TV that I used to play GameCube on, and it always asks about 60Hz or 50Hz, sometimes even 40Hz! So I assume you can!
The 5100 series has a 120Hz refresh rate.
Harmonics are integer multiples of the fundamental frequency. If the power frequency is 60Hz, harmonics occur at 120Hz, 180Hz, 240Hz, and so on.
HDTVs have become so cheap that now might be the time to buy one. If you are on a budget a normal 60Hz screen will be fine. If you want to spend more you can shoot for 120Hz screens or even a 3D TV.
There is a difference. For one thing, Australia runs at 50Hz and America at 60Hz, and in domestic supply, Australia uses 240V while America uses 110V.
Yes, if you compare it to CoD5 you can see a large difference.