Television images are created from a stream of still images. In the US, the stream is 60 fields per second. As two fields are needed to make a complete frame, or image, the frame rate is 30 per second. In Europe, the rate is slightly slower: 50 fields and 25 frames per second. As 30 or 25 frames per second is fairly slow, there can be noticeable flicker, and movement can look jerky as well. This is more of a problem in Europe than in the US because of Europe's slower frame rate. The frame rates were chosen years ago to match the mains supply frequency in the US and UK. Although advances in electronics mean this is no longer a concern, the frame rates have stuck. Even with the move to high definition, the rates remain the same: 60Hz in the US and 50Hz in Europe. All other countries, by the way, follow one of these two rates.
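As a rough sketch of the field/frame relationship described above, the snippet below shows "weave" deinterlacing: the even field supplies the even scan lines, the odd field the odd ones, so two half-height fields combine into one full frame (the field data and function name here are made up for illustration).

```python
# Sketch: two interlaced fields woven into one complete frame.
# 60 fields/s therefore yields 30 frames/s (50 fields/s yields 25).

def weave(even_field, odd_field):
    """Interleave two half-height fields into one full-height frame."""
    frame = []
    for even_line, odd_line in zip(even_field, odd_field):
        frame.append(even_line)  # scan lines 0, 2, 4, ...
        frame.append(odd_line)   # scan lines 1, 3, 5, ...
    return frame

even = [["E0"], ["E2"]]   # a toy field holding scan lines 0 and 2
odd  = [["O1"], ["O3"]]   # a toy field holding scan lines 1 and 3
print(weave(even, odd))   # [['E0'], ['O1'], ['E2'], ['O3']]

fields_per_second = 60
frames_per_second = fields_per_second // 2  # 30 in the US; 50 // 2 = 25 in Europe
```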
The 120Hz (or 100Hz) displays aim to overcome the flicker and jerky movement by inserting a new field between each pair of fields received by the television. The theory is that with 100 or 120 fields displayed each second, the human eye cannot discern the discrete images and sees only a smooth picture. For static and slow-moving images the theory works well, but as with all engineering solutions, it isn't a perfect one in every case...
Each inserted field has to be made up. It cannot be a copy of the previous field, because two identical fields in a row would simply return the output to 60 or 50Hz. So the new, intermediate field has to be calculated from the previous and following fields, by working out what the image would have been had an original field existed at that moment. The process is called temporal interpolation and needs a huge amount of processing power. Sometimes the calculations that generate the new field get confused, and the result is a disturbing judder in parts of the image. Broadcasters who have to use similar techniques spend a great deal of money to get the best conversions, and domestic televisions will never have the same level of hardware installed.
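A minimal sketch of the interpolation idea is below. Real televisions use motion-compensated interpolation; the plain per-pixel average shown here is a deliberate simplification (the function name and sample grey-level values are invented), and it is exactly this kind of shortcut that produces artifacts on fast motion.

```python
# Sketch of temporal interpolation: estimate a synthetic field that sits
# halfway in time between two real fields by averaging corresponding pixels.
# NOTE: this is an illustration only; real sets track motion between fields.

def interpolate_field(prev_field, next_field):
    """Return a made-up field halfway between two real fields of grey levels."""
    return [
        [(p + n) // 2 for p, n in zip(prev_row, next_row)]
        for prev_row, next_row in zip(prev_field, next_field)
    ]

# Two toy 2x3 fields of grey levels, 1/60 s apart:
field_a = [[10, 20, 30],
           [40, 50, 60]]
field_b = [[30, 40, 50],
           [60, 70, 80]]

middle = interpolate_field(field_a, field_b)
print(middle)  # [[20, 30, 40], [50, 60, 70]]
```

The averaging works for slow, steady changes, which matches the article's point: static and slow-moving scenes interpolate cleanly, while fast pans defeat the estimate.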
These errors are rarely visible, but before spending a lot of extra cash on a 120Hz television, spend some time looking at the image. Look, for example, at fast pans across football supporters as the camera follows the ball, and at fast scrolling text. These are the images that may cause problems. If you are happy with what you see and it looks smoother than the 60Hz equivalent model, then it's a good buy. If it doesn't look smoother, ask yourself whether the extra cost of a 120Hz television is worthwhile.
That said, there are many 100 and 120Hz televisions that do a good job. Just make sure you can see the improvement before you splash the cash.
#
100Hz (or 120Hz in North America) is a technique for doubling the frame rate of the incoming signal. It pre-dates LCD televisions by a number of years, having been a feature of CRT televisions from the early 1990s. The frame doubler was effective at reducing the flicker that could sometimes be a problem on CRT televisions. LCD televisions do not suffer the same flicker, but 100Hz and 120Hz LCD models have been available virtually from the time LCD screens were introduced.
The Samsung is better.
The difference is in the backlight and resolution. LCDs usually have better resolution.
A Samsung flat-screen LCD might range between $400 and $450 for a Samsung 32" Class/720p/60Hz LCD HDTV, while a Samsung 40" Class/1080p/120Hz LCD HDTV might cost $600-650. You may find all these deals at Best Buy.
The higher, the better, if it is for use in the USA.
Yes; if you compare it to CoD5, you can see a large difference.
The 5100 series has a 120Hz refresh rate.
They don't make LED screens yet, just LCD and plasma screens.
Actually, plasma screens run at 600Hz in total, but this is divided into 10 subfields, each of which runs at 60Hz. In conclusion, 10 subfields at 60Hz reduce motion blur almost to zero and look a lot cleaner than a single field at 120Hz, 240Hz, etc.
Yes.