Artemus
Veteran Member
This is related to my When "better" is actually worse thread from a few years ago. My soon-to-be son-in-law started complaining about how our newish OLED television was set up while he was here with my daughter over the holidays. We had tried the presets when we got it and decided that "Standard" looked the best and just left it there. He complained that contrast was too high and that moderate motion smoothing was enabled, making movies look like "a soap opera." I let him adjust it to "proper" settings and it was terrible in comparison.
This is a perfect example of the "worse is actually better so therefore I am superior" attitude that annoys me so much. I have never liked watching movies in a theater. I have found the jerky, blurred motion that you get with the industry-standard 24 frames per second to be extremely unpleasant on large screens...less so on television, but getting worse as screen size increased. Motion smoothing gets rid of the annoying strobing effect and makes things much more viewable. But the "purists" have decreed that movies must be filmed and viewed at 24 fps, despite how unpleasant it is, because "that is what film looks like." The complaint is that a higher frame rate looks "too real." WTF???? (My favorite comment I found on the web is that higher frame rates make things look "unnaturally realistic." Again, WTF???) And yes, I know you can get interpolation artifacts in individual stills when there is a great deal of motion, but the likelihood that you will ever notice them in real time is trivial compared to the improvement you get from smoother motion.
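For anyone curious what "motion smoothing" actually does under the hood: the TV synthesizes extra in-between frames so motion steps are smaller. Real sets estimate motion vectors per block, but a minimal sketch of the idea, using simple pixel blending on made-up frame data, looks like this:

```python
# Naive frame interpolation: double the effective frame rate by
# inserting a blended frame between each pair of originals.
# Real TVs use motion-vector estimation, not plain blending;
# this only illustrates the concept. Frames are lists of pixel values.

def blend(a, b):
    """Average two frames pixel-by-pixel to make an in-between frame."""
    return [(pa + pb) / 2 for pa, pb in zip(a, b)]

def smooth(frames):
    """Insert one interpolated frame between each adjacent pair
    (roughly 24 fps -> 48 fps)."""
    out = []
    for cur, nxt in zip(frames, frames[1:]):
        out.append(cur)
        out.append(blend(cur, nxt))
    out.append(frames[-1])
    return out

frames = [[0, 0], [10, 20], [20, 40]]
print(smooth(frames))
# 3 original frames become 5: originals plus blended in-betweens
```

The interpolation artifacts mentioned above come from exactly this step: when objects move a lot between frames, the synthesized in-between frame can smear or "ghost" them.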
The irony is that the 24 frames-per-second standard had nothing to do with artistic merit and everything to do with the fact that the technology and film costs of 100 years ago required using the absolute minimum frame rate that most audience members found tolerable. Movies could look much better now, but the idiotic "it has always looked like shit so therefore shit must be better" attitude seems to be very hard to break. I'm glad that television manufacturers are at least willing to take steps to minimize the effects of a problem that the film producers refuse to fix in the first place.
The same soon-to-be son-in-law buys vinyl records.