I’m sure this is old news to all of my friends involved in film or video production, and I’m pretty sure I had read or heard about it somewhere myself, but I had my first-ever first-hand experience with it this afternoon, and it was pretty startling.
We were visiting my in-laws, who just got a new HDTV (my own family is still stuck in the 20th century in this regard), and one of the Harry Potter movies was being broadcast on network TV. After maybe 60 seconds of viewing I began saying out loud, “Why do I feel like I’m watching a soap opera?” I pulled out my iPhone and started to Google, and, sure enough, the string “new tv soap opera effect” popped up by the time I had keyed in the first three words. This article explains what’s going on quite effectively.

Long story short: for at least the past 40 years or so, most made-for-TV productions have, for reasons of cost-effectiveness, been shot on video rather than film. Video captures motion at roughly twice the effective rate of film: traditional interlaced video delivers 60 fields, that is, 60 distinct snapshots of motion, per second, versus an effective 30 frames per second for film as it appears on TV. (Strictly speaking, the film standard is 24fps, but without getting too technical, pulling it up to an effective 30fps via the so-called 3:2 pulldown has been standard practice for decades when transferring celluloid for TV/video presentation.) Modern HDTVs typically come with a default motion-interpolation setting (which can usually be tweaked or disabled altogether via a little digging through the menu options) that imposes an effective rate of 60fps (or even greater in some cases) on everything.
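(For the technically inclined: here’s a rough little Python sketch of how that decades-old 3:2 pulldown conversion works in principle. The function name and the toy frame labels are mine, purely for illustration; the point is just the alternating 3-2-3-2 pattern.)

    # Rough sketch of classic 3:2 pulldown: every 4 film frames (24fps)
    # get spread across 10 interlaced video fields (60 fields per second,
    # i.e. an effective 30fps) by holding each film frame for 3 fields,
    # then 2 fields, alternating. Function name and labels are
    # illustrative only.
    def three_two_pulldown(film_frames):
        fields = []
        for i, frame in enumerate(film_frames):
            hold = 3 if i % 2 == 0 else 2   # 3, 2, 3, 2, ...
            fields.extend([frame] * hold)
        return fields

    film = ["A", "B", "C", "D"]   # 4 film frames = 1/6 of a second at 24fps
    video = three_two_pulldown(film)
    print(video)       # ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
    print(len(video))  # 10 fields = 1/6 of a second at 60 fields per second

Notice that nothing new is invented here; existing film frames are simply repeated to fill out the extra fields, which is why film transferred this way still moves like film. The motion-interpolation feature on new HDTVs, by contrast, synthesizes brand-new in-between frames, and that is what produces the soap-opera look.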
These facts make for a fascinating case study in technological irony and in the durability, for better or for worse — in this case almost certainly worse — of subliminal associations. The irony is that, while 60fps objectively produces much more fluid and life-like motion (it’s especially great for sports viewing), the cheaper production values long associated with TV/video as compared to film are likely to trigger a viscerally negative reaction in viewers when they encounter something they know is supposed to be in the latter category but has been translated into the former. It just immediately looks and feels wrong. Really wrong.
I would hypothesize (someone’s probably done a doctoral dissertation on the subject already, and I’m sure there’s plenty more to read online, but I haven’t bothered to delve any further yet) that the above holds true for folks who are roughly my age and older, who grew up with a sharp line of demarcation between TV and film, but falls off rapidly among younger viewers. Indeed, my thirteen-year-old son was able to acknowledge the difference, but still favored the 60fps/120Hz setting anyway. And with the film industry’s transition to digital technology really picking up steam in recent years, it’s a sure bet that directors and producers will be less and less willing to be tied down to an arbitrarily lower standard of visual quality, imposed only by the expectations of “old fogey” viewers like myself — nor am I necessarily arguing that they should be, all other factors being more or less equal. But the transition is definitely going to take some getting used to for us old-timers. Case in point: all you Tolkien fans had better brace yourselves, because this will factor into Peter Jackson’s production of The Hobbit, set for release this coming December, in a big way: he’s shooting it at 48 frames per second, twice the traditional film rate.