
Thread: FPS vs Quality

  1. #1

FPS vs Quality

I've always assumed that, say, 50fps @ 100Mbps is visually inferior to 25fps @ 100Mbps because that seems logical to me: with 25 fps you have 4 Mbit per frame, whereas with 50fps you have 2 Mbit per frame.

    I wonder though if it's really that straightforward or if there's more to it?
And would it really matter that much to the viewing experience or not?
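To put numbers on that intuition, here's a quick back-of-the-envelope sketch. One caveat worth noting: Mbps means megabits per second, so the per-frame budget is in megabits, and real interframe codecs don't split data evenly across frames like this.

```python
# Average data budget per frame at a constant bitrate. Note: Mbps is
# megabits per second, so 100 Mbps / 25 fps = 4 megabits (0.5 MB) per
# frame. Interframe codecs spread data unevenly across frames, so this
# is only a rough average, not what any single frame actually gets.

def megabits_per_frame(bitrate_mbps: float, fps: float) -> float:
    """Average compressed data available per frame, in megabits."""
    return bitrate_mbps / fps

for fps in (25, 50):
    mbit = megabits_per_frame(100, fps)
    print(f"{fps} fps @ 100 Mbps -> {mbit:.1f} Mbit/frame ({mbit / 8:.2f} MB/frame)")
```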

I usually shoot at the highest fps possible so I can slow it down a little to smooth out movement. To me the final material looks pleasing enough, which leads to my assumption that any lack of quality is probably due more to sensor architecture and lens quality than to less data per frame at high frame rates.

    Would love to hear your thoughts, knowledge and experience on this. Maybe I should start filming at lower frame rates after all...
    The cats are watching us...

  2. #2


> with 25 fps you have 4 Mbit per frame whereas with 50fps you have 2 Mbit per frame
Law of diminishing returns. At some point you can't really tell a difference.

I think it matters more for editing: do you have enough quality/data to make good edits?

    > I usually shoot at the highest fps possible
Sometimes it depends on the camera. If your camera can do the exact same job at 50fps as it can at 25fps, then shooting at the higher speed gives you more flexibility in post. But some cameras can't deliver the same quality - they'll crop the sensor or do some other funky thing. Something to keep in mind when deciding.


Pro shooters that I talk to these days are shooting 60fps all the time. Footage looks the same as 30fps but you can do half speed in post (or any speed in between). If my camera could do 4K 60fps I'd be doing that too, and that's one of the main reasons I'm planning to upgrade.

    One thing I'll say is that I positively DO NOT like watching 60fps footage. There's a crisp "fakeness" to it that I spot immediately and do not enjoy. For consumption I like footage at 24-30 fps. IMO as a viewer the only time it makes sense to distribute at 50-60fps is when delivering fast action like sports.

  3. #3


    Quote Originally Posted by jochicago View Post
    One thing I'll say is that I positively DO NOT like watching 60fps footage. There's a crisp "fakeness" to it that I spot immediately and do not enjoy. For consumption I like footage at 24-30 fps.
Surely part of the "crispness" is that when shooting at the higher frame rate, the shutter speed is typically reduced - thus less motion blur.
If you take a 60fps source and then render it at 24/30fps, does this somehow add motion blur (which would not happen if it was simply exporting every other frame)?
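It depends on how the editor resamples. As a hypothetical sketch (not any particular NLE's actual algorithm): picking the nearest source frame leaves per-frame blur unchanged, while frame blending averages neighbouring frames and adds a crude blur-like smear. The `resample` function and the brightness-value stand-ins below are made up for illustration; real editors also offer fancier modes such as optical flow.

```python
import numpy as np

def resample(frames: np.ndarray, src_fps: float, dst_fps: float, blend: bool) -> np.ndarray:
    """Resample a frame sequence to a new rate, either by nearest-frame
    selection (no added blur) or by linear frame blending (smears motion)."""
    n_out = int(len(frames) * dst_fps / src_fps)
    out = []
    for i in range(n_out):
        t = i * src_fps / dst_fps              # position on the source timeline
        lo = int(np.floor(t))
        hi = min(lo + 1, len(frames) - 1)
        if blend:
            w = t - lo                         # linear weight between neighbours
            out.append((1 - w) * frames[lo] + w * frames[hi])
        else:
            out.append(frames[min(int(round(t)), len(frames) - 1)])
    return np.array(out)

# One second of dummy "frames" (just brightness values 0..59):
frames = np.arange(60.0)
print(resample(frames, 60, 24, blend=False)[:4])   # only existing frame values
print(resample(frames, 60, 24, blend=True)[:4])    # in-between (blended) values appear
```

Note that 60 to 30 divides cleanly (every other frame), so the question only really bites for 60 to 24, where the target times fall between source frames.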

(Obviously for Grapes and me, being in the UK, the relevant figures are 50 and 25.)
    Tim

  4. #4


    Quote Originally Posted by Grapes View Post
I've always assumed that, say, 50fps @ 100Mbps is visually inferior to 25fps @ 100Mbps because that seems logical to me: with 25 fps you have 4 Mbit per frame, whereas with 50fps you have 2 Mbit per frame.
    I wonder though if it's really that straightforward or if there's more to it?
Usually cameras offer more Mbps at higher frame rates to compensate for the "data loss", but obviously not all of them do.

    Quote Originally Posted by Grapes View Post
And would it really matter that much to the viewing experience or not?
Yes! But it's more a combination of factors than any single one.
For example, with a camera that shoots only 8-bit 4:2:0 (like the Sony alphas) at a relatively low bitrate (100 Mbps), I can totally spot the difference in post once I start to grade it. It falls apart so quickly. In comparison, my camera also shoots 8-bit, but 4:2:2 @ 800 Mbps, and I can push the image a whole lot more without it falling apart. There is just a lot more information stored, and it's visible on screen.

Also, the 120 fps slow motion that my Canon 1DX Mk II does @ 360 Mbps absolutely destroys Sony and other cameras with their meager 150 Mbps. It is a VERY noticeable difference.
If only it didn't come at a huge storage cost - I get 14 min per 128 GB CFast 2.0 card.

On average I use up 150-200 GB per small project because of those bitrates.
Is it 5x better quality? No! Is it a noticeable improvement? Yes, for someone with a trained eye!
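As a rough cross-check of how fast those bitrates eat storage (a simplification that ignores audio, container overhead, and decimal-vs-binary gigabytes):

```python
# Converting a video bitrate to storage consumed per minute:
# Mbps / 8 = MB per second, x 60 = MB per minute, / 1000 = GB per minute.
# Audio and container overhead are ignored, so real files run slightly larger.

def gb_per_minute(bitrate_mbps: float) -> float:
    """Approximate storage consumed per minute of recording, in GB."""
    return bitrate_mbps / 8 * 60 / 1000

for mbps in (100, 150, 360, 800):
    print(f"{mbps} Mbps -> ~{gb_per_minute(mbps):.2f} GB/min")
```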



    Quote Originally Posted by jochicago View Post
    Pro shooters that I talk to these days are shooting 60fps all the time. Footage looks the same as 30fps but you can do half speed in post (or any number in between). If my camera could do 4k 60fps I'd be doing that too, and this is among my main reasons I'm planning to upgrade.
Yep. Other than interview scenarios, my camera has 3 presets: 4K 60p @ 800 Mbps, 4K 24p @ 500 Mbps and 1080 120p @ 360 Mbps. If I decide to shoot something only in HD (like a zero-budget production), then it's 100% always (other than the talking bits) 60p for A-roll and 120p for B-roll.


    Quote Originally Posted by jochicago View Post
    One thing I'll say is that I positively DO NOT like watching 60fps footage. There's a crisp "fakeness" to it that I spot immediately and do not enjoy. For consumption I like footage at 24-30 fps. IMO as a viewer the only time it makes sense to distribute at 50-60fps is when delivering fast action like sports.
Have to agree. Sports and video gaming (screen recording) are the only cases where someone should distribute 50/60p. Even regular action scenes look 100 times better with natural motion blur at 24p than at 50/60. It becomes very video-like and not cinematic.

  5. #5


    Quote Originally Posted by TimStannard View Post
    If you take a 60fps source and then render at 24/30fps does this somehow add motion blur
Do you guys watch Youtube at 25/50? When I watch Youtube the options are usually stuck at whatever the video was uploaded at, which is usually 24, 30 or 60 fps.

I ask because I think unless you are doing a TV show for actual TV sets, the fps is now an artistic choice. My camera lets me pick any fps I want from 24, 25, 30, 50, 60 and I think even custom values.

I think these days, with modern cameras, the settings are really optional. I normally aim for around 1/60 shutter at 30fps, but with current cameras you can probably use the same setting at 60fps. So unless you are deliberately trying to respect some rule, a new 4K 60fps camera isn't necessarily forcing you far from what you would have done at 30fps.
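For what it's worth, the rule being alluded to here is presumably the 180-degree shutter convention: expose each frame for half the frame interval, i.e. shutter time of roughly 1/(2 x fps), which is what gives the 1/60 figure at 30fps. A tiny sketch:

```python
# 180-degree shutter rule: expose each frame for half the frame interval.
# This yields the amount of per-frame motion blur most viewers read as
# "natural" or cinematic.

def shutter_180(fps: float) -> float:
    """Shutter time in seconds for a 180-degree shutter angle."""
    return 1.0 / (2.0 * fps)

for fps in (24, 30, 60):
    print(f"{fps} fps -> 1/{round(1 / shutter_180(fps))} s shutter")
```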

Honestly, I don't even know what my Panasonic camera is doing sometimes. I leave it on Auto and it gets shots that mathematically don't make sense to me. For instance, I lock the ISO, I'm indoors, the light is not good. I'm metering manually and I realize I'm barely getting enough light for 24fps. Then I switch to auto, let it do its thing at 30fps, and the camera puts out something that looks like it went down to 1/20 to get the shot. The video looks better than what I was going to do manually. So now I just think of it as dark magic: when I can't mathematically get the shot, I put it on auto and let it wow me.

Going back to the 60fps crispness: outside of camera settings, I think there's the factor of being fed twice as many frames, so there's twice as much data coming to the eyes. Even when shot at the same settings there are more reference points; the motion is more fluid, and it feels a bit unnatural to me, almost a touch robotic.

  6. #6


    Quote Originally Posted by jochicago View Post
Do you guys watch Youtube at 25/50? When I watch Youtube the options are usually stuck at whatever the video was uploaded at, which is usually 24, 30 or 60 fps.

I ask because I think unless you are doing a TV show for actual TV sets, the fps is now an artistic choice. My camera lets me pick any fps I want from 24, 25, 30, 50, 60 and I think even custom values.
When you watch a European YouTube creator, it will most likely be in 25fps, and you can see that in the quality settings.

It's not necessarily an artistic choice; it's more like a necessity.
If you shoot outdoors without artificial light, then it doesn't matter - use 24 or 25 or whatever you want artistically.

But if you shoot indoors in Europe, then we are pretty much forced to shoot 25 fps because of the lights!
Everything runs on 50 Hz here (the PAL region) while North America runs on 60 Hz (the NTSC region). That's not just TVs; it's pretty much everything that emits light from mains power.

If I shoot 24 / 30 / 60 / 120 indoors in Europe and there are ceiling lights or a table lamp or a TV screen or whatever, then I will get image flickering. Same the other way around.

I would much rather shoot 120 fps slow motion than 100, but I can't in Europe if there are regular lights around.
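The flicker logic above can be sketched in a few lines. This is a simplified model: lights are assumed to pulse at exactly twice the mains frequency, which ignores flicker-free LED drivers and high-frequency ballasts.

```python
# Mains-powered lights pulse at twice the mains frequency: 100 Hz on
# 50 Hz (European) mains, 120 Hz on 60 Hz (North American) mains.
# An exposure that spans a whole number of pulses collects the same
# amount of light every frame, so no flicker banding appears.

def is_flicker_safe(shutter_denominator: int, mains_hz: int) -> bool:
    """True if a shutter time of 1/shutter_denominator covers an
    integer number (>= 1) of light pulses."""
    flicker_hz = 2 * mains_hz
    pulses = flicker_hz / shutter_denominator
    return pulses >= 1 and abs(pulses - round(pulses)) < 1e-9

# Shutter speeds that are safe under 50 Hz (European) lighting:
print([f"1/{d}" for d in (25, 50, 60, 100, 120) if is_flicker_safe(d, 50)])
# -> ['1/25', '1/50', '1/100']
```

This is also why 1/120 is fine under North American lights but flickers under European ones, and vice versa for 1/100.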

  7. #7


    Awesome feedback guys, really appreciate it.

Indeed, indoors (or under any artificial light source) we're forced here to sacrifice some frames while filming.

I think I'll just stick to my habits. I personally don't really notice much difference between low and high frame rates with my current gear, and I like the temporal flexibility.

It's also interesting to see how subjective high-framerate playback is. I personally love watching high fps at full speed (on a large enough screen). I'd rather watch something at, say, 300fps Full HD than 24fps 4K (or probably 8K for that matter, though I haven't seen a comparison setup for that yet). I recall a BBC research project at an exhibition a few years back where they had set up exactly that: a low-res high-fps feed and a high-res low-fps feed of exactly the same subject and angle, side by side. To me the high-fps screen looked just as good as the high-resolution one. Of course there are limits to this, and I suppose it's subjective again, but it was cool to see how increasing the fps instead of the pixels also contributes to higher perceived image quality.
    The cats are watching us...
