• TachyonTele@lemm.ee
    14 hours ago

    Basically, yes. Our eyes capture the light that goes into them at 24 frames per second (please correct me if I goofed on that) and the image is upside down.

    Our brain turns those images upright, and it also fills in the blanks. The brain basically guesses what’s going on between the frames. It’s highly adept at pattern recognition and estimation.

    My favorite example of this is our nose. Look at your nose. You can look down and see it a little, and you can close one eye and see more of it. It’s right there in the bottom center of your view, but you never notice it day to day.

    That’s because it’s always there, and your brain filters it out. The pattern of our nose being there doesn’t change, so your brain just ignores it unless you want to intentionally see it. You can extrapolate that to everything else. Most things the brain expects to see, and does see through our eyes, are kind of ignored. They’re there, but they’re not as important as, say, anything that’s moving.

    Also, and this is fun to think about, we don’t even see everything. The light spectrum is far wider than the range our eyes can detect. There are animals, sea life and insects that can see much, much more than we can.

    But to answer more directly, you are right, the brain does crazy heavy lifting for all of our senses, not just sight. Our reality is confined to what our bodies can decipher from the world through our five senses.

    • Reyali@lemm.ee
      6 hours ago

      If you want a fun experiment of all the things we see but don’t actually process, I recommend the game series I’m On Observation Duty. You flip through a series of security cameras and identify when something changed. It’s incredible when you realize the entire floor of a room changed or a giant thing went missing, and you just tuned it out because your brain never felt a need to take in that detail.

      It’s sorta horror genre and I hate pretty much every other horror thing, but I love those games because they make me think about how I think.

    • HatchetHaro@lemmy.blahaj.zone
      12 hours ago

      the 24 fps thing is one hella myth. our cones and rods send a continuous stream of information, which is blended with past-received information in our perception to remove stuff like the movement from darting your eyes around.

    • calabast@lemm.ee
      14 hours ago

      We definitely are seeing things faster than 24 Hz, or we wouldn’t be able to tell a difference in refresh rates above that.

      Edit: I don’t think we have a digital, on-off refresh rate of our vision, so fps doesn’t exactly apply. Our brain does turn the ongoing stream of sensory data from our eyes into our vision “video”, but compared to digital screen refresh rates, we can definitely tell a difference between 24 and say 60 fps.
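
      Just to put numbers on it, here’s a rough back-of-the-envelope sketch (plain Python, only the frame-time arithmetic, nothing about the eye itself):

      ```python
      # Frame-time arithmetic: how long each frame sits on screen.
      for fps in (24, 60, 120, 240):
          print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms per frame")
      # 24 fps leaves each frame up for ~41.7 ms vs ~16.7 ms at 60 fps,
      # which is a big enough gap to notice in motion.
      ```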

      • Steve@communick.news
        13 hours ago

        People looking at a strobing light start to see it as just “on” (not blinking anymore) at almost exactly 60 Hz.
        In double-blind tests, pro gamers can’t reliably tell 90 fps from 120.
        There is, however, an unconscious improvement to reaction time all the way up to 240 fps. Maybe faster.

        • Contramuffin@lemmy.world
          2 hours ago

          I think having higher frame rates isn’t necessarily about whether our eyes can perceive the frame or not. As another commenter pointed out, there are latency benefits, but the frame rate also affects how things smear and ghost as you move them around quickly. I don’t just mean in gaming. Actually, it’s more obvious when you’re just reading an article or writing something in Word. If you scroll quickly, the words blur and jitter more at low frame rates, and this is absolutely something you can notice. You might not be able to tell the frametime, but you can notice that a word is here one moment and the next thing you know, it has teleported 1 cm away.
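
          To put rough numbers on the scrolling example, here’s a quick sketch (the scroll speed and pixel density are just assumed values, not measurements):

          ```python
          # Per-frame jump of scrolling text, under assumed values.
          scroll_px_per_s = 2000  # assumption: a fast flick-scroll
          px_per_cm = 80          # assumption: rough pixel density of the panel

          for fps in (24, 60, 144):
              jump_px = scroll_px_per_s / fps
              print(f"{fps:>3} fps: text jumps ~{jump_px:.0f} px (~{jump_px / px_per_cm:.1f} cm) per frame")
          # At 24 fps a word really does move about a centimetre at a time;
          # at 144 fps the same scroll advances in ~14 px steps and reads as smooth.
          ```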

        • RecluseRamble@lemmy.dbzer0.com
          9 hours ago

          It seems to be more complicated than that:

          “However, when the modulated light source contains a spatial high frequency edge, all viewers saw flicker artifacts over 200 Hz and several viewers reported visibility of flicker artifacts at over 800 Hz. For the median viewer, flicker artifacts disappear only over 500 Hz, many times the commonly reported flicker fusion rate.”

        • seaQueue@lemmy.world
          12 hours ago

          The real benefit of super high refresh rates is the decrease in input latency. At lower rates the lag between an input and the next frame is very apparent; above ~144 Hz it’s much less noticeable.

          The other side effect of running at high fps is that when heavy processing causes frame-time spikes, they’re much less noticeable because the minimum fps is still very high. I usually tell people not to pay attention to the maximum fps, but rather to look at the average and the minimum.
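
          The latency point is easy to put numbers on; here’s a simplified sketch that only counts the wait for the next refresh, not the rest of the input/render pipeline:

          ```python
          # Wait from an input event to the next screen refresh (average and worst case).
          for hz in (60, 144, 240):
              frame_ms = 1000 / hz
              print(f"{hz:>3} Hz: avg ~{frame_ms / 2:.1f} ms, worst ~{frame_ms:.1f} ms before the next frame")
          # 60 Hz adds up to ~16.7 ms on top of everything else; 240 Hz caps that at ~4.2 ms.
          ```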

      • TachyonTele@lemm.ee
        14 hours ago

        Yeah it’s not like frames from a projector. It’s a stream. But the brain skips parts that haven’t changed.

      • Ekky@sopuli.xyz
        13 hours ago

        I think I read that fighter pilots need to be able to identify a plane shown for a single frame at 300 fps, and that the theoretical limit of the eye is 1000+ fps.

        Though, whether the brain can manage to process the data at 1000+ fps is questionable.
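
        For what it’s worth, those figures convert to very short exposure times (just the arithmetic, not a claim about how any pilot test was actually run):

        ```python
        # How long a single frame is visible at a given frame rate.
        for fps in (300, 1000):
            print(f"one frame at {fps} fps = {1000 / fps:.1f} ms on screen")
        # 300 fps -> ~3.3 ms, 1000 fps -> 1.0 ms: spotting a bright single-frame
        # flash is a different task from perceiving continuous motion at that rate.
        ```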

        • nimpnin@sopuli.xyz
          5 hours ago

          Both of these claims are kinda misguided. The brain is able to detect very short flashes of light (say, a thousandth of a second) and other major changes in light perception. Especially an increase in light will be registered near instantly. However, since it doesn’t have a set frame rate, more minor frame-to-frame changes (say, at 100 fps) are not going to be registered. And the brain does try to actively correct discontinuities; that’s why even 12 fps animation feels like movement, although a bit choppy.

        • Fester@lemm.ee
          13 hours ago

          I’m using part of this comment to inform my monitor purchases for the rest of my life.

    • leds@feddit.dk
      13 hours ago

      Also, your eyes dart around and you only really see a small patch in detail at any one moment. You blink. Your brain makes up a nice stable image of the world, mostly consisting of things that your brain thinks should be there.