Why?
(Genuine question.)
Genuine answer is that it’s just not necessary. Current displays are sharp and smooth enough. I’d rather a display that lasts for a few decades, since the only reason to replace these is when they break down.
Your eyes can’t possibly tell the difference. We’re past the max eye resolution at this point.
What does refresh rate have to do with resolution?
I imagine it was a typo*, but this article in Nature reports that in specific circumstances the median maximum refresh rate at which people can perceive a difference may be around 500 Hz, with the maximum in their test possibly being as high as 800 Hz.
Normally, though, it seems closer to 50–90 Hz, but I'm on the road and haven't delved too deeply into it.
Edit: Type to Typo
And nothing you’ve stated refers to resolution
Not the original commenter you replied to. And I had a typo when trying to spell typo 😂 just adding to the conversation. Wasn't disputing you, just meant they may have meant refresh rate instead of resolution. Easy mistake. It's still quite disputed how well eyes can tell the difference in refresh rates.