Realized my screen is 144, not 60 hz
Yes, yes, I know, the classic blunder
I just have to say though, the difference is insane, I mean what the actual fuaæosiuhrfjk!?
I have been on 60 hz screens my entire life, only upgrading to 1080p in 2015 or so, and I bought my current screen from a friend a year or two ago -- I guess that's why I never realized it was 144 hz, not 60 hz!? But while playing WoW with another friend yesterday, we started talking about specs and refresh rates came up, and she even offered to let me borrow her second screen because she felt so sorry about my only having 60 hz. So for fun and just to be sure, I went to check my settings and yup, it said 144 hz in there! "Surely not", I thought... so I clicked it and absolutely surely fucking yes, it instantly looked a million times better??? I laughed so hard because it is both amazing and I am an idiot, because I have seen this exact meme dozens of times and I cannot believe that I am a victim too
The colors are so much richer, and the movement of everything is so much smoother. I mean seriously, my mind is still completely blown a day later. This is a great Christmas present for myself, and it was free!
I don't think any other computer upgrade has ever had this big an impact. Blew my mind!
You can use this site to see the actual difference side by side: https://www.testufo.com/
Might help! I feel like you should be able to see the difference pretty easily but I'm not an eyes/neuro expert, everyone's different!
Do you actually have a display that supports 90Hz? If not you won't see a difference.
Pixel 6 does have 90Hz. But the difference is less noticeable in videos etc. on a smaller screen.
You really should test higher refresh rates on a modern PC monitor, I bet it's more noticeable for you that way.
LOTR is 24; it's The Hobbit that was shot at 48. Although I think many parts of LOTR actually go lower than 24.
From a scientific perspective, these should be exactly identical, aside from any coloration different listening equipment might add. The audiophile world is chock-full of snake oil, chief among them the claim that analog audio can contain more audible detail than lossless digital audio. Most of the difference people hear between a digital track and a vinyl track boils down to how the song was mastered for the different formats.
With displays though, many people can readily tell the difference between high and low refresh rates. And once you know what to look for, you can even start feeling the difference between 144, 240, and even 500hz, though you are well past the point of diminishing returns at some of the higher numbers that esports-oriented displays are pushing.
The Nyquist-Shannon sampling theorem gives us an exact criterion for when we can perfectly reconstruct a continuous signal from digital samples: the sampling frequency must be at least twice the highest frequency present in the signal.
So a CD-quality sampling rate of 44.1 kHz can perfectly represent all of the audio frequencies in the music below 22,050 Hz. Not only is this well above the range of human hearing, but it's debatable to what extent frequencies that high are even intentional parts of the song, or would be pleasant to listen to even if we could hear them (mosquito frequencies are used to drive kids out of casinos not because they're nice to hear).
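If you want to see the theorem in action, here's a minimal numpy sketch (the tone frequencies and one-second duration are just illustrative choices, not anything from this thread): a tone below Nyquist comes back out at its true frequency, while one above Nyquist folds down to an alias.

```python
# Minimal sketch: tones below Nyquist survive sampling, tones above alias.
import numpy as np

fs = 44_100          # CD sampling rate, Hz
nyquist = fs / 2     # 22,050 Hz

def dominant_frequency(tone_hz, fs, duration=1.0):
    """Sample a pure tone and return the strongest frequency in its spectrum."""
    t = np.arange(int(fs * duration)) / fs
    samples = np.sin(2 * np.pi * tone_hz * t)
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1 / fs)
    return freqs[np.argmax(spectrum)]

print(dominant_frequency(15_000, fs))  # ~15000 Hz: below Nyquist, preserved
print(dominant_frequency(30_000, fs))  # ~14100 Hz: above Nyquist, folds to fs - 30000
```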
It wouldn't be about choppiness, but about what the maximum audible frequency in the song would be.
There is a small catch when it comes to recordings: before sampling an analog signal, frequencies above Nyquist need to be filtered out so they don't cause aliasing artifacts. That's difficult at lower sampling rates, so it can make sense to sample at a higher rate to have more headroom for the filter.
Hence 44.1 kHz being chosen for CD: a whole 2.05 kHz buffer to account for inaccuracies in the analog low-pass filters of the time.
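To put a rough number on how much that headroom buys you, here's a small scipy sketch (the 1 dB passband ripple and 60 dB stopband attenuation specs are made-up illustrative values) estimating the analog Butterworth order needed to pass 20 kHz and be quiet by Nyquist, at a 44.1 kHz capture rate versus 96 kHz.

```python
# Rough sketch of why headroom above 20 kHz matters for the analog
# anti-aliasing filter. Ripple/attenuation specs are illustrative.
import numpy as np
from scipy.signal import buttord

def antialias_order(passband_hz, stopband_hz, gpass_db=1, gstop_db=60):
    """Butterworth order for an analog low-pass meeting the given spec."""
    order, _ = buttord(2 * np.pi * passband_hz, 2 * np.pi * stopband_hz,
                       gpass_db, gstop_db, analog=True)
    return order

# CD: keep 20 kHz, be ~60 dB down by Nyquist (22.05 kHz) -> brutally steep filter
print(antialias_order(20_000, 22_050))   # dozens of poles

# Capturing at 96 kHz: same passband, Nyquist at 48 kHz -> gentle filter
print(antialias_order(20_000, 48_000))   # single-digit order
```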
I should have clarified: what I meant was "From a scientific perspective, these should be exactly identical to the human ear". For music listening, what bats or hypothetical aliens can hear is irrelevant and wasn't factored into this context. I also didn't want to go off on a tangent explaining human hearing range and Nyquist-Shannon.
That really has more to do with shutter angle. The classic film look is 24 fps with a 180 degree shutter angle. Games, where the "image" is constructed on demand and inherently has no motion blur, and movies, where each image is a digital or analog photograph, are different things.
In that respect, it's not really an "upgrade" to raise the frame rate for films the way it is in games, which is why film is still 24 (or 23.976) fps to this day.
Gemini Man was actually projected at the full 120fps in theaters that supported it (which these days is just about any auditorium equipped to show 3D movies).
In my experience, it's not about seeing the difference, it's about feeling it.
There's a responsiveness to moving the cursor/character at higher framerates that doesn't happen at 60. The interaction part is crucial because we're used to moving physical objects and having an instant response.
The perceptible improvement to motion quality is distinct from reaction time. Eyes have persistence of vision: what you perceive is an average over a period of time. This is why fast moving objects blur. With screens, you can see every frame drawn averaged together as your persistence of vision trails off.
This is easiest to see with fast motion, because there's a large distance between frames. For instance, try moving your mouse or scrolling quickly. On 60fps screens you'll see the image break up into multiple superimposed frames (LCDs with slow pixel response times will exaggerate this even more). Higher refresh rate displays show more in-between frames for motion, resulting in smoother blur as objects move.
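The per-frame gap is just speed divided by refresh rate; a quick back-of-the-envelope calc (the 2000 px/s cursor speed is an arbitrary example value) shows how much more densely a higher refresh rate fills in the motion:

```python
# How far apart successive frames land for a fast-moving object.
def gap_per_frame(speed_px_per_s, refresh_hz):
    """Distance the object jumps between two consecutive refreshes."""
    return speed_px_per_s / refresh_hz

for hz in (60, 144, 240):
    print(f"{hz:>3} Hz: {gap_per_frame(2000, hz):.1f} px between frames")
# 60 Hz leaves ~33 px gaps, 144 Hz ~14 px, 240 Hz ~8 px, so the smear your
# persistence of vision averages over is much more densely filled in.
```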
The qualitative difference is a smoother smear rather than multiple frames with gaps overlaid. This is particularly noticeable with text, since it's high contrast with fine details. At 60fps, fast-scrolling text jumbles due to discrete frames overlapping; at higher refresh rates the motion is easier to interpret because it's more apparent as a smear.
Another notable detail is high refresh rate displays are often synonymous with faster pixel response time. Many 60fps LCDs have pixels which take longer than a single frame to transition between colors, which exaggerates the trail effect. This has a direct impact on image clarity during motion because you're seeing a longer trail of previous frames overlapping your latest one.
Try VR and you'll definitely see a difference. Less than 60 and you might just throw up.
I can see a night and day difference. I've had my refresh rate reset to 60Hz a few times and I could see it immediately just from moving my cursor. It's even more noticeable to me when moving a window around.
I'm the same way. The few times a driver update reset me to 60hz I noticed the moment I got control of the mouse. Over 100hz I start to notice changes less but 60 to 144hz is a huge difference both in feel and how smooth everything moves.
Did you verify in the Nvidia or AMD control panel that the monitors were actually set to 144Hz? That isn't always automatic, plenty of monitors default to 60Hz for some reason.
You probably can too if it's the other way around. Play on a 240hz screen for an hour or two and switch back to 60hz at random and you can probably see the delays.
I prefer to live in ignorance rather than spending money upgrading my rig to the standard I would then want.
Haha, I don't fault you for that reasoning.
My own panel is 75hz. I'm fine with it as long as I don't see too many higher refresh rate screens. Unfortunately, my own phone has 120hz so I can't fully act like it doesn't exist.
It's harder to tell if you're just watching some videos or someone else playing/operating. It's much more noticeable when you're both seeing and feeling it, like scrolling through a web page or moving the mouse around. Personally I can feel something is not right or laggy if my 120hz TV somehow resets to 60hz when I'm using it as my computer screen. I probably can't tell if I'm just watching someone else play on it.
Games by and large aren't experienced as sequences of sudden, discrete instances of change. Instead, we estimate (not to mention control) where things will be by the time we click based on the motion we see and act accordingly. The capacity to represent fluid motion helps to this end: higher frame rates make it easier to estimate and anticipate motion because they give us more information to act on.
Maybe it makes a difference in some games, but in the games I've tried, anything past 60hz looks all the same to me. Tangentially, most of the time I can barely distinguish 1080p from 4k either.
Have you actually used different monitors with higher resolutions/refresh rates or have you used a 1080p60Hz monitor to test with?
I have used my own 1080p, 144hz monitor, and several 4k televisions.
Agreed wholeheartedly! I perceive the smoother motion of higher refresh rate screens as being more present and true to life than 60fps and below. It was amazing the first time I saw one, after over 25 years of screens always having the same speed limit.
Seconded, I was thrilled when I found a 75hz LCD display.
It was sad that my display was nicer in 2004 than in 2018... a 1600x1200 CRT at 85hz.
That is just... not even close to how addiction works.
Changing the millisecond scale of visual updates will not alter your brain reward mechanisms.
Source: PhD Cognitive Science.
I don't think you have any idea what refresh rate is. It doesn't magically make you able to do things faster. Higher refresh rates make motion more fluid with more detail; they don't speed things up.
Now I want to know what the reply was X__X
They basically said that with higher refresh rates they can switch between pages/apps faster to further increase dopamine and become addicted more easily.
My grandpa refused to wear glasses because he thought correcting his vision would weaken his eyes by making them reliant on the lenses. That sort of reminds me of this. I'm sure it increases the ease with which information is transferred and interpreted, but I don't think it'll reshape how your brain interprets information.
Well, using corrective lenses when they're only marginally necessary can weaken your eyes. Of course, it's something you should talk to your eye doctor about if you can get away with not wearing glasses all the time.
I would not recommend this. My understanding is that going entirely without correction will lead to your vision worsening.
Slight blurriness at distance can reduce eye strain when you're e.g. around the house.
I don't know how accurate this is, but my understanding is that too much up-close work (e.g. reading books) causes pseudomyopia from the eye strain. In children, this typically gets "corrected" with lenses. The additional strain from lenses leads to lens-induced myopia, where the eyeball elongates to reduce stress on the internal lens of the eye and make it easier to keep your vision in focus.
The cycle of needing stronger prescriptions repeats itself until you're finally old enough that your eyeballs stop adapting.
As an adult, there isn't much you can do besides protecting yourself from eye strain. I wear glasses 0.5 diopters less than my "full strength" prescription around the house, and only put on my full-strength glasses when I'm driving. Sometimes I use computer glasses that are reduced further if I'm glued to the screen.
I've read some credible accounts and guides about reversing myopia, like cliffgnu's. As somebody who prefers to spend my life staring into screens for one reason or another, I don't have the long-term discipline to "reverse" my myopia (if such a thing really is possible as an adult), but my vision has at least stopped changing. Perhaps as a result of maining lower prescription glasses, taking more vision breaks to reduce strain, and just getting older.
The main idea is that, if you're not raising your prescription/not wearing glasses, your eyes should only be working a little bit to accommodate the blurriness. Any more and your eyes/visual system will basically stop trying, and you might find yourself engaging in bad habits like putting a screen all the way up to your face to read it. In either case, you definitely should talk with an eye doctor about it.
Back when my vision was good enough that I could get away without wearing glasses (one eye of mine is worse than the other, and back then the good eye was still 20/20 and compensating for the other one), my eye doctor wanted me to wear my glasses when doing things like looking at the blackboard or watching movies. I didn't, because I didn't like the feeling of glasses on my face and I didn't notice much perceivable difference unless I closed one eye. I doubt this is the only reason my vision later got worse, but I do wonder if it contributed...
I'm not particularly concerned about it, but I don't think it's worth paying for. Sure, you can learn to tell the difference, but why cultivate expensive tastes when you don't need to?
(Similarly, I don't think my life would be improved by learning to like expensive food or drinks.)
High refresh rate monitors mostly work out of the box. Getting VRR working correctly is a bit of a mixed bag and still being actively worked on, at least in GNOME.
One thing I've run into, if you decide to upgrade, is that with the latest 7xxx amdgpu drivers (the issue is apparently also present in their Windows drivers), if you have two monitors with different refresh rates your GPU will never clock down its memory clock and your idle power usage will be higher. I have a 165 Hz and a 144 Hz monitor and I have to keep both at 144 Hz to avoid the issue.
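If you want to check whether you're affected, the amdgpu driver exposes its memory clock power states through sysfs; here's a small sketch, assuming the usual /sys/class/drm/card*/device/pp_dpm_mclk path (the card index, and whether the file exists at all, depends on your kernel and driver version):

```python
# Quick look at the amdgpu memory clock DPM levels via sysfs.
# Path and file format assumed from typical amdgpu setups.
from pathlib import Path

for mclk in sorted(Path("/sys/class/drm").glob("card*/device/pp_dpm_mclk")):
    print(mclk)
    for line in mclk.read_text().splitlines():
        # The currently selected memory clock level is marked with '*'.
        marker = "  <-- current" if line.strip().endswith("*") else ""
        print(f"  {line.strip()}{marker}")
# If the '*' never drops to the lowest level while the desktop is idle,
# you're likely hitting the mixed-refresh-rate idle power issue.
```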
Sidenote, anyone happen to have any monitor recommendations for a 4k 144 hz monitor, 27 or 32 inch?
Really depends on your budget, but if you get a 27in monitor I would highly recommend going with 1440p rather than 4k. 1440p is really the sweet spot at 27in, and the visual fidelity is going to be very similar to 4k on a larger screen if you're thinking about pixel density.
I have a ton of 1440p 27in monitors for various use cases, especially gaming and productivity. 1440p is also less strain on my GPU than 4k. I do have one 32in 4k monitor though, and personally I would say 32in is the bare minimum screen size that I'd consider going 4k on. Any smaller and things become so tiny on the screen that you need to bump up scaling anyway to make things fit better.
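For reference, pixel density is just the diagonal pixel count over the diagonal size; a quick sketch comparing the sizes being discussed (resolutions and diagonals are the ones mentioned above):

```python
# Pixel density (PPI) for the monitor sizes discussed in this thread.
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch along the diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

for name, w, h, diag in [
    ("27in 1440p", 2560, 1440, 27),
    ("27in 4K",    3840, 2160, 27),
    ("32in 4K",    3840, 2160, 32),
]:
    print(f"{name}: {ppi(w, h, diag):.0f} PPI")
# ~109, ~163, and ~138 PPI respectively: 4K at 27in is the densest,
# with 4K at 32in sitting between that and 1440p at 27in.
```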
I have the opposite opinion: 22" is the 4K sweet spot. That is where you get a good pixel doubling from standard 1080p. 27" is for 5K.
And here I'm stretching 4K onto 32 inches.
Your question as it stands is a bit broad. It highly depends on your use cases. For media consumption and gaming the newer QD-OLED monitors are awesome, but due to the subpixel layout (and the risk of burn-in) they are less suitable for productivity tasks involving a lot of text. That's just one example, but there are more things that play into getting a new monitor.
Here are rtings' current 4k recommendations. They also have a great matrix view where you can filter based on your feature preferences. I avoid Dell, since I've had coil whine issues with them in the past.
I'm personally waiting for early next year. ASUS teased a 32" 4K QD-OLED (model PG32UCDM), though the details are currently sparse. Those are my ideal specs, until PHOLED becomes prevalent, though it may be diminishing returns from here.
Double the average DPI on desktop monitors and then we're talking! I got an exceptional deal on a QD-OLED TV for the living room recently and the picture is stunning, but too many years of retina displays means I'm definitely picky about sharpness on anything I'll be sitting closer to.
Haha, fair, I was thinking about color and contrast when I wrote that, not resolution. 4K at 27" looks pretty good to me. That upcoming ASUS monitor will be 32", so still some room for improvement DPI-wise, though higher resolutions become harder to drive. When my partner upgraded to a 4K display her laptop struggled to keep up, so she wasn't able to get much benefit until upgrading her computer.
Samsung Odyssey Neo G7. It's mini-LED, not OLED, so you don't have to worry about burn-in. The HDR performance of the Neo G7 is so close to OLED monitors that you will not be able to tell the difference. Monitors Unboxed on YouTube has a great video comparing this monitor to a 4K OLED if you want to check for yourself.
Don't bother with the Neo G8; it has pretty bad scanline issues at 240Hz, so it's really just a more expensive version of the Neo G7.
You should also be aware that the Neo G7, even though it is a 32-inch 16:9 monitor, has a pretty aggressive curve. I personally didn't have an issue with it for playing games or "productivity tasks", but some people find the curve too aggressive for the monitor's form factor.
It's like a free upgrade for you! I've been mulling over a 360hz upgrade.
Does it really make colours richer?
Not by itself. Smoother motion can make moving objects easier to distinguish and separate from each other; maybe that could give the appearance of richer colors.
Faster pixel gray-to-gray response times can make a difference because you're seeing less of the previous frame superimposed on the current one.
That would explain seeing better colours on a 144hz display running at 60hz, but I expect a 144hz display running at 144fps probably spends the same proportion of time switching between frames as a 60hz display at 60fps.
In the 144hz @ 144fps case the frames will be more similar to each other than at 60fps, so even if GtG is constant there will be less delta between frame values. Pixels will also start changing sooner than they would if they had to wait for the next 60fps frame. These sound like minor differences, but they make motion smears appear more vibrant since the in-between frames fill the gaps between the 60fps positions better.
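Here's a toy simulation of that effect (the 4 ms time constant, the half-second linear brightness sweep, and the simple first-order response model are all arbitrary assumptions for illustration): a pixel chasing a moving target tracks it more closely at 144 Hz than at 60 Hz even with identical GtG behavior.

```python
# Toy model: a pixel with a fixed gray-to-gray lag chasing a brightness
# ramp, commanded at 60 Hz vs 144 Hz. All constants are illustrative.
import numpy as np

def tracking_error(refresh_hz, tau_s=0.004, sweep_s=0.5, dt=1e-4):
    t = np.arange(0, sweep_s, dt)
    ideal = t / sweep_s                                            # what should be shown
    commanded = np.floor(t * refresh_hz) / (refresh_hz * sweep_s)  # value held each frame
    actual = np.zeros_like(t)
    for i in range(1, len(t)):
        # first-order (exponential) pixel response toward the commanded value
        actual[i] = actual[i - 1] + (commanded[i] - actual[i - 1]) * (dt / tau_s)
    return np.sqrt(np.mean((actual - ideal) ** 2))

print(tracking_error(60))    # larger RMS deviation from the ideal ramp
print(tracking_error(144))   # smaller: steps are finer and arrive sooner
```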
You can see this in action on https://www.testufo.com. On a high refresh rate display, look at the red UFO body and the yellow capsule. Assuming the GtG is constant between the 60fps and 144fps rows, the colors on the 144fps row look noticeably sharper due to better contrast in the motion.