If you've ever wondered why there's tearing in your video, where the bottom and top don't seem to match when you play certain video games, or why televisions are so keen to advertise their Hertz, Tom's explanation of Variable Refresh Rate will answer your questions and more.
Featuring Tom Merritt.
MP3
Please SUBSCRIBE HERE.
A special thanks to all our supporters–without you, none of this would be possible.
Thanks to Kevin MacLeod of Incompetech.com for the theme music.
Thanks to Garrett Weinzierl for the logo!
Thanks to our mods, Kylde, Jack_Shid, KAPT_Kipper, and scottierowland on the subreddit
Send us email to feedback@dailytechnewsshow.com
Transcript:
I’m playing this video game but the screen keeps breaking in half and stuttering all over the place.
My friend says I need VRR.
Is that like virtual reality with another R for some reason? How’s that supposed to help?
Confused? Don’t be, let’s help you know a little more about Variable Refresh Rate.
Variable Refresh Rate, or VRR, varies a display’s refresh rate on the fly in order to prevent tearing and stutter in the video, as well as help maximize power efficiency.
Before we can understand VRR though, you need to know what the refresh rate actually is. A lot of you have heard something like 120 Hertz as a refresh rate and you know a bigger number is better. Smoother. But what is it exactly?
The refresh rate is the number of times per second that a display shows a new image.
This is different from the frame rate. The frame rate is for the device. It's the number of times per second that the device, like the PC or the phone, generates an image.
You need both to be high for the smoothest image. If the device is generating a low frame rate, a high refresh rate doesn't really matter. And likewise, a high frame rate wouldn't matter as much if the display couldn't refresh fast enough to keep up.
Refresh rate was very important in CRT monitors, because they didn’t keep the image up for very long. The pixels fired light and then faded. You had to refresh them in order to maintain an image, hence the flicker caused by CRT monitors.
LCD-based monitors don't have that flicker problem. Liquid crystals stay in place; they don't fade unless you tell them to go dark.
Now, refresh rate measurements were created for CRTs. CRTs worked by drawing horizontal lines from top to bottom fast enough that the lines were refreshed before they faded out. And that rate was measured in Hertz.
Hertz is a unit of measurement for the number of cycles of something per second. Display refresh rates measure how many times the entire screen can refresh all of its lines, whether that's 1024 or however many it has. 90 Hertz meant the screen could run through 90 cycles of refreshing itself per second. The higher the refresh rate, the less perceptible the flicker, which made it especially important for CRT monitors.
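To put some numbers on that, here's a quick sketch in Python (the function name is just for illustration) converting a refresh rate into the time each refresh cycle takes:

```python
# Time per refresh cycle at a given refresh rate: one cycle takes 1/hertz seconds.
def refresh_interval_ms(hertz: float) -> float:
    return 1000.0 / hertz

for hz in (60, 90, 120):
    print(f"{hz} Hz -> {refresh_interval_ms(hz):.2f} ms per refresh")
# 60 Hz -> 16.67 ms, 90 Hz -> 11.11 ms, 120 Hz -> 8.33 ms
```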
Of course now, we mostly use LCD displays or something similar where the pixels, much less the lines, don’t fade. However you do need to refresh the pixels to show moving images. And LCDs often still use the metaphor of lines, even though only the pixels that change are refreshed in each pass through.
But since you're only changing the part of the image that changes, flicker isn't an issue (except maybe with the backlight in some cases, but that's a separate issue). Still, you're only changing those pixels every so often. And the refresh rate on LCD monitors is still important, because it's the limiting factor on the frame rate.
And knowing all that will help us understand the benefit of Variable Refresh Rate.
Remember a video is a series of frames. The fast refresh of the frames gives the illusion of movement.
Refresh rate means every fraction of a second, the display works through all its lines to change pixels based on the new information in a frame.
With a fixed refresh rate, a new frame can only be shown when the refresh interval is over, when the display begins refreshing the image from the top.
For simplicity, let’s say the refresh rate is one time per second. And let’s then say the frame rate is one time every 1.5 seconds.
The first frame arrives at the beginning of the first refresh cycle. Great.
One second passes and the new refresh cycle begins. But there’s no new frame yet, so the image stays the same.
One and a half seconds into this whole cycle, and a half second through the second refresh, the new frame arrives, but the refresh hasn’t finished, so the new information waits.
Two seconds pass and now the second frame can go in during the third refresh. But that image is now a half second old.
Once we start the fourth refresh, the third frame is delivered, now right at the beginning of the refresh cycle, and things are back in sync.
But even in this simplistic example you can see the stutter that's going to happen. You had one frame up for two seconds, then one up for a second. Every other frame is going to stay on screen twice as long as the one after it. Instead of a smooth video, you get a jumpy one.
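If you want to see that arithmetic play out, here's a minimal Python simulation of the example above; the one-second refresh and 1.5-second frame interval are the numbers from the walkthrough, and the names are mine:

```python
import math

REFRESH_INTERVAL = 1.0  # seconds per refresh (the 1 Hz display from the example)
FRAME_INTERVAL = 1.5    # seconds between new frames from the device

def first_refresh_at_or_after(t: float) -> float:
    # A new frame can only appear at the next refresh boundary after it arrives.
    return math.ceil(t / REFRESH_INTERVAL) * REFRESH_INTERVAL

shown = [first_refresh_at_or_after(n * FRAME_INTERVAL) for n in range(5)]

# Each frame stays on screen until the next one is shown.
for n in range(4):
    print(f"frame {n}: on screen for {shown[n + 1] - shown[n]:.1f} s")
# frame 0: 2.0 s, frame 1: 1.0 s, frame 2: 2.0 s, frame 3: 1.0 s -- the judder
```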
You can fix this lag by trying to use info from the new frame mid-refresh. Remember, in LCDs it's not actually drawing a whole new frame each time, just updating pixels. Let's say a ball is moving to the right at the bottom of the screen. If halfway through the second refresh, before the bottom of the screen has been updated, you incorporate the new frame, then you can show the ball's new position at the right time, avoiding lag and stutter.
That's all well and good until you're trying to update something in the picture that no longer matches the half that was already refreshed. Like if you're walking down a path and start to turn, so the path now appears in a different part of the image.
That’s when you get something called tearing, where, say, the bottom half of the image doesn’t line up with the top. There’s all kinds of computations that can anticipate and adapt for this sort of thing, but it adds overhead and reduces the performance of the video display.
OK so just set the refresh rate and the frame rate to be the same right? Then everything is smooth. Done and done.
Sort of. You can do this. The problem is, video games sometimes change frame rate during gameplay. Some of it varies by a few frames depending on what you're doing in the game. And sometimes it varies by design. There's a tradeoff between scene complexity, resolution, and the frame rate that a particular GPU can achieve. Lowering the frame rate leaves more power for other things, like shadow effects, textures, or advanced ray tracing. So a game might go for 30 frames per second at some point when it wants to have a stunning slow-mo or underwater scene, then ramp up to 120 frames per second for fast combat action.
A variable refresh rate lets the display adapt, within a range, to whatever frame rate the device is generating. So you might have a variable refresh rate between, say, 40 Hertz and 120 Hertz. That's pretty common.
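As a sketch, assuming that common 40 to 120 Hertz window (the names here are illustrative, not any vendor's API), the logic is as simple as it sounds: the display's refresh rate just follows the frame rate while it stays in range:

```python
# A VRR display tracks the source's frame rate, as long as that rate
# falls inside the panel's supported window (40-120 Hz in this example).
VRR_MIN_HZ = 40
VRR_MAX_HZ = 120

def vrr_refresh_rate(frame_rate: float) -> float | None:
    if VRR_MIN_HZ <= frame_rate <= VRR_MAX_HZ:
        return frame_rate  # the refresh rate simply follows the frame rate
    return None            # outside the window, VRR alone can't stay matched

for fps in (30, 48, 90, 144):
    hz = vrr_refresh_rate(fps)
    print(f"{fps} fps -> {f'{hz} Hz' if hz else 'out of VRR range'}")
```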
VRR first came to video cards by way of V-Sync. Sort of.
V-Sync kind of flipped the script. It worked by matching the GPU's frame rate to the display's refresh rate. So you solved the sync issue, but since you didn't actually match the game's real frame rate, you occasionally still had judder, where the same frame stayed up too long and then a new frame appeared too briefly. This wasn't a problem if the game's frame rate was higher than the display's refresh rate, since the GPU had more frames to work with; it only showed up when the frame rate dipped below the refresh rate. Later, Nvidia's Adaptive VSync fixed things by turning off V-Sync whenever frames per second dipped below the monitor's refresh rate.
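Reduced to a sketch (these names are mine, not Nvidia's actual API), the Adaptive VSync decision looks something like this:

```python
# Enable V-Sync only when the GPU is keeping up with the display;
# below the refresh rate, V-Sync off (tearing) beats judder.
DISPLAY_REFRESH_HZ = 60

def vsync_enabled(current_fps: float) -> bool:
    return current_fps >= DISPLAY_REFRESH_HZ

for fps in (45, 60, 90):
    state = "on" if vsync_enabled(fps) else "off"
    print(f"{fps} fps on a {DISPLAY_REFRESH_HZ} Hz display -> V-Sync {state}")
```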
But then Nvidia's G-Sync came along and actually matched the display's refresh rate to the frame rate. True VRR. But that required the display to also support G-Sync. AMD came out with their own version of this, called FreeSync.
G-Sync and FreeSync each come in three levels.
AMD's entry-level FreeSync reduces tearing and latency. The second level, FreeSync Premium, supports 120 Hertz. And the third level, FreeSync Premium Pro, adds improved HDR tone mapping, another benefit of VRR.
Nvidia's first level, G-Sync Compatible, does not require an Nvidia GPU, just a display that supports it. That gets you tear-free and stutter-free visuals. The second level, G-Sync on the GPU, supports 120 Hertz. And the third level, G-Sync Ultimate, adds improved HDR tone mapping and promises even lower latency.
Both G-Sync and FreeSync are proprietary, though. So for everything but the G-Sync Compatible level, you have to have a card that supports VRR and a display that supports your card's version of VRR. Not all displays support them.
However, VRR is now part of HDMI 2.1, for resolutions up to 4K and frame rates up to 120 frames per second. And the Xbox One S and X, Xbox Series X and S, and PS5 game consoles all support it. I mean, you still have to have the device, the HDMI cable, and the display all supporting it, but having it as part of the HDMI standard makes it easier for everyone to implement.
A couple of notes there, however. HDMI 2.0 devices like the Xbox One only support VRR up to 60 Hertz. And VRR is optional in HDMI 2.1, so just because a TV is HDMI 2.1 does not necessarily mean it supports VRR. You have to look at the specs.
And then, remember I said a display that supports VRR supports a range of refresh rates. Most support between 40 and 120 Hertz. But also recall that I used 30 frames per second as an example of a game maximizing visual quality. That really happens. Slower-paced adventure games in particular do it.
So if you have VRR from 40 to 120 Hertz, you're still going to have mismatch problems between the frame rate and the refresh rate below 40. Some displays have a fix for this called Low Framerate Compensation, or LFC. If a game is generating 30 frames per second, LFC sets the refresh rate to double that. So the display is refreshing twice for every frame, but at least it's in sync. It does reduce power efficiency though.
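Here's that multiplier trick as a small sketch, again assuming a 40 to 120 Hertz window (the function is illustrative, not any display's actual firmware):

```python
# Low Framerate Compensation: when the frame rate drops below the VRR window,
# repeat each frame at the smallest integer multiple that lands back inside it.
VRR_MIN_HZ = 40
VRR_MAX_HZ = 120

def lfc_refresh_rate(frame_rate: float) -> float:
    if frame_rate >= VRR_MIN_HZ:
        return min(frame_rate, VRR_MAX_HZ)  # in range: normal VRR, no compensation
    multiplier = 2
    while frame_rate * multiplier < VRR_MIN_HZ:
        multiplier += 1
    return frame_rate * multiplier

print(lfc_refresh_rate(30))  # 60: two refreshes per frame, but back in sync
print(lfc_refresh_rate(15))  # 45: three refreshes per frame
```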
Oh, and another consideration: your AV receiver. If your surround sound is getting its audio directly from the device, the receiver needs to support VRR too, to keep the sound in sync with the frames. Most AV receivers do not support VRR. The workaround is to use HDMI to send the signal to your TV, which does support VRR, and have the TV pass the sound on to the receiver. You can do this by connecting an optical cable from the TV or by using HDMI ARC or eARC.
And one last caveat. We've talked about LCDs as the example here because LCDs can update pixels independently of the light source. However, as those of you who have listened to our KALM episode on QD-OLED know, OLED screens have pixels that are also their light source. For some reason, VRR doesn't play well with that, and some OLED users report grayer, more washed-out blacks. Which is disappointing, because the big advantage of OLED over LCD is deeper blacks. There are also some reports of flickering on OLED when using VRR.
So how important is it to have VRR? Well, not all games can benefit from it, so if you aren't playing games that make use of VRR, then it's not important. If you want the best-looking picture or the lowest latency, though, it might be worth it. Generally speaking, the advice is that people who play multiplayer, sports, or fast-twitch games need VRR the most.
If you play slow-paced adventure games and don’t need the visuals to be perfect every millisecond, it may not be as big of a deal.
In other words, I hope you Know A Little More about VRR – Variable Refresh Rate.