No, the slight flicker on grid lines comes down to resolution and aliasing. Do you have Temporal AA set to Epic?
FreeSync monitors that are not really compatible will look like this:
That's a bit more complicated and leads to the never-ending "vsync: yes or no, and which mode?" discussion.
For me personally, unsynced output always stutters no matter how many Hz the monitor has. I tested unsynced frame rates from 40 up to 150 fps on a 165 Hz monitor and it all stuttered for me. Only above 180 fps did the stutter disappear, simply due to the sheer number of frames.
Only achievable in old games though...
So for me: always some kind of vsync if you don't have FreeSync/G-Sync!
Vsync modes:
- normal: fps locked to the monitor's refresh rate. 60 fps/60 Hz is the most common; 60 fps/120 Hz, 72 fps/144 Hz or 144 fps/144 Hz are less common.
On each refresh of the monitor, the image is updated. The image has to be ready before the monitor starts that refresh though, so it sits in a buffer for a moment -> input lag.
You can lower the "maximum pre-rendered frames" setting in the NVIDIA control panel to bring down the input lag. It might stutter occasionally though if your hardware can't prepare that single buffered image quickly enough!
- normal with reduced input lag: you can limit the fps to slightly below the refresh rate. Vsync still works but behaves a bit oddly. You'd limit to 59.97 fps on a 60 Hz monitor, for example.
The buffer still starts out with one full image prepared, but the limiter now falls 0.03 frames behind the refresh clock every second!
So every time that deficit adds up to a whole frame (with 0.03 fps that's roughly every 33 seconds), one refresh has nothing new to show and you'll see a very, very short stutter! However, because the buffer is constantly being drained, the input lag is greatly reduced!
I used this method most of the time! (There's a little sketch of the math right after this list.)
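To put numbers on that trade-off, here's a minimal Python sketch. It's purely illustrative: it assumes an idealized game that always hits the cap exactly, and REFRESH_HZ, CAP_FPS and SIM_SECONDS are just made-up names for this example. It counts how often a refresh has no new frame ready when you cap at 59.97 fps on a 60 Hz panel:

```python
# A 60 Hz display with the game capped slightly below the refresh rate.
# Count how often a refresh has no new frame ready, i.e. a repeated
# frame = micro-stutter.

REFRESH_HZ = 60.0
CAP_FPS = 59.97           # the cap from the example above
SIM_SECONDS = 120         # simulate two minutes

refresh_interval = 1.0 / REFRESH_HZ   # time between vblanks (s)
frame_interval = 1.0 / CAP_FPS        # time the capped game spends per frame (s)

repeats = 0
displayed = 0             # frames that have actually been shown

for r in range(1, int(SIM_SECONDS * REFRESH_HZ) + 1):
    now = r * refresh_interval                  # time of this vblank
    frames_ready = int(now / frame_interval)    # frames finished by now
    if frames_ready > displayed:
        displayed += 1                          # a new frame is ready: show it
    else:
        repeats += 1                            # nothing new: previous frame shown again

print(f"{repeats} repeated frames in {SIM_SECONDS} s "
      f"(about one every {SIM_SECONDS / max(repeats, 1):.0f} s)")
```

With these numbers it comes out to about 4 repeated frames in two minutes, i.e. one tiny hitch roughly every 30 seconds, which is why the lower input lag is usually worth it.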
Then there's another big problem with standard vsync: if your hardware can't fill the buffer in time, the same image is displayed a second time! Expressed as frame timings instead of fps, it looks like this:
60 fps with a stutter for one frame:
Frame 1 -> 16.67ms -> Frame 2 -> 16.67ms -> Frame 3 -> hardware too slow -> Frame 3 again -> 16.67ms -> Frame 4.
That leads to a gap of 33.34 ms between Frame 3 and Frame 4.
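To make that concrete, here's a small Python sketch of standard vsync pacing. It's only a model: the per-frame render times are invented to reproduce the example above, and it assumes the next frame starts rendering right after the previous one is shown.

```python
import math

REFRESH = 1000.0 / 60.0   # one refresh interval on a 60 Hz panel, ~16.67 ms

# Invented per-frame render times (ms); frame 4 is the one the hardware
# can't finish within a single refresh.
render_ms = [10.0, 10.0, 10.0, 25.0, 10.0]

prev_shown = 0.0
shown_at = []
for t in render_ms:
    finished = prev_shown + t                         # rendering starts after the previous swap
    shown = math.ceil(finished / REFRESH) * REFRESH   # displayed at the next vblank after it's done
    shown_at.append(shown)
    prev_shown = shown

for i in range(1, len(shown_at)):
    print(f"Frame {i} -> Frame {i + 1}: {shown_at[i] - shown_at[i - 1]:.2f} ms")
```

The slow frame 4 prints as a 33.33 ms gap (33.33 vs. 33.34 is just rounding of 1000/60), while everything else stays at 16.67 ms.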
Now there are different solutions to this problem that don't require buying a variable refresh rate panel (FreeSync/G-Sync):
- adaptive vsync: if the fps drops below 60, the syncing simply switches off for a moment, resulting in tearing (see the first sketch after this list). The timing will look like this:
Frame 1 -> 16.67ms -> Frame 2 -> 16.67ms -> Frame 3 -> e.g. 20.78ms -> Frame 4 -> 12.56ms (33.34 minus 20.78) -> Frame 5.
The problem: if you also limit the fps for reduced input lag, adaptive vsync switches on and off far too often and the result is a mess!
- fast: your PC is allowed to render as many fps as it can. The display still grabs an image from the buffer every 16.67 ms. You can already see the problem... Depending on the fps your hardware can output, the timings will look like this:
Frame 1 -> 16.67ms -> Frame 3 -> 16.67ms -> Frame 4 -> 16.67ms -> Frame 7 -> 16.67ms -> Frame 8.
So you get a tear-free image, but the frame that gets pulled from the buffer can be "old": anywhere from 16.66 ms old down to only 0.01 ms old. The result is choppy (see the second sketch below).
Only at something like 130 fps on a 60 Hz monitor does the "age" of the individual frames stop being a noticeable problem.
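First sketch, adaptive vsync: a frame that finishes before its vblank waits for it, while a frame that misses it is presented immediately, and that immediate present is where the tearing comes from. Again only an illustration; the render times are made up to roughly match the numbers above.

```python
import math

REFRESH = 1000.0 / 60.0   # ~16.67 ms per vblank on a 60 Hz panel

# Invented render times (ms); frame 4 takes longer than one refresh interval.
render_ms = [10.0, 10.0, 10.0, 20.78, 10.0]

prev_shown = 0.0
shown_at = []
for t in render_ms:
    finished = prev_shown + t                                     # rendering starts after the previous present
    next_vblank = math.floor(prev_shown / REFRESH + 1) * REFRESH  # first vblank after the previous present
    if finished <= next_vblank:
        shown = next_vblank    # fast enough: wait for the vblank, no tearing
    else:
        shown = finished       # too slow: present immediately, the image tears
    shown_at.append(shown)
    prev_shown = shown

for i in range(1, len(shown_at)):
    print(f"Frame {i} -> Frame {i + 1}: {shown_at[i] - shown_at[i - 1]:.2f} ms")
```

That prints 16.67, 16.67, 20.78 and 12.55 ms, reproducing the pattern above (give or take rounding).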
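Second sketch, fast sync: here the GPU is assumed to render uncapped at a flat 100 fps (a simplification, real output fluctuates), and at every vblank the newest finished frame is scanned out. The on-screen spacing is a perfectly even 16.67 ms, but the "age" of what you see jumps around, and that uneven age is the choppiness.

```python
REFRESH = 1000.0 / 60.0    # the display grabs a new image every ~16.67 ms
RENDER = 1000.0 / 100.0    # assumed uncapped render rate: 100 fps, 10 ms per frame

# At every vblank, fast sync scans out the newest frame that has finished
# rendering, however long ago that was.
for r in range(1, 6):
    vblank = r * REFRESH
    newest = int(vblank / RENDER)       # index of the newest completed frame at this vblank
    age = vblank - newest * RENDER      # how old that frame already is when it hits the screen
    print(f"Refresh {r} at {vblank:6.2f} ms shows frame {newest} (already {age:5.2f} ms old)")
```

It shows frames 1, 3, 5, 6, 8 with ages bouncing between roughly 0 and 6.7 ms; some frames get skipped entirely and the ones that are shown are unevenly "fresh", exactly the effect described above.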
---------------------------------
I hope this helps. It's a big topic, and in the end I decided to buy a nice G-Sync monitor to get rid of all these problems instead of upgrading my hardware. Best decision since I've had my own PC!