How many FPS for a really smooth image

Hey guys,

so I am playing around with my graphics settings quite a lot at the moment, as I want to maximise the eye candy ACC delivers. While doing so, I get the feeling that ACC, or to be precise Unreal Engine 4, only delivers a smooth experience above 90 FPS. Is this just me, or can anyone relate?

Hints/tips on the optimal FPS for UE4 are welcome. :)
 
This is what I have too, and it is indeed very smooth.

But I would not be as restrictive as you.
I have a 144 Hz gsync monitor, and from my own experience, ACC can run very well with far fewer fps. I would say it is okay from 45 fps with gsync, and it becomes perfect from about 60 fps.
In my case, I suppose that gsync plays a huge role in the game's fluidity. No stutter at all, with or without AI.

The only lag I can observe is when someone is entering a server online

Just my own impression.
 
Personally for my own eyes, 50 fps and above is fluent enough. 60 is good, above 75 is really smooth and above 90 it becomes silky smooth.
At my university I was able to test the maximum my eyes can see with a strobe light, the kind used to check the RPM of drilling, turning and milling machines.
It's around 200 flashes per second where I started to see "just a light" without flickering.

I have a gsync monitor so I directly see the fps on monitor 1:1. For me this is game independent and above 60 fps is always definitely fine.

Without gsync or freesync though, I always need vsync or 180+ fps to make it smooth.

You don't seem to use any syncing method?
Then it also depends on the game itself how visible the tearing and stuttering becomes, and it would make sense that you need 90 fps in ACC.
Maybe try vsync with "prerendered frames = 1" in the GPU drivers and an fps limit of 59.97 via RTSS to get rid of any input lag while keeping the vsync working :)
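To put rough numbers on that cap, here is a minimal back-of-the-envelope sketch in Python. It assumes a 60 Hz display and the 59.97 fps limit mentioned above; the values are illustrative, not a measurement of ACC itself.

```python
# Arithmetic behind capping the fps just below the refresh rate.
# Assumed values: 60 Hz display, 59.97 fps limit (e.g. set in RTSS).

refresh_hz = 60.0
fps_cap = 59.97

frame_time_ms = 1000.0 / fps_cap        # ~16.675 ms between rendered frames
refresh_time_ms = 1000.0 / refresh_hz   # ~16.667 ms between display refreshes

# The game falls behind by the fps difference, so one refresh has to show
# a repeated frame roughly every 1 / (refresh_hz - fps_cap) seconds.
seconds_per_repeated_frame = 1.0 / (refresh_hz - fps_cap)

print(f"frame time at the cap   : {frame_time_ms:.3f} ms")
print(f"display refresh interval: {refresh_time_ms:.3f} ms")
print(f"one repeated frame about every {seconds_per_repeated_frame:.0f} seconds")
```

So the cap keeps the vsync buffer drained (less input lag) at the cost of one barely noticeable repeated frame every half minute or so.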
 
I have a 144hz freesync monitor as well and an nvidia card, but my monitor isn't officially supported, even though in the control panel I have 2 options: gsync compatible or fixed refresh. I get between 80 and 120 fps on- and offline, and it's all smooth apart from a couple of corners where it seems to very slightly stutter.
 
Hey Rasmus,

I have an LG FreeSync 34" 144Hz monitor and a 1080 Ti. As the LG monitor supports G-Sync Compatible, I have that activated. Do I also have to activate V-Sync in ACC as well, or do I just activate G-Sync in the Nvidia settings for ACC? I am not 100% sure.

60 FPS usually feels super smooth for me as well. That's why I was confused by the game feeling "jiggly".
 
With gsync it's pretty simple:
Select it to be active in the nvidia driver menu, run the game in fullscreen and it should be active.
However, sometimes it is a bit bugged and won't become active until you tab out and back in, or something like that.
Vsync doesn't need to be active anywhere if you have set an fps limit below the monitor's maximum refresh rate.
But I doubt you get more fps than the 144 maximum, so that shouldn't be the issue.

So there are two things to find out:
- Is gsync really running? You should have an "fps counter" in your monitor's control panel. It actually shows the Hz of the monitor.
It should be a steady 144 while on the desktop and, once in a game, fluctuate to match the fps counter of the game. If you select "windowed mode" or "borderless windowed" in the ACC settings, the Hz counter should go to a steady 144 again and no longer match any fps counter.

The other method would be to download and install the Nvidia Profile Inspector, go into the game's profile and activate the setting "show gsync indicator". It's hidden in the normal driver but still there.
It will show a see-through overlay on the left side of the monitor the moment gsync is really active.


The second thing to check: are your fps fluctuating massively? With gsync you won't see tearing etc but if the fps drop from 110 down to 75 within a few tenths of a second you will still see stutter.
Since my system is mainly cpu limited nowadays (I7 2600k), my fps fluctuate a lot.
I'm limiting all my games at 60 fps to keep it leveled for everything. Only the dirt games are set to 80 fps because it really helps there.
 
Thanks for the hint. I've never seen an "fps counter" for the Hz. I will search for that one.

My fps are very balanced. I usually run 90-120 fps with everything maxed out at 3200x1350 pixels.

I will check all of your things tonight, when I am back home and will get back here then. :)
 
The smoothest result is always synced to your display refresh rate, either via vsync or freesync/gsync.

If that requirement is met, the only way to improve the perceived smoothness is to buy a higher refresh rate monitor.

FPS alone does not equal smoothness, at all.
 
Currently 'working' from home on my PC, which is also my gaming PC, so I gave it a go to quickly put together a better explanation of it all.

Profile Inspector from the developer's homepage (German):
This is the version I'm using and I know it will work as intended!

https://orbmu2k.de/tools/nvidia-inspector-tool
Scroll down to the bottom of the post, before the comments start, and you'll see:
Downloads:
NVIDIA Inspector Tool – Latest Version

Download, extract and run the exe with "profiler" in its name.

Now it will look like this:

Global Profile, set it here
black instead of grey = changed setting in this profile
[Screenshot: Indicator_Global.JPG]

Now all game profiles will look like this:
Grey = nothing changed, using the default from the global profile
[Screenshot: Indicator_Profile.JPG]

Now, how it will look on the monitor:
I activated my Hz counter and therefore had to take a photo with my phone camera. It won't show up on screenshots.

Windowed, 75 fps, 100 Hz
[Photo: upload_2019-5-13_12-41-59.png]


Fullscreen, 75 fps, 75 Hz, Gsync Indicator on
[Photo: upload_2019-5-13_12-42-22.png]



Btw, my control panel looks like this:
it's called "game enhance mode". So it might be next to the selectable crosshairs or something like that.

[Photo: upload_2019-5-13_12-46-2.png]
 
All sooo complicated lol. If it says gsync compatible in the cp, does that mean it is, or do all nvidia control panels have the same options for gsync and freesync?
Afaik if you see the option for gsync, it is compatible. If you plug in a freesync monitor and it doesn't even show the gsync option, then it is not.

However, there are different levels of quality regarding being "compatible".
Most true "gaming" monitors, like the 144hz LGs, will run flawlessly, while non-gaming monitors that do offer freesync, like the Samsung 100hz monitors or cheaper ones with only a freesync range of 50-75 Hz for example, might run into issues like flickering etc.

In the end, nvidia has a compatibility list on their page and there are plenty of reddit and forum posts about such lists too.
It's not guaranteed though to run flawlessly!


About being complicated: most gsync monitors have this Hz counter, often wrongly called an fps counter. If it fluctuates, gsync is on. If it sticks to one value, something is wrong.
I hate nvidia's direction though, I have to say:
- First they break the "clamp" value for the LOD bias, introducing flickering and pixel crawling in lots of games, but never admit it and still keep telling people in their guides to set it to clamp although they know it's not doing anything at all!

- Then they do the GTX 970 crap with "4gb" only being 3.5gb plus a slow-af last 0.5gb, trying to hide it until it was proven.

- Then the drivers start to have problems... the latest being the security and performance issues that needed a hotfix.

- And they took out the gsync overlay that you can see in my screenshot, for whatever reason...
 
Maybe that is why I get slight flicker on grid lines? The other option is just fixed refresh rate in the cp. If I use that instead of gsync compatible, would I need to cap the frame rate in game?
 
"The human eye can not see more than 24 fps"
"The human eyes can't see more than 60 fps"

Somehow the human eye can always see as many frames per second as the current industry standard determines.
It's a bit like someone saying "The telephone connection doesn't need to transmit more than the mids, around 300-3400 Hz".
Nobody would say a telephone sounds natural or like the same quality as a modern radio/tv speaker.

In the end, about 24 fps with lots of motion blur is the least we humans need to somehow see a fluent image. 50-60 looks "naturally fluent" to most humans, while almost everyone can still see a difference between 60, 90 and up to 200!
Like I said, I tested this with a strobe light that gives very, very short light pulses, and only beyond 200-something could I no longer see it flickering.

The question here is "what is needed to make it feel like I'm looking at fast movement in reality", not what the human eye can or cannot see.
And that value is different for everyone and depends on how trained the person is to spot the differences.
Normally, above 60 fps (and hz of course to actually see more than 60 fps), nobody complains.
 
Maybe that is why I get slight flicker on grid lines?
No, the slight flicker on grid lines is down to resolution and aliasing issues. Do you have the Temporal AA set to epic?

Freesync monitors that are not really compatible will look like this:
[embedded video]

The other option is just fixed refresh rate in the cp. If I use that instead of gsync compatible, would I need to cap the frame rate in game?
That's a bit more complicated and leads to the never-ending "vsync yes or no, and which mode" discussion.
For me personally, unsynced always stutters, no matter how many Hz the monitor has. I tested unsynced fps from 40 up to 150 on a 165 Hz monitor and it all stuttered for me. Only above 180 fps did it stop stuttering, due to the sheer amount of fps.
That's only achievable in old games though...

So for me: always some kind of vsync if you don't use freesync/gsync!

Vsync modes:
- normal:
fps locked to the monitor's Hz. 60 fps/60 Hz is the most common, 60 fps/120 Hz, 72 fps/144 Hz or 144 fps/144 Hz are less common.
On each refresh of the monitor, the image is updated. You need to have the image ready before the monitor calls for the refresh though, leading to a buffering time -> input lag.

You can reduce the "prerendered frames" in the nvidia settings to bring down the input lag. It might stutter occasionally though if your hardware can't prepare the one single buffered image quick enough!

- normal with reduced input lag: you can limit the fps slightly below the refresh rate. Vsync will still work but do a weird thing. The limit would be set to 59.97 fps, for example.
The buffer now starts with one full image being prepared, but each second the game falls 0.03 frames behind.
So roughly every 33 seconds (1/0.03) one refresh has to repeat a frame - a very, very short stutter! However, because the buffer is being drained all the time, the input lag is greatly reduced!

I used this method most of the time!

Then there's another big problem with standard vsync: if your hardware can't fill the buffer in time, the same image will be displayed a second time! Translated into frametimes, it looks like this:
60 fps with a stutter for one frame:
Frame 1 -> 16.67 ms -> Frame 2 -> 16.67 ms -> Frame 3 -> hardware too slow -> Frame 3 again -> 16.67 ms -> Frame 4.
That leaves a gap of 33.34 ms between Frame 3 and Frame 4.
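To illustrate that doubling, here is a minimal sketch in Python of vsync presentation on a 60 Hz display. The per-frame render times are invented for illustration (one frame deliberately takes too long), and the model ignores the driver's render queue, so it only shows the quantization effect, not exact driver behaviour.

```python
# Simplified vsync model: a frame becomes visible at the first 16.67 ms
# refresh tick after it has finished rendering. One slow frame (25 ms)
# misses its tick, so the previous image is shown again and the next
# new frame arrives 33.33 ms after the one before it.

refresh_ms = 1000.0 / 60.0                    # ~16.67 ms per refresh
render_ms = [16.0, 16.0, 25.0, 16.0, 16.0]    # hypothetical render times

shown_at = []   # time at which each new frame becomes visible
tick = 0.0      # current refresh tick
ready = 0.0     # when the next frame finishes rendering

for r in render_ms:
    ready += r
    while tick < ready:     # wait for the next refresh after the frame is ready
        tick += refresh_ms
    shown_at.append(tick)

gaps = [b - a for a, b in zip(shown_at, shown_at[1:])]
print(["%.2f ms" % g for g in gaps])   # 16.67, 33.33, 16.67, 16.67
```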

Now there are different solutions to this problem without needing to buy a variable refresh rate panel (freesync/gsync):

- adaptive vsync: with adaptive vsync, if the fps drop below 60, the syncing will just stop for a short moment, resulting in tearing. The timing will look like this:
Frame 1 -> 16.67 ms -> Frame 2 -> 16.67 ms -> Frame 3 -> e.g. 20.78 ms -> Frame 4 -> 12.56 ms (33.34 minus 20.78) -> Frame 5.

The problem: if you want reduced input lag and limit the fps, adaptive vsync will switch on/off way too often and the result is a mess!

- fast: your PC is allowed to render as many fps as it can. The frame buffer grabs an image every 16.67 ms. You can already see the problem... Depending on the fps your hardware can output, the timings will look like this:
Frame 1 -> 16.67 ms -> Frame 3 -> 16.67 ms -> Frame 4 -> 16.67 ms -> Frame 7 -> 16.67 ms -> Frame 8.

So you'll have a tear-free image, but the frame that gets put into the frame buffer can be "old". It can be 16.66 ms old or only 0.01 ms old. The result is choppy motion (see the sketch below).
Only with, let's say, 130 fps on a 60 Hz monitor does the "age" of the individual frames stop being a problem.
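To make the "frame age" point concrete, here is a minimal sketch in Python. The numbers are assumptions for illustration (a 60 Hz display and an uncapped render rate of about 100 fps); it just shows how the age of the displayed frame jitters from refresh to refresh.

```python
# Fast-sync style presentation: the GPU renders uncapped, and at every
# refresh the newest completed frame is shown. The "age" of that frame
# at scan-out varies, which is what makes motion look uneven unless the
# fps is far above the refresh rate.

refresh_ms = 1000.0 / 60.0    # 60 Hz display
render_fps = 100.0            # hypothetical uncapped render rate
render_ms = 1000.0 / render_fps

for i in range(1, 9):
    t = i * refresh_ms                   # time of this refresh
    newest = int(t // render_ms)         # newest frame finished by now
    finished_at = newest * render_ms
    age = t - finished_at                # how stale the shown frame is
    print(f"refresh at {t:6.2f} ms -> frame {newest:2d}, age {age:5.2f} ms")
```

With these numbers some frames are skipped entirely and the displayed frame is anywhere from about 0 to about 6.7 ms old, so the motion steps are uneven even though every refresh shows a new-ish image.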

---------------------------------

I hope this helps. It's a big topic, and in the end I decided to go for a nice gsync monitor to get rid of all these problems instead of upgrading my hardware. Best decision since I've had my own PC!
 
Wow! This is really good of you to take the time for all this information, really appreciate it thank you.
 
I was so bothered by tearing, stuttering and input lag at some point that I just learnt everything about it :roflmao:
And afterwards it was clear to me that I'd need a gsync monitor (freesync is good but not at the same level) to be happy and not be bothered by endless tweaking anymore.
And a month after I got one, nvidia opened up support for freesync monitors...
Luckily there still isn't a good IPS 3440x1440 freesync monitor with 100 Hz native yet.
Apart from the new LG 950f, which is barely available and 350€ more than my Alienware :)
 
Yes, I do have temporal on epic, should I turn it down? Looking at your youtube video, I don't have that flickering or blurring, so maybe I'm a lucky teddy. I had gsync, and getting rid of it was one of the worst decisions, but I wanted a bigger screen so it came down to budget at the time.
 
No no, anti-aliasing should be set to the highest and to temporal.
What are your monitor specs (size, resolution etc.) and what is your "resolution scale" set to in game?
 
In-game resolution scale is 110%; the monitor is a BenQ EX3203R, 32", 2560x1440, 144hz, FreeSync 2. I just noticed that my monitor can go up to 3840x2160 in game, but I don't normally run it there. I tell you what is a bit weird: I've just been into the Nvidia cp and the only refresh rates I can get are 60hz or 144hz, and that's with a DisplayPort cable. Should I change to HDMI? I should be able to get 100 or 120hz etc.?

I've just watched a review from Sim Racing Garage and one of the comments was this:
"The G-Sync is working on this monitor... you just have to create a custom resolution with a 48-144Hz freesync range because the default is 24-144 (I think), which nvidia has problems with atm and that's why it's going black." Maybe something to look at.
 