Link to the laptop:
https://www.dell.com/...0aw51mr208
Exact specs:
17-inch FHD 360Hz 5ms; 300 nits; 100% sRGB; Tobii eye tracking
17-inch FHD 300Hz 3ms; 300 nits; 100% AdobeRGB; Tobii eye tracking
From that link I learned that a screen has both a response time (here 5 ms or 3 ms) and an input lag, which by my calculation works out as follows for the two panels:
1000 ms / 360 Hz ≈ 2.78 ms + 5 ms response time ≈ 7.78 ms
1000 ms / 300 Hz ≈ 3.33 ms + 3 ms response time ≈ 6.33 ms
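The arithmetic above can be sketched in a few lines. Note this is only a rough model under the question's own assumption (total lag ≈ one refresh interval plus the panel's response time); real input lag includes more stages in the chain.

```python
# Rough latency estimate per the question's assumption:
# one refresh interval (1000 ms / Hz) plus the panel's response time.
def latency_ms(refresh_hz, response_ms):
    frame_time = 1000 / refresh_hz  # time per refresh in milliseconds
    return frame_time + response_ms

print(round(latency_ms(360, 5), 2))  # 360 Hz, 5 ms panel -> 7.78
print(round(latency_ms(300, 3), 2))  # 300 Hz, 3 ms panel -> 6.33
```

Under this simplified model the 300 Hz / 3 ms panel does come out ahead.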
So the 300 Hz screen would be faster then, right? Or am I wrong? Or are there scenarios where it turns out differently, e.g. different games? And please don't tell me humans can't perceive that anyway given our own reaction time of around 250 ms. Any help would be really wonderful; thanks in advance for your time. Warm regards, your nerd
One more question: my thinking is that input lag in games is also scalable, since lower settings give higher frame rates, so I suspect it matters far more than the response time, which is fixed.
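That scaling intuition can be illustrated numerically: the time per frame shrinks as frame rate rises, while the panel's response time stays constant (a sketch, not a full input-lag model).

```python
# Frame time shrinks as fps rises (e.g. by lowering graphics settings),
# while the panel's response time is a fixed property of the hardware.
def frame_time_ms(fps):
    return 1000 / fps

for fps in (60, 144, 300, 360):
    print(f"{fps} fps -> {frame_time_ms(fps):.2f} ms per frame")
# 60 fps  -> 16.67 ms, 360 fps -> 2.78 ms: the scalable part of the chain.
```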
https://inputlag.com/what-is-input-lag/
Two basics first:
The response time of a panel is the time a pixel takes to change from white to black.
Input lag is the time from the graphics card's output until the image appears on the screen.
How you arrive at your calculation is unclear to me. If a panel runs at 360 Hz, it can update every pixel 360 times per second. A 5 ms screen produces a gradual transition just like a 3 ms screen does. The problem, i.e. the smearing, arises when the change is large, in the worst case from white to black; there the 3 ms panel does better. But the 360 Hz panel can display 360 distinct images per second and the 300 Hz panel only 300. Nothing about the 300 Hz panel is faster there.
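One way to see why the two numbers shouldn't simply be added: you can compare the quoted response time against the refresh interval to see how many frames a full transition spans (a sketch, assuming the quoted response time covers the full white-to-black change).

```python
# How many refresh intervals one full pixel transition spans.
# Assumption: the quoted response time is the full white-to-black change.
def frames_spanned(refresh_hz, response_ms):
    return response_ms / (1000 / refresh_hz)

print(round(frames_spanned(360, 5), 2))  # 5 ms transition at 360 Hz -> 1.8
print(round(frames_spanned(300, 3), 2))  # 3 ms transition at 300 Hz -> 0.9
```

A worst-case transition on the 360 Hz / 5 ms panel spans almost two refresh intervals, which is where visible smearing comes from, independent of how many images per second the panel can show.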