42 Comments
casperes1996 - Friday, February 14, 2020

"72% NTSC"
I threw up. Everything should cover sRGB fully at this point. I get that this monitor's primary focus is the refresh rate, and it's not for people like me who'd rather have high resolution and wide colour at 60Hz, but NTSC... And not even covering it fully... It's so depressing to look at.
A5 - Friday, February 14, 2020

...yeah. 100% NTSC or sRGB should be table stakes for any monitor over $100 at this point. For the price these gaming monitors go for, we should be talking about 90+% of DCI-P3, like in high-end TVs.
HowDoesAnyOfThisWork - Sunday, February 16, 2020

Isn't 100% Never The Same Color what those cheap-ish displays with TN and VA panels are aiming for normally? It also seems the monitor is lacking (A)RGB lighting. Given 0% RGB lighting, it would be rather impossible to get any sRGB coverage, let alone (near) 100%.
brucethemoose - Friday, February 14, 2020

Isn't "72% NTSC" marketing speak for nearly full sRGB support? NTSC is (more or less) a wider color space than sRGB. That being said, VESA has definitely polluted the term "HDR" with the HDR 400 standard. The monitor industry is once again shooting itself in the foot in exchange for some short-term marketing buzz.
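As a sanity check on that "72% NTSC" relationship, the area ratio of the two gamut triangles can be computed from their standard CIE 1931 xy primary chromaticities. A minimal Python sketch follows; note that vendors' quoted coverage percentages are not always computed in this exact chromaticity space, so treat this as a back-of-the-envelope illustration rather than the industry's formula:

```python
# Compare the sRGB and NTSC 1953 gamut triangles in CIE 1931 xy space.
def triangle_area(p1, p2, p3):
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# Red, green, blue primary chromaticities (x, y) from the standards.
srgb = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]
ntsc = [(0.67, 0.33), (0.21, 0.71), (0.14, 0.08)]

ratio = triangle_area(*srgb) / triangle_area(*ntsc)
print(f"sRGB area / NTSC area = {ratio:.0%}")  # prints 71%
```

This lands at roughly 71%, which is where the ubiquitous "72% NTSC ≈ 100% sRGB" rule of thumb comes from; a given panel's real coverage still depends on where its own primaries fall.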
krazyfrog - Friday, February 14, 2020

Yup. 72% NTSC should be around 99% sRGB, since 100% sRGB is around 78% NTSC. For a gaming-focused monitor that is sufficient.

Guspaz - Friday, February 14, 2020
DisplayHDR 400 doesn't even make sense. As bad as 500 and 600 were, they at least required wide colour gamut support. DisplayHDR 400 requires sRGB with a 1000:1 contrast ratio. Which... is just a description of a normal non-HDR monitor.

philehidiot - Friday, February 21, 2020

I have an HDR400 monitor. It was one of the "nice to haves" on my list. It has been a source of some regret after seeing the improvement when games are outputting in HDR, but also the severe limitations of the HDR400 standard. My monitors usually last the best part of a decade, but I'm even considering replacing this one early because of it. Also, this monitor sacrifices way too much on the altar of refresh rate for my liking, but it isn't aimed at people like me. 1080p, and even daring to quote the NTSC gamut, shows this is a niche area where little matters aside from refresh rate.

Mr Perfect - Friday, February 14, 2020
To be fair, it's also only 1080p. Picture quality clearly isn't the focus here.

jeremyshaw - Friday, February 14, 2020

Also, NTSC color space is a nonsense measurement that should have been abandoned years and years ago. I don't get why panel manufacturers keep referencing it.

nerd1 - Monday, February 17, 2020

Google before throwing up: 100% sRGB = 72% NTSC, usually.

PeachNCream - Friday, February 14, 2020

Not sure about love, but it isn't hideous looking from a specs or design perspective, so it has that going for it.

Messsk - Friday, February 14, 2020

This is great news for anyone who wants a monitor that no computer can keep up with and runs at a resolution that's 20 years old.

nathanddrews - Friday, February 14, 2020
For the games this is designed for, 1080p @ 280fps is no challenge: LoL, Dota 2, Overwatch, PUBG, Fortnite, CS:GO. Even modern mid-range computers can hit 300fps depending on the game/settings.

inighthawki - Friday, February 14, 2020

Has PUBG made some significant strides recently? When I last played it, that game was so horrendously optimized. High-end computers barely hit a stable 144 on all-low settings, let alone 280.

mdrejhon - Sunday, February 16, 2020
PUBG works really well with FreeSync and G-SYNC, so this monitor is perfect for PUBG. It is also good to have a VRR range bigger than your frame rate range: if your game is running 100fps-250fps, a VRR range that tops out at 280Hz covers it entirely. Also, 100fps @ 280Hz is much lower lag than 100fps at 144Hz, because each refresh cycle finishes transmitting over the cable in 1/280sec, even at just 100fps.

inighthawki - Monday, February 17, 2020
OK, that's great, but it has nothing to do with what I was replying to.

nathanddrews - Tuesday, February 18, 2020

The original PUBG is difficult to get above 150fps consistently, but lots of people run PUBG Lite or the mobile version in an Android emulator to achieve well over 200fps.

inighthawki - Tuesday, February 18, 2020

Ah, I see. I didn't realize PUBG Lite was a thing. Thanks!

dotes12 - Friday, February 14, 2020
It's actually a worse resolution than 20 years ago! This is only 16:9 1920x1080, when all my monitors from back then were 16:10 1920x1200...

Zingam - Saturday, February 15, 2020

1200 is for noobz! Pros use triple 1440p!

p1esk - Friday, February 14, 2020

Don't we have 360Hz monitors? Why is this news?

surt - Friday, February 14, 2020

Because anything over 240 is still relatively unusual. When there are a hundred different choices and 240 is table stakes on low-end monitors, this will stop being news.

twtech - Saturday, February 15, 2020
Are they IPS?

jcbenten994 - Friday, February 14, 2020
I dislike 1080...both of my 24" monitors are 1200...but I am not a gamer and I purchased well before 4K was a thing...

Zingam - Saturday, February 15, 2020

1200 is for noobz! Pros use triple 1440p!

Sahrin - Friday, February 14, 2020

No FreeSync = Worthless.

Beaver M. - Sunday, February 16, 2020

No real G-Sync = worthless.

mdrejhon - Sunday, February 16, 2020

This monitor has FreeSync. It's just called VESA Adaptive-Sync and, apparently, it is also G-SYNC certified, which means better-than-average FreeSync; they just probably didn't license AMD's FreeSync logo.
Bateluer - Friday, February 14, 2020

1080p in the 4K age? Nah.

milkywayer - Sunday, February 16, 2020

Yup, it's stupid to buy a "full HD" display in 2020. 4K should be the bare minimum.

wanderer66 - Saturday, February 15, 2020

This seems to be well into the territory of "diminishing returns".

UsernameTaken - Saturday, February 15, 2020
I honest to God can't tell any difference with my 200Hz monitor. I can't tell much of a difference with 4K TV or gaming. I can't tell much of a difference having G-Sync on or off. There are so, so many buzzwords and things, and I just wish the experiences were as awe-inspiring as the price of the upgrades.

jabber - Saturday, February 15, 2020

I watched a recent JayzTwoCents video where they struggled to tell once at 75Hz! It's like hi-fi: people have to convince themselves that spending the extra was "totally worth it". I'm out of that game personally.

UsernameTaken - Saturday, February 15, 2020

I can actually tell some difference in high fidelity... though it doesn't often come in the packages with the hi-fi label. As for the Hz thing, I used to go by Jay, and that was definitely my two cents at the time. One piece of electronics hardware that did blow my mind recently was the VR rigs they've got going... very immersive.

Yakumo.unr - Sunday, March 15, 2020

I suspect they weren't playing at a competitive level in a high-speed game running at 200fps+ then. The step up from 75 to 120Hz is VERY noticeable if you are, 120 to 144 is slightly less noticeable but you can still tell, and the same again for 240Hz if your game runs at over 144fps. In fact, a lot of people can tell the difference just by moving the mouse about in Windows and dragging windows around.

The tests on blurbusters.com can highlight issues too.

You will not tell if you are just watching something with a low frame rate, and anything that isn't actually interactive makes it harder to tell as well.
Vitor - Saturday, February 15, 2020

After 120Hz, it becomes stupid bragging about having hyper vision and reflexes. All the people complaining about anything less than 240Hz should take up boxing and earn millions with their super-fast reflexes.

Beaver M. - Sunday, February 16, 2020

All these 240+ Hz monitors are only made for one game: CS:GO. And there they are actually quite impressive.

mdrejhon - Sunday, February 16, 2020
For now, perhaps. But people were saying 4K and 8K were worthless, and now they're quickly becoming cheap. Long-term, high-Hz will be commoditized too.

The difference between 120Hz and 1000Hz is roughly as big as the difference between 60Hz and 120Hz.
For LCDs, fps=Hz motion:
60Hz = 16.7ms worth of motion blur
120Hz = 8.3ms worth of motion blur (8.3ms better than above)
1000Hz = 1ms worth of motion blur (7.3ms better than above).
GPUs will eventually gain frame rate amplification technologies (see https://www.blurbusters.com/frame-rate-amplification-tech), so that will also eventually solve the GPU-side problem.
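The blur figures in the list above are simple arithmetic: on a sample-and-hold LCD running at a frame rate matched to its refresh rate (and with no strobing backlight), persistence is 1000/Hz milliseconds. A minimal sketch of the same numbers:

```python
# On a sample-and-hold LCD, each frame stays on screen for a full
# refresh period, so eye-tracked motion smears across that interval.
def persistence_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

for hz in (60, 120, 280, 1000):
    print(f"{hz}Hz -> {persistence_ms(hz):.1f}ms of motion blur")
```

At this monitor's 280Hz that works out to about 3.6ms of persistence, versus roughly 6.9ms at 144Hz, which is the concrete gain behind the diminishing-returns debate.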
mdrejhon - Sunday, February 16, 2020

Mistyped link. The correct one is https://www.blurbusters.com/frame-rate-amplificati...

Beaver M. - Tuesday, February 18, 2020

Seeing how far real GPU development/power lags behind GPU power demand, it's very obvious that they need to solve it in such a "cheating" way. Even a 2080 Ti is still not fast enough for 4K, and it even struggles in some games at 1440p.

mdrejhon - Sunday, February 16, 2020

Even my mom and dad can see well beyond 120Hz. Google "1000Hz" and you'll see some excellent visual explanations of how the diminishing curve of returns still has benefits. 1000fps at 1000Hz looks like strobeless ULMB, for example, and there's elimination of stroboscopic effects. See https://www.blurbusters.com/1000hz-journey and https://www.blurbusters.com/stroboscopics

AgaNaber - Sunday, May 31, 2020

My English is not very good, please help me. I have a GTX 1080 and an i7-7700K, and my monitor is 60Hz Full HD. Should I buy this product?