16 Comments
dwillmore - Wednesday, August 21, 2024 - link
Or just use FreeSync? It may have made sense to use their own standard before there was an industry standard--although a strong argument could be made that they should have put effort into making an industry standard instead of just going it alone. But now that we're *many years* post-standard? That's not easily defensible.

Samus - Thursday, August 22, 2024 - link
If VESA hadn't totally phoned in the AdaptiveSync spec sheet, this wouldn't be happening. And you're right, G-Sync shouldn't be a thing, but it is, because it has the most complete feature set.

Yojimbo - Friday, August 23, 2024 - link
Nvidia doesn't care about making money on G-Sync modules or scalers, and new G-Sync monitors have worked with AMD GPUs for years now. I don't see any reason to have the monitor-side hardware other than necessity, so if Nvidia could provide the same features and reliability without it, they would.

Note that preferring to do away with the part is not the same as selling the part at low margins. There is a cost of capital, and if Nvidia is going to make the part, they are going to want to maintain a margin on it. That's why including it in MediaTek's scalers makes sense: MediaTek gets a value-added product, and Nvidia gets a way to produce it without much investment.
Silver5urfer - Saturday, August 24, 2024 - link
Intel FPGA modules are not cheap. ARM is cheaper, and Nvidia does make money on certification and this new scaler option, just like Apple. Also, FPGA-equipped G-Sync monitors do not need drivers to enable some features, while G-Sync Compatible monitors do. I do not know whether this matters on Linux, but on Windows it does: Windows 7 can use G-Sync on an FPGA-equipped monitor because it does not require WDDM 2.0 to enable adaptive sync.

That said, MediaTek is a garbage company because of their OSS policies; they never share kernel source code in the Android ecosystem and are always against any sort of development, mods, etc., unlike Qualcomm.
Yojimbo - Sunday, August 25, 2024 - link
Nvidia makes money, yes. I said that, and I said why they make money, and I explained why they would just as soon not be making that money by not selling the part to begin with. They don't have a certification program to make money on the program itself; they have a certification program to support the feature, which they use to add value to the product in order to maintain margins on the product. Saying "they make money on the certification program" is the same thing as saying that the feature actually adds value to the product. They wouldn't continue with the feature at all if it did not. The same can be said for the hardware: they will not invest in it if it's actually dragging their margins down. The point of G-Sync is to let them increase margins, not to decrease margins while selling more stuff. You haven't really replied to what I said, so I'm not sure why you put it under my post.

Skeptical123 - Friday, August 23, 2024 - link
“...although a strong argument could be made that they should have put effort into making an industry standard instead of just going it alone.” See xkcd "Standards".

Dizoja86 - Saturday, August 24, 2024 - link
FreeSync / G-Sync Compatible monitors do not do the same thing as what a dedicated G-Sync module is meant for. Having a FreeSync / G-Sync Compatible IPS monitor without a G-Sync module was a frustrating experience, as I had to manually change the overdrive modes to minimize blur or overshoot depending on the framerates my games were getting. It was always a guessing game, and the image quality would suffer whenever the framerate changed.

Silver5urfer - Wednesday, August 21, 2024 - link
Just get an OLED TV and be done with it. IPS / VA / TN monitors are just bad; unless you really need an over-300Hz refresh rate, there is no point. OLED won't have any of the garbage flaws of these premium monitors. LG's G3 and G4 range of OLEDs is very solid, and it's even better if you can get one of the C line and dump these monitors. Bonus: you don't get the stupid matte coating nonsense.

meacupla - Wednesday, August 21, 2024 - link
Those LG OLED models you mentioned still have the VRR flicker.

Silver5urfer - Saturday, August 24, 2024 - link
Unfortunately that VRR flicker happens on all OLEDs, AFAIK, but it only shows up in low-brightness scenes and causes that chrominance overshoot. I think the G4's latest panel minimizes it heavily. Not sure if a processor can fix it or if it's a design flaw in OLED itself. The G5 OLED is too many months away now. I won't touch a QD-OLED panel, though. MLA+ is the new OLED panel in the G4, so maybe MLA2 will address these issues? Time will tell.

PeachNCream - Thursday, August 22, 2024 - link
You're joking, right? TN panels are far more cost-effective than OLED, and in some parts of the world the average person could live for months on the price difference between the two.

Silver5urfer - Saturday, August 24, 2024 - link
The 300Hz monitors are not cheap either; a few years back such a panel retailed for $700, and that was a junk TN panel too. BenQ still sells overpriced garbage LCD technology.

These are not budget products either. This is Nvidia we are talking about, plus scaler hardware in the monitor.
eastcoast_pete - Sunday, August 25, 2024 - link
Really high refresh rate panels (over 240 Hz), which are often TN type, are however a special and smaller market.

For everyday use and some gaming, I would rather get a really good 32" or larger HDR IPS panel (with 144 Hz or faster, quantum dots, and miniLED FALD), if available, than an OLED. Although OLEDs have gotten a lot more resistant to burn-in, I'm still worried that the Windows home screen, displayed 10+ hours a day, would start showing up within a year or so. But that's my use case. For gaming and watching videos, OLED is - IMHO - still unbeaten.
zeromus - Friday, August 23, 2024 - link
And here all along I've been telling people who complain about games on their televisions to buy computer monitors, and to pay extra for the matte coating if that's what it takes so they don't see as much of the room reflected in the screen as the imagery itself. It's like we live in separate worlds... but how did we manage to meet here?

eastcoast_pete - Sunday, August 25, 2024 - link
Slightly OT: I still find it frustrating that few (any?) of the decent OLED TVs have DisplayPort connectivity. Actually, if you know of a good UHD OLED TV with DP-in at 1.4 or higher, please reply and add a link - thanks!

I find it especially annoying because DP is an open, royalty-free standard, while HDMI (and bearing the logo) requires a paid-for license from HDMI.org. If two good UHD OLED TVs were otherwise about the same (price, size, panel quality, etc.), I'd buy the one with a DP-in connector any day and gladly forgo the third HDMI port for it.
PeachNCream - Tuesday, August 27, 2024 - link
DP is effectively dead as a video standard. HDMI, royalties included, has emerged as the industry-accepted video interface.