G Sync Monitor, Yes Or No?
#1
Posted 25 October 2014 - 04:16 PM
From my understanding, G-Sync provides a bridge between the monitor's refresh rate and the fps your hardware puts out, so you start looking at the total picture in terms of response times or ms ratings? I may be a bit confused on that.
Anyhow, my TV is a Philips 40PFL7505D/F7. It is not ideal for gaming because of its 60Hz maximum refresh rate.
My rig:
Gigabyte GA-990FXA-UD3
AMD FX 8350
Zotac GTX 970
16 GB Corsair Vengeance 1600 RAM
Samsung Evo 500 GB SSD
WD Caviar Black 1 TB 7200
Rocketfish (lol, I know) 700w PSU
So, I was wondering what your guys' thoughts are on a monitor. I would like to give G sync a shot.
#2
Posted 25 October 2014 - 04:24 PM
#3
Posted 25 October 2014 - 04:39 PM
Granted, this is my first 144Hz monitor experience, as well as my first G-Sync experience, but even being a supposedly crappy TN panel (in comparison to IPS and PLS panels), this combo is awesome.
And G-Sync literally synchronizes the monitor's refresh rate with the rate the graphics card is outputting frames. Even at lower FPS it's still smooth because they are in unison.
I highly recommend it.
My new system:
Corsair Carbide 200R windowed case
SeaSonic G-550 power supply
Intel Core i5-4690K processor
Stock Intel cooler
ASUS Maximus VII Gene mainboard
Corsair Vengeance LP 1600 4x4GB memory
Zotac GTX 980 Amp! Omega graphics card
Crucial MX100 512GB SSD
Seagate Barracuda 3TB HDD
ASUS BW-12B1ST Blu-ray burner
Acer XB270H Abprz 27" G-Sync monitor
Razer BlackWidow Stealth 2014 keyboard
Razer Taipan mouse
Razer Goliathus Small Speed mouse mat
Plantronics GameCom 308 headset
Plays MW:O with everything maxed out, smooth as smooth can be even in the midst of heavy face-hugging combat. Just got the build done yesterday, so I haven't tried any other games yet. But I assume they'll be just as incredible.
#4
Posted 26 October 2014 - 03:40 AM
EnzyteBob82, on 25 October 2014 - 04:16 PM, said:
What G-Sync does is refresh the screen as soon as the graphics card finishes rendering a frame.
Normally screens refresh constantly at a fixed rate. If a new frame finishes halfway through a refresh, you get a tear line across the middle of the screen. Or you enable "wait for vsync", in which case the new frame is only displayed after the current refresh is done.
Anyway, it's a big price premium and there's no competition yet, but if you want something at 120/144Hz and have an Nvidia card with DisplayPort, I wouldn't get something without G-Sync. The most expensive screen is the ROG Swift, but it's also the best deal imo: the only 2560x1440 at 144Hz. While G-Sync makes things smooth at low fps, the high refresh rate still helps, as the screen refreshes every 1000/144 ≈ 6.9 ms.
Anyway, in Q1 2015 there will be a competing system that works with AMD cards, and G-Sync prices will probably drop at that point, so keep that in mind when you buy something.
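The refresh-interval arithmetic in the post above (1000/144 ≈ 6.9 ms) generalizes to any refresh rate. A quick sketch in Python, purely to illustrate the numbers:

```python
# Milliseconds between refreshes at a given refresh rate: 1000 / hz.
def refresh_interval_ms(hz):
    return 1000.0 / hz

for hz in (60, 120, 144):
    # 60 Hz -> 16.67 ms, 120 Hz -> 8.33 ms, 144 Hz -> 6.94 ms
    print(f"{hz} Hz -> {refresh_interval_ms(hz):.2f} ms per refresh")
```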
#5
Posted 26 October 2014 - 07:32 AM
Edited by Catamount, 26 October 2014 - 07:32 AM.
#7
Posted 26 October 2014 - 07:49 AM
Catamount, on 26 October 2014 - 07:32 AM, said:
Nvidia made G-Sync hardware and pushed it to market, just like they did with CUDA. AMD did the same with Mantle: GCN only. Everybody who's first does it that way. There is no extra latency with G-Sync btw; it's been measured by Blur Busters.
If you want a screen that does variable refresh on both brands, you're probably going to have to wait for a monitor manufacturer to make a screen that does both G-Sync and Adaptive-Sync/Freesync. Nvidia hardware probably isn't even compatible yet, as only AMD's newest cards are.
#9
Posted 26 October 2014 - 04:22 PM
That being said, I feel like the monitors at their current prices are waaaaaay more than I want to spend. Especially when there is only one option if you want one that can produce resolutions over 1080p.
We will see what Q1 and Q2 of 2015 hold. Hell, we might even get some new tech for the holidays! Not too hopeful for that, but these things shouldn't cost more than $450-$500 right now. We need more competition to bring the prices down.
#10
Posted 26 October 2014 - 04:56 PM
It will stay niche and die out if Nvidia does not change that.
#11
Posted 26 October 2014 - 05:21 PM
#12
Posted 27 October 2014 - 07:09 AM
Flapdrol, on 26 October 2014 - 07:49 AM, said:
If you want a screen that does variable refresh on both brands, you're probably going to have to wait for a monitor manufacturer to make a screen that does both G-Sync and Adaptive-Sync/Freesync. Nvidia hardware probably isn't even compatible yet, as only AMD's newest cards are.
CUDA and G-Sync are purposefully and entirely restricted to Nvidia products. It is illegal to use them on anything else. That's why they're exclusive technologies, not just proprietary. Mantle is not exclusive, and neither is Freesync. Yes, AMD designed Mantle to work with their GPUs (Freesync not necessarily), but they in no way legally preclude someone else taking up the ability to run these technologies in their own products. Nvidia could add Mantle or Freesync support if they so chose; AMD could not implement G-Sync or CUDA support. So no, these two approaches are not remotely comparable.
Also, Blur Busters disagrees with you about their own conclusions. CS:GO showed an enormous spike in latency in their testing at high framerates. Yes, it only shows up at very high framerates, but what's the point of a 144Hz monitor if you can't use it at 144Hz? Freesync does not have this problem, at least in theory. Also, Freesync is not Adaptive-Sync; the latter is merely a component of the mechanism of the former. And of course only AMD's newer cards support Freesync: Adaptive-Sync requires the DisplayPort 1.2a standard, and cards prior to the 7000 series don't have 1.2a ports. That doesn't mean AMD is making it exclusive, which they explicitly aren't. Saying that makes it exclusive is like claiming Assassin's Creed's good-performing graphics were AMD-exclusive because only AMD cards supported DX10.1 at the time (until Nvidia quietly twisted arms and DX10.1 mysteriously disappeared, with an excuse that didn't hold up under scrutiny; oh, Nvidia...).
If G-Sync ends up not having the expected price disadvantage vs Freesync, it might hang around. Then Nvidia will just be harming the market and consumers by purposefully forcing fragmentation, instead of hurting itself. If that isn't the case, however, then requiring a more expensive monitor for this technology will not work in Nvidia's favor, and with any luck they'll accept a more universal industry standard and give up G-Sync, and there will be much rejoicing. Since EVERY Nvidia-exclusive technology other than CUDA has failed, and that list is long, odds are G-Sync won't hang around, for the same reason prior exclusive techs didn't. One can only hope.
As for the OP: if I were in the market for a monitor right now, I wouldn't wait for Freesync's release, let alone wait to see if Nvidia implements support eventually. I'd hold my nose, get G-Sync, and swallow the increased cost.
Edited by Catamount, 27 October 2014 - 07:16 AM.
#13
Posted 27 October 2014 - 07:56 AM
Catamount, on 27 October 2014 - 07:09 AM, said:
Nah, Freesync will have this "problem" as well. The problem is the screens can't go over 144Hz, so the driver will have to limit the framerate, which means you're effectively turning vsync on if you go over that framerate. It's easily solved by running an in-game fps limiter just under the limit; the driver can only limit framerate at the end of the chain, which means you build up latency. You could go triple-buffered, but then you'd reintroduce stutter.
Anyway, Mantle is still completely closed, not that it makes a difference; Intel and Nvidia wouldn't support an API controlled by their direct competitor anyway. TrueAudio: more vendor lock-in shenanigans. Then there are ******* like Richard Huddy making the most outrageous claims. Nvidia marketing just keeps their mouths shut when it comes to AMD products, which is much better imo.
#14
Posted 27 October 2014 - 08:01 AM
And there's no evidence whatsoever that Freesync will share G-Sync's latency problems and require a frame limiter. You're just making a blind assumption. That's not why the latency hit happened anyway: Blur Busters did not exceed the monitor refresh; their high-latency problem hit even at 143fps.
Edited by Catamount, 27 October 2014 - 08:05 AM.
#15
Posted 27 October 2014 - 08:27 AM
Catamount, on 27 October 2014 - 08:01 AM, said:
Meh, when they capped at 120fps the latency was on par with vsync off; 143 is probably a bit too close. There is no latency problem.
I don't make blind assumptions, I use common sense: if you don't want tearing, you have to limit fps to under the max screen speed or drop frames (with triple buffering), which will introduce stutter. Anyway, it won't be a problem with Freesync either, as nearly all games have an optional built-in fps limiter.
Edited by Flapdrol, 27 October 2014 - 08:28 AM.
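The in-game fps limiter both posters mention can be as simple as sleeping out the rest of each frame's time budget. A minimal sketch in Python, where `render` is a stand-in for whatever per-frame work a real game engine would do:

```python
import time

def run_capped(render, target_fps=140, frames=5):
    # Cap the framerate by sleeping away whatever is left of each
    # frame's time budget after the frame work finishes. Capping just
    # under the panel's max refresh (e.g. 140 on a 144Hz screen) keeps
    # the game from outrunning the display.
    budget = 1.0 / target_fps
    for _ in range(frames):
        start = time.perf_counter()
        render()
        left = budget - (time.perf_counter() - start)
        if left > 0:
            time.sleep(left)
```

Real limiters pace more carefully (spin-waiting the last fraction of a millisecond, since `sleep` can overshoot), but the principle is the same.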
#16
Posted 27 October 2014 - 08:38 AM
Flapdrol, on 27 October 2014 - 08:27 AM, said:
Where do you get your information from, huh?
#17
Posted 27 October 2014 - 10:44 AM
In any case, a technology where, on a 144Hz monitor, you have to hard-cap your FPS at 120 (a hassle, and you're paying for framerate you don't get to use) is hardly problem-free. Between that and the projected cost difference, I think I'll go with the tech that doesn't have that issue, costs less, and doesn't pat Nvidia on the back for purposefully fragmenting the market, given the choice. I wouldn't be surprised if Freesync just supplants adaptive Vsync altogether in the next DisplayPort revision; or maybe that's just wishful thinking so I can laugh at Nvidia (while buying three GPUs from them, go figure). Or maybe, just maybe, it's common sense, and I can decree it unequivocally true.
Edited by Catamount, 27 October 2014 - 10:44 AM.
#18
Posted 27 October 2014 - 11:14 AM
It makes sense: the game renders a frame -> the screen refreshes. If the game goes over the max refresh rate, it doesn't work. Use your brain, man.
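That handshake can be put into a toy model (purely illustrative, not how a real driver works): with a fixed refresh a finished frame waits for the next tick of the refresh clock, while with variable refresh it is shown as soon as it's ready, limited only by the panel's minimum refresh interval:

```python
def display_times(frame_done_ms, refresh_hz=144, variable=True):
    # Toy model of when finished frames actually appear on screen.
    # Fixed refresh: the panel only updates on a tick of the refresh
    # clock, so a frame waits for the next tick. Variable refresh
    # (the G-Sync idea): the panel scans out as soon as the frame is
    # done, limited only by how fast it can physically refresh.
    interval = 1000.0 / refresh_hz           # minimum ms between refreshes
    shown, last = [], -interval
    for t in frame_done_ms:
        earliest = max(t, last + interval)   # panel can't refresh faster than max Hz
        if variable:
            s = earliest
        else:
            s = interval * -(-earliest // interval)  # wait for the next fixed tick (ceil)
        shown.append(s)
        last = s
    return shown
```

With frames finishing at 0 ms and 25 ms on a 144Hz panel, variable refresh shows the second frame at 25 ms, while fixed refresh makes it wait for the next ~6.9 ms tick boundary, which is where the tearing/vsync trade-off above comes from.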
#19
Posted 27 October 2014 - 02:21 PM
#20
Posted 27 October 2014 - 02:55 PM
It's not like Nvidia is suddenly going to suck bawls. My 980 will still be a good graphics card in four or five years. I won't have it by then, but it'll still be useful to some gamer.