
G-Sync Monitor, Yes Or No?


39 replies to this topic

#1 EnzyteBob82

    Member

  • The Infernal
  • 36 posts

Posted 25 October 2014 - 04:16 PM

Hi everyone, I've just recently upgraded my computer in order to play MWO at the highest settings. Now I want to do something about the old-school TV I'm playing on, and see what this G-Sync is all about.

From my understanding, G-Sync provides a bridge between the monitor's refresh rate and the FPS the computer hardware puts out, so you can start looking at the total picture in terms of response times or ms ratings? I may be a bit confused on that.

Anyhow, my TV is a Philips 40PFL7505D/F7. It's not ideal for gaming because of its 60Hz maximum refresh rate.

My rig:

Gigabyte GA-990FXA-UD3
AMD FX 8350
Zotac GTX 970
16 GB Corsair Vengeance 1600 RAM
Samsung Evo 500 GB SSD
WD Caviar Black 1 TB 7200
Rocketfish (lol, I know) 700w PSU

So, I was wondering what your thoughts are on a monitor. I would like to give G-Sync a shot.

#2 Aznpersuasion89

    Member

  • Elite Founder
  • 614 posts
  • Location: CA

Posted 25 October 2014 - 04:24 PM

I use G-Sync, but only because I use an LCD TV at 60Hz. If I don't use G-Sync, the frame tearing is too annoying.

#3 Durant Carlyle

    Member

  • Survivor
  • 3,877 posts
  • Location: Close enough to poke you with a stick.

Posted 25 October 2014 - 04:39 PM

I just got an Acer XB270H Abprz 27" G-Sync monitor yesterday, along with a Zotac GTX 980 Amp! Omega graphics card (the last parts of my new gaming build). All I can say is ... WOW!

Granted, this is my first 144Hz monitor experience, as well as my first G-Sync experience, but even being a supposedly crappy TN panel (in comparison to IPS and PLS panels), this combo is awesome.

And G-Sync literally synchronizes the monitor's refresh rate with the rate at which the graphics card is outputting frames. Even at lower FPS it's still smooth, because the two are in unison.

I highly recommend it.

My new system:

Corsair Carbide 200R windowed case
SeaSonic G-550 power supply
Intel Core i5-4690K processor
Stock Intel cooler
ASUS Maximus VII Gene mainboard
Corsair Vengeance LP 1600 4x4GB memory
Zotac GTX 980 Amp! Omega graphics card
Crucial MX100 512GB SSD
Seagate Barracuda 3TB HDD
ASUS BW-12B1ST Blu-ray burner
Acer XB270H Abprz 27" G-Sync monitor
Razer BlackWidow Stealth 2014 keyboard
Razer Taipan mouse
Razer Goliathus Small Speed mouse mat
Plantronics GameCom 308 headset

It plays MW:O with everything maxed out, smooth as smooth can be, even in the midst of heavy face-hugging combat. I just got the build done yesterday, so I haven't tried any other games yet, but I assume they'll be just as incredible.

#4 Flapdrol

    Member

  • The 1 Percent
  • 1,986 posts

Posted 26 October 2014 - 03:40 AM

EnzyteBob82, on 25 October 2014 - 04:16 PM, said:

From my understanding, G-Sync provides a bridge between the monitor's refresh rate and the FPS the computer hardware puts out, so you can start looking at the total picture in terms of response times or ms ratings? I may be a bit confused on that.

What G-Sync does is refresh the screen as soon as the graphics card is done rendering a frame.

Normally, screens refresh constantly at a fixed rate. If a new frame is finished halfway through a refresh, you get a tear line across the middle of the screen. Alternatively, you enable "wait for V-Sync", and the new frame is only displayed after the current refresh is done.

Anyway, it carries a big price premium and there's no competition yet, but if you want something at 120/144Hz and have an Nvidia card with DisplayPort, I wouldn't get anything without G-Sync. The most expensive screen is the ROG Swift, but it's also the best deal IMO, as it's the only 2560x1440 screen at 144Hz. While G-Sync makes things smooth at low FPS, the high refresh rate still helps, as the screen refreshes in 1000/144 ≈ 7ms.
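
To put rough numbers on the refresh math (a quick illustrative sketch in Python; nothing here is vendor tooling, just the arithmetic from the paragraph above, and the 6.7ms frame time is a made-up example value):

    # Refresh interval for common fixed refresh rates, in milliseconds.
    for hz in (60, 120, 144):
        print(f"{hz}Hz -> {1000 / hz:.2f} ms per refresh")

    # With V-Sync off on a fixed-rate screen, a frame that finishes
    # mid-refresh is swapped in immediately, leaving a tear line partway
    # down the panel. A frame landing ~40% of the way through a 60Hz scan
    # tears roughly 40% down the screen:
    refresh_ms = 1000 / 60
    frame_done_ms = 6.7  # hypothetical frame completion time within the scan
    print(f"tear at ~{frame_done_ms / refresh_ms:.0%} of screen height")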

Anyway, in Q1 2015 there will be a competing system that works with AMD cards, and G-Sync prices will probably drop at that point, so keep that in mind when you buy something.

#5 Catamount

    Member

  • LIEUTENANT, JUNIOR GRADE
  • 3,305 posts
  • Location: Boone, NC

Posted 26 October 2014 - 07:32 AM

There's absolutely nothing that precludes Nvidia from adding FreeSync support either, because AMD made it an open standard (what a contrast...), and while Nvidia will cling to their exclusive standard for a while, like they always do, it will most likely fail like every other one in history except CUDA. FreeSync isn't just supposed to be massively cheaper for lack of licensing fees or the expensive hardware setup (it's little more than an extension of existing DisplayPort tech); it's also supposed to lack the two-way communication overhead (read: latency) that G-Sync incurs. Given that, I can see it quickly overtaking G-Sync. At least that's what I hope, because I intend to own all Nvidia GPUs by this time next year (barring a fantastic 300 series release), and I don't want to be stuck with a G-Sync monitor that pigeonholes me into Nvidia until I replace the monitor.

Edited by Catamount, 26 October 2014 - 07:32 AM.


#6 Egomane

    Member

  • 8,163 posts

Posted 26 October 2014 - 07:44 AM

Aznpersuasion89, on 25 October 2014 - 04:24 PM, said:

I use G-Sync, but only because I use an LCD TV at 60Hz. If I don't use G-Sync, the frame tearing is too annoying.

I guess you are confusing V-Sync with G-Sync. I don't know of any TV with an Nvidia G-Sync module.

#7 Flapdrol

    Member

  • The 1 Percent
  • 1,986 posts

Posted 26 October 2014 - 07:49 AM

Catamount, on 26 October 2014 - 07:32 AM, said:

There's absolutely nothing that precludes Nvidia from adding FreeSync support either, because AMD made it an open standard (what a contrast...), and while Nvidia will cling to their exclusive standard for a while, like they always do, it will most likely fail like every other one in history except CUDA. FreeSync isn't just supposed to be massively cheaper for lack of licensing fees or the expensive hardware setup (it's little more than an extension of existing DisplayPort tech); it's also supposed to lack the two-way communication overhead (read: latency) that G-Sync incurs. Given that, I can see it quickly overtaking G-Sync. At least that's what I hope, because I intend to own all Nvidia GPUs by this time next year (barring a fantastic 300 series release), and I don't want to be stuck with a G-Sync monitor that pigeonholes me into Nvidia until I replace the monitor.

Nvidia made G-Sync hardware and pushed it to market, just like they did with CUDA. AMD did the same with Mantle, which is GCN-only. Everybody does it that way if they're first. There is no extra latency with G-Sync, by the way; it's been measured by Blur Busters.

If you want a screen that does variable refresh on both brands, you're probably going to have to wait for a monitor manufacturer to make a screen that does both G-Sync and Adaptive-Sync/FreeSync. Nvidia hardware probably isn't even compatible yet, as only AMD's newest cards are.

#8 Aznpersuasion89

    Member

  • Elite Founder
  • 614 posts
  • Location: CA

Posted 26 October 2014 - 09:18 AM

Egomane, on 26 October 2014 - 07:44 AM, said:

I guess you are confusing V-Sync with G-Sync. I don't know of any TV with an Nvidia G-Sync module.


Whoops, you are correct. Thanks :)

#9 EnzyteBob82

    Member

  • The Infernal
  • 36 posts

Posted 26 October 2014 - 04:22 PM

From what I've gathered, this technology is here to stay. It will become the norm; the reviews I've read pretty much all state that with G-Sync, the difference in most games is night and day.

That being said, I feel like the monitors at their current price are waaaaaay more than I want to spend, especially when there is only one option if you want one that can produce resolutions over 1080p.

We'll see what Q1 and Q2 of 2015 hold. Hell, we might even get some new tech for the holidays! I'm not too hopeful for that, but these things shouldn't cost more than $450-$500 right now. We need some more competition to bring the prices down.

#10 Thorqemada

    Member

  • 6,365 posts

Posted 26 October 2014 - 04:56 PM

G-Sync will not become a standard as long as it is an extra piece of hardware and increases the monitor price as much as it currently does.
It will stay niche and die out if Nvidia does not change that.

#11 badkilik

    Member

  • Bad Company
  • 53 posts
  • Location: Frozen City

Posted 26 October 2014 - 05:21 PM

As an owner of the Asus VG248QE monitor with G-Sync, I'd recommend it if you can afford it. Everyone who has negative things to say about G-Sync more than likely has yet to test the monitor out for themselves. Do you need G-Sync? No, you don't. But the same can be said about your gaming keyboard, mouse, headset, or 60+ frames per second. Yet people can justify buying these other gaming accessories but not a gaming monitor? Rest assured, though, that if you decide to buy a G-Sync monitor, your experience, not only in this game but in any other, will never be the same without G-Sync once you've experienced it for yourself. I for one will NEVER go back to gaming on a monitor without G-Sync or better.

#12 Catamount

    Member

  • LIEUTENANT, JUNIOR GRADE
  • 3,305 posts
  • Location: Boone, NC

Posted 27 October 2014 - 07:09 AM

Flapdrol, on 26 October 2014 - 07:49 AM, said:

Nvidia made G-Sync hardware and pushed it to market, just like they did with CUDA. AMD did the same with Mantle, which is GCN-only. Everybody does it that way if they're first. There is no extra latency with G-Sync, by the way; it's been measured by Blur Busters.

If you want a screen that does variable refresh on both brands, you're probably going to have to wait for a monitor manufacturer to make a screen that does both G-Sync and Adaptive-Sync/FreeSync. Nvidia hardware probably isn't even compatible yet, as only AMD's newest cards are.


CUDA and G-Sync are purposefully and entirely restricted to Nvidia products. It is illegal to use them on anything else. That's why they're exclusive technologies, not just proprietary. Mantle is not exclusive, and neither is FreeSync. Yes, AMD designed Mantle to work with their GPUs (FreeSync not necessarily), but they in no way legally preclude someone else from implementing these technologies in their own products. Nvidia could add Mantle or FreeSync support if they so chose; AMD could not implement G-Sync or CUDA support. So no, these two approaches are not remotely comparable.

Also, Blur Busters disagrees with you about their conclusions. CS:GO showed an enormous spike in latency in their testing at high framerates. Yes, it only shows up at very high framerates, but what's the point of a 144Hz monitor if you can't use it at 144Hz? FreeSync does not have this problem, at least in theory. Also, FreeSync is not Adaptive-Sync; the latter is merely a component of the mechanism of the former. And of course only AMD's newer cards support FreeSync; Adaptive-Sync requires the DisplayPort 1.2a standard, and cards prior to the 7000 series don't have 1.2a ports. That doesn't mean AMD is making it exclusive, which they explicitly aren't. Saying that makes it exclusive is like claiming Assassin's Creed's good-performing graphics were AMD-exclusive because only AMD cards supported DX10.1 at the time (until Nvidia quietly twisted arms and DX10.1 mysteriously disappeared, with an excuse that didn't hold up under scrutiny; oh, Nvidia...).

If G-Sync ends up not having the expected price disadvantage vs. FreeSync, it might hang around. Then Nvidia will just be doing harm to the market and consumers by purposefully forcing fragmentation, instead of hurting itself. If that isn't the case, however, then Nvidia requiring a more expensive monitor for this technology will not work in their favor, and with any luck they'll accept a more universal industry standard and give up G-Sync, and there will be much rejoicing. Since EVERY Nvidia-exclusive technology other than CUDA has failed, and that list is long, odds are G-Sync won't hang around, for the same reason prior exclusive techs didn't. One can only hope.


As for the OP: if I were in the market for a monitor right now, I wouldn't wait for FreeSync's release, let alone wait to see if Nvidia eventually implements support. I'd hold my nose, get G-Sync, and swallow the increased cost.

Edited by Catamount, 27 October 2014 - 07:16 AM.


#13 Flapdrol

    Member

  • The 1 Percent
  • 1,986 posts

Posted 27 October 2014 - 07:56 AM

Catamount, on 27 October 2014 - 07:09 AM, said:

Also, Blur Busters disagrees with you about their conclusions. CS:GO showed an enormous spike in latency in their testing at high framerates. Yes, it only shows up at very high framerates, but what's the point of a 144Hz monitor if you can't use it at 144Hz? FreeSync does not have this problem, at least in theory.


Nah, FreeSync will have this "problem" as well. The problem is that the screens can't go over 144Hz, so the driver has to limit the framerate, which means you're effectively turning V-Sync on if you go over that framerate. It's easily solved by running an in-game FPS limiter just under the limit; the driver can only limit framerate at the end of the chain, which means you build up latency. You could go triple-buffered, but then you'd reintroduce stutter.
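
For what it's worth, an in-game limiter is just a sleep in the frame loop. A minimal sketch in Python (illustrative only; the 141 FPS cap and the render_frame/running callables are made up for the example, not from any real engine):

    import time

    TARGET_FPS = 141                 # cap just under a 144Hz panel's ceiling
    FRAME_BUDGET = 1.0 / TARGET_FPS  # ~7.09 ms per frame

    def game_loop(render_frame, running):
        deadline = time.perf_counter()
        while running():
            render_frame()
            # Sleep off whatever is left of this frame's budget so the game
            # never outruns the display's variable-refresh window.
            deadline += FRAME_BUDGET
            remaining = deadline - time.perf_counter()
            if remaining > 0:
                time.sleep(remaining)
            else:
                deadline = time.perf_counter()  # running behind; reset, don't spiral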

Anyway, Mantle is still completely closed, not that it makes a difference; Intel and Nvidia wouldn't support an API controlled by their direct competitor anyway. "TrueAudio": more vendor lock-in shenanigans. Then there are ******* like Richard Huddy making the most outrageous claims. Nvidia marketing just keeps their mouths shut when it comes to AMD products, which is much better IMO.

#14 Catamount

    Member

  • LIEUTENANT, JUNIOR GRADE
  • 3,305 posts
  • Location: Boone, NC

Posted 27 October 2014 - 08:01 AM

Mantle is slated to become open; Nvidia, as a rule, opens nothing. DX12 obsoleted Mantle by not sucking horribly (mission accomplished, AMD?), but that's not the point. You can't compare Mantle and FreeSync to G-Sync and CUDA, because Nvidia technologies are always exclusive and always fail, except for CUDA. I know a lot of developers in various industries who are quickly getting sick of CUDA as well, largely because Nvidia GPUs suck at double precision while cheap, tempting AMD GPUs are sitting around (hence the AMD purchases during the Litecoin boom last year).

And there's no evidence whatsoever that FreeSync will share G-Sync's latency problems and require a frame limiter. You're just making a blind assumption :P That's not the cause of the latency hit anyway. Blur Busters did not exceed the monitor refresh; their high-latency problem hit even at 143Hz.

Edited by Catamount, 27 October 2014 - 08:05 AM.


#15 Flapdrol

    Member

  • The 1 Percent
  • 1,986 posts

Posted 27 October 2014 - 08:27 AM

Catamount, on 27 October 2014 - 08:01 AM, said:

And there's no evidence whatsoever that FreeSync will share G-Sync's latency problems and require a frame limiter. You're just making a blind assumption :P That's not the cause of the latency hit anyway. Blur Busters did not exceed the monitor refresh; their high-latency problem hit even at 143Hz.

Meh. When they capped at 120 FPS, the latency was on par with V-Sync off; 143 is probably a bit too close to the limit. There is no latency problem.

I don't make blind assumptions, I use common sense: if you don't want tearing, you have to limit FPS to under the screen's max speed or drop frames (with triple buffering), which will introduce stutter. Anyway, it won't be a problem with FreeSync either, as nearly all games have an optional built-in FPS limiter.

Edited by Flapdrol, 27 October 2014 - 08:28 AM.


#16 Goose

    Member

  • Civil Servant
  • 3,463 posts
  • Location: That flattop, up the well, overhead

Posted 27 October 2014 - 08:38 AM

Flapdrol, on 27 October 2014 - 08:27 AM, said:

I don't make blind assumptions, I use common sense: if you don't want tearing, you have to limit FPS to under the screen's max speed or drop frames (with triple buffering), which will introduce stutter.

Where do you get your information from, huh?

#17 Catamount

    Member

  • LIEUTENANT, JUNIOR GRADE
  • 3,305 posts
  • LocationBoone, NC

Posted 27 October 2014 - 10:44 AM

He gets his information from "common sense", duh. No, there's no evidence FreeSync will share any given behaviour with G-Sync, but it's just "common sense"! Like how I know the world is 6,000 years old; geology? Pshhh, it's just common sense. And geocentrism? Hello, you can SEE the sun rise every morning, so clearly it's just common sense that it's revolving around us. Gosh! Stupid people and their science and necessary and sufficient conditions and predictions and null hypotheses *grumble grumble*

In any case, a technology that works in 144Hz monitors where you have to hard-cap your FPS at 120 (a hassle, and you're paying for framerate you don't get to use) is hardly problem-free. Between that and the projected cost differences, I think I'll go with the tech that doesn't have that issue, costs less, and doesn't pat Nvidia on the back for purposefully fragmenting the market when given the choice. I wouldn't be surprised if FreeSync just supplants Adaptive-Sync altogether in the next DisplayPort revision, or maybe that's just wishful thinking so I can laugh at Nvidia (while buying three GPUs from them, go figure). Or maybe, just maybe, it's common sense, and I can decree it unequivocally true :)

Edited by Catamount, 27 October 2014 - 10:44 AM.


#18 Flapdrol

    Member

  • The 1 Percent
  • 1,986 posts

Posted 27 October 2014 - 11:14 AM

All the FreeSync demos have been capped below the max screen refresh rate as well.

It makes sense: game renders frame -> screen refreshes. If the game goes over the max refresh rate, it doesn't work. Use your brain, man.

#19 Catamount

    Member

  • LIEUTENANT, JUNIOR GRADE
  • 3,305 posts
  • Location: Boone, NC

Posted 27 October 2014 - 02:21 PM

Well, my brain is rather mush today thanks to two exams, but my keen bio-major sense of math (read: I suck at math) suggests that 143 < 144.

#20 Durant Carlyle

    Member

  • Survivor
  • 3,877 posts
  • Location: Close enough to poke you with a stick.

Posted 27 October 2014 - 02:55 PM

Buying a G-Sync monitor doesn't lock you to Nvidia. If you change your mind in a couple of years and go AMD, it's still a good monitor with a 144Hz refresh rate (most of them, anyway -- I've seen one 4K G-Sync monitor at 60Hz).

It's not like Nvidia is suddenly going to suck bawls. My 980 will still be a good graphics card in four or five years. I won't have it by then, but it'll still be useful to some gamer.




