
Benefit Of Using Nvidia Cards


32 replies to this topic

#21 BLOOD WOLF

    Member

  • The Jaws
  • 6,368 posts
  • Location: nowhere

Posted 27 August 2014 - 10:50 PM

About to get a GTX 750 Ti from EVGA. I heard it's a pretty strong card. I've got a medium-sized case, so I can go bigger.

#22 Zuesacoatl

    Member

  • Elite Founder
  • 614 posts
  • Location: Colorado

Posted 27 August 2014 - 11:58 PM

zortesh, on 27 August 2014 - 04:07 PM, said:

ShadowPlay is the real advantage of using Nvidia cards, if you like to record your games anyway.

AMD offers this now too, with no hit to graphics performance whatsoever.

In all honesty, you have to look at it card by card, and then at the price point you sit at. Each card and price point can be a game changer. I like anandtech.com myself, and will follow it up with Tom's (even though Tom's is very heavy into Nvidia and Intel for the kickbacks). Do not marry one brand or the other; marry the card, in the price range you can afford, that will give you the best performance in that slot.

#23 JigglyMoobs

    Member

  • Ace Of Spades
  • 1,445 posts

Posted 28 August 2014 - 12:36 AM

I seem to get lower input latency on Nvidia cards. With AMD, the pre-rendered frames setting does not seem to work correctly on my computer.
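A rough way to see why that setting matters: when you're GPU-bound, each frame the driver lets the CPU queue ahead of the GPU adds roughly one GPU frame time of input latency, since your input is baked into a frame that then waits in line. A minimal back-of-the-envelope sketch in Python; the 25 ms frame time and the simple (queue + 1) model are illustrative assumptions, not measurements:

# Rough model of how the "maximum pre-rendered frames" driver setting
# affects input latency in a GPU-bound game. Illustrative only; real
# pipelines are more complicated.

def input_latency_ms(gpu_frame_time_ms, prerendered_frames):
    # Input baked into the oldest queued frame waits for every frame
    # ahead of it, plus the frame currently being rendered.
    return (prerendered_frames + 1) * gpu_frame_time_ms

for queue in (1, 2, 3):
    # 25 ms per frame = 40 fps, a plausible GPU-bound scenario
    print(f"{queue} pre-rendered frame(s): ~{input_latency_ms(25.0, queue):.0f} ms")

With those numbers the queue setting alone swings the input lag between roughly 50 ms and 100 ms, which is well into noticeable territory.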

#24 Chrithu

    Member

  • Bad Company
  • 1,601 posts
  • Location: Germany

Posted 28 August 2014 - 01:20 AM

Hmm, in general I think that, except for PhysX, both are pretty even.

As far as this game goes, I think you're better off with Nvidia cards, not because the engine favors them, but because PGI signed with Nvidia to make this a "The Way It's Meant To Be Played" game. This means they work together with Nvidia on performance optimizations for Nvidia cards. That per se wouldn't be much of an issue. But I have read several articles lately stating that Nvidia has clauses in those cooperation contracts that explicitly prohibit the game developer that signs with them from working with AMD at the same time, until the optimizations for Nvidia cards are done. This is done to ensure that, at least for some time, a "The Way It's Meant To Be Played" game runs better on Nvidia cards. That is understandable from an economic point of view, but for me as a gamer it's just disgusting, greedy bullshit, and it tells me never to return to Nvidia, as they don't mind actively "hurting" gamers in order to advance their own cause.

That said, I have no proof whatsoever that this is the case for this game, but whenever PGI talks about optimizations they only mention SLI and not CrossFire, which strongly suggests that for the time being they aren't optimizing for AMD cards.

#25 Egomane

    Member

  • 8,163 posts

Posted 28 August 2014 - 01:30 AM

Jason Parker, on 28 August 2014 - 01:20 AM, said:

Hmm, in general I think that, except for PhysX, both are pretty even.

As far as this game goes, I think you're better off with Nvidia cards, not because the engine favors them, but because PGI signed with Nvidia to make this a "The Way It's Meant To Be Played" game. This means they work together with Nvidia on performance optimizations for Nvidia cards. That per se wouldn't be much of an issue. But I have read several articles lately stating that Nvidia has clauses in those cooperation contracts that explicitly prohibit the game developer that signs with them from working with AMD at the same time, until the optimizations for Nvidia cards are done. This is done to ensure that, at least for some time, a "The Way It's Meant To Be Played" game runs better on Nvidia cards. That is understandable from an economic point of view, but for me as a gamer it's just disgusting, greedy bullshit, and it tells me never to return to Nvidia, as they don't mind actively "hurting" gamers in order to advance their own cause.

That said, I have no proof whatsoever that this is the case for this game, but whenever PGI talks about optimizations they only mention SLI and not CrossFire, which strongly suggests that for the time being they aren't optimizing for AMD cards.

The Way It's Meant To Be Played is only a sign that Nvidia was, at one point, somehow involved in code optimization.

AMD has a similar program but does not insist on its logo being included at program startup. One example of that is Star Citizen. AMD is heavily involved in it, but you will not find their brand logo anywhere during the startup of the hangar or Arena Commander. You will probably not see it in the finished game either.

TWIMTBP is no proof or sign of favoring one card over the other. It's not a rare occurrence that an AMD card will actually run better in a game with this logo.

Like I said before: there is no real difference between the two brands performance-wise. One game might be better on AMD, another better on Nvidia. In the end it evens out. There is no way of knowing before you try it, unless a benchmark is already available somewhere. Logos mean nothing!

Edited by Egomane, 28 August 2014 - 01:49 AM.


#26 Flapdrol

    Member

  • The 1 Percent
  • 1,986 posts

Posted 28 August 2014 - 02:42 AM

In many tests the Nvidia driver handles the CPU overhead of rendering better. It's quite possible this game performs better on Nvidia cards because of this, but I haven't seen it tested for this particular game.
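A toy model of that effect, assuming the CPU side (game logic plus driver work per draw call) and the GPU run in parallel, so whichever side is slower caps the frame rate. All numbers below are invented for illustration; only the shape of the argument matters:

# Why driver CPU overhead can decide which card wins in a CPU-bound game.
def fps(draw_calls, us_per_call, game_logic_ms, gpu_frame_ms):
    cpu_ms = game_logic_ms + draw_calls * us_per_call / 1000.0
    return 1000.0 / max(cpu_ms, gpu_frame_ms)  # slower side sets the pace

# Same GPU speed (10 ms/frame), different per-draw-call driver cost:
print(f"lean driver:  {fps(4000, 2.0, 8.0, 10.0):.0f} fps")  # CPU side: 16 ms
print(f"heavy driver: {fps(4000, 4.0, 8.0, 10.0):.0f} fps")  # CPU side: 24 ms

With identical GPUs, halving the driver's per-call cost takes this hypothetical game from about 42 fps to about 62 fps, because the bottleneck is the CPU, not the graphics card.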

I'd test it, but my AMD card is too weak to get a valid result.

#27 Fire and Salt

    Member

  • 526 posts
  • Location: Florida

Posted 28 August 2014 - 06:00 AM

G-Sync is far more than a marketing angle.

We have been playing variable-frame-rate games on fixed-refresh-rate monitors since the dawn of gaming.

Should Nvidia have used an open standard? Maybe.

But video cards and monitors have literally been "doing it wrong" since the start.
Fixed refresh rates are for movies, which are recorded at fixed frame rates, not for dynamically rendered 3D graphics.



Whether it's the G-Sync brand name or an open standard that catches on in the next few years is anyone's guess, but dynamic refresh rates are the wave of the future.
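A quick sketch of the mismatch, assuming double-buffered vsync on a 60 Hz panel; the frame times are hypothetical:

# With double-buffered vsync, a finished frame is held until the next
# 16.7 ms refresh tick and the GPU waits for the flip, so an 18 ms frame
# costs a full 33.3 ms on screen. An adaptive-sync display refreshes
# whenever the frame is ready. Toy model, not a measurement.
import math

REFRESH_MS = 1000.0 / 60.0  # fixed 60 Hz refresh interval

for r in [15.0, 18.0, 15.0, 20.0, 15.0]:  # hypothetical render times (ms)
    vsync_ms = math.ceil(r / REFRESH_MS) * REFRESH_MS  # snaps up to a tick
    print(f"rendered in {r:4.1f} ms -> vsync paces it at {vsync_ms:4.1f} ms, "
          f"adaptive sync at {r:4.1f} ms")

The vsync column flips between 16.7 ms and 33.3 ms (60 fps and 30 fps pacing) even though the game never varies by more than 5 ms, which is exactly the judder a dynamic refresh rate removes.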


Too bad the monitors that support it are super expensive. (G-Sync doesn't work without one.)

#28 Egomane

    Member

  • 8,163 posts

Posted 28 August 2014 - 06:21 AM

Fire and Salt, on 28 August 2014 - 06:00 AM, said:

G-Sync is far more than a marketing angle.

It is! Adaptive Sync / FreeSync can do the same, or at least come to a very similar result, without the need for an extra $120 chip or module in the monitor.

And so far there are not many screens out there supporting either. Those that do come with a heavy price tag.

#29 Flapdrol

    Member

  • The 1 Percent
  • 1,986 posts

Posted 28 August 2014 - 06:44 AM

Egomane, on 28 August 2014 - 06:21 AM, said:

It is! Adaptive Sync / FreeSync can do the same, or at least come to a very similar result, without the need for an extra $120 chip or module in the monitor.

And so far there are not many screens out there supporting either. Those that do come with a heavy price tag.

Only the ROG Swift is available; Adaptive Sync screens haven't been announced yet.

The ROG Swift is quite expensive; so expensive, in fact, that G-Sync should be a relatively modest part of the cost. Anyway, as it's the only 2560x1440 144 Hz screen, it's hard to compare it to anything else; despite the price, it's sold out everywhere.

#30 Fire and Salt

    Member

  • 526 posts
  • Location: Florida

Posted 28 August 2014 - 06:45 AM

G-Sync will be marketing fluff once Adaptive Sync monitors actually exist.

I haven't been able to find any. Maybe I need to look again. I would buy one if I could find one for under $400, as long as it was high-res.

#31 Oderint dum Metuant

    Member

  • Ace Of Spades
  • 4,758 posts
  • Location: United Kingdom

Posted 28 August 2014 - 07:28 AM

I've owned both; both perform well in different games. That's just the way it is.

You go for whatever gets you the most for your budget.
As long as you don't run an AMD FX-series CPU, the GPU choice is fairly simple.

#32 Widowmaker1981

    Member

  • The Widow Maker
  • 5,031 posts
  • Location: At the other end of the pretty lights.

Posted 02 September 2014 - 04:33 AM

My understanding (which has been borne out by anecdotal experience) is that Intel chipsets tend to be happier bedfellows with Nvidia cards, and AMD chipsets like ATI cards better.

Could be bullcrap, but it's what I go by (hence I stick to Nvidia, because I use Intel chips).

Edited by Widowmaker1981, 02 September 2014 - 04:34 AM.


#33 Flapdrol

    Member

  • The 1 Percent
  • 1,986 posts

Posted 02 September 2014 - 06:53 AM

Widowmaker1981, on 02 September 2014 - 04:33 AM, said:

My understanding (which has been borne out by anecdotal experience) is that Intel chipsets tend to be happier bedfellows with Nvidia cards, and AMD chipsets like ATI cards better.

Could be bullcrap, but it's what I go by (hence I stick to Nvidia, because I use Intel chips).

There's no evidence for that at all; all AMD cards run better with Intel CPUs. Also, in many games an AMD CPU will perform better with an Nvidia GPU, because the driver runs more efficiently.

Like here:
http://techreport.co...battlefield-4/2




