Benefit Of Using Nvidia Cards
#21
Posted 27 August 2014 - 10:50 PM
#22
Posted 27 August 2014 - 11:58 PM
zortesh, on 27 August 2014 - 04:07 PM, said:
AMD offers this now too, with no hit to graphics performance whatsoever.
In all honesty, you have to look at it on a card-to-card basis, and then at what price point you sit. Each card and price point can be a game changer depending on where you sit. I like anandtech.com myself, and will follow it up with Tom's (even though Tom's is very heavy into Nvidia and Intel for the kickbacks). Don't marry one brand or the other; marry the card, in the price range you can afford, that will give you the best performance in that slot.
#23
Posted 28 August 2014 - 12:36 AM
#24
Posted 28 August 2014 - 01:20 AM
As far as this game goes, I think you're better off with NVidia cards, not because the engine favors them but because PGI signed with NVidia to make this a "The way it's meant to be played" game. That means they work together with NVidia on performance optimizations for NVidia cards. That per se wouldn't be much of an issue. But I've read several articles lately stating that NVidia puts clauses in those cooperation contracts that explicitly prohibit the game developer from working with AMD at the same time, until the optimizations for NVidia cards are done. This is done to ensure that, at least for some time, a "The way it's meant to be played" game runs better on NVidia cards. That is understandable from an economic point of view, but for me as a gamer it's just disgusting, greedy bullshit. It simply tells me never to return to NVidia, as they don't mind actively "hurting" gamers in order to further their own causes.
That said, I have no proof whatsoever that this is the case for this game, but whenever PGI talks about optimizations they only mention SLI and not Crossfire, which strongly suggests that for the time being they aren't optimizing for AMD cards.
#25
Posted 28 August 2014 - 01:30 AM
Jason Parker, on 28 August 2014 - 01:20 AM, said:
As far as this game goes I think you're better off with NVidia cards not because the engine would favor them but because PGI signed with NVidia to be a "The way it's meant to be played" game. [...]
"The way it's meant to be played" is only a sign that Nvidia was, at one point, somehow involved in code optimization.
AMD has a similar program but does not insist on its logo being included at program startup. One example is Star Citizen: AMD is heavily involved in it, but you will not find their brand logo anywhere during the startup of the hangar or arena commander. You will probably not see it in the finished game either.
TWIMTBP is no proof or sign of favoring one card over the other. It's not a rare occurrence that an AMD card actually runs better in a game with this logo.
Like I said before: there is no real difference between the two brands performance-wise. One game might be better on AMD, another better on Nvidia. In the end it evens out. There is no way of knowing before you try it, unless a benchmark is already available somewhere. Logos mean nothing!
Edited by Egomane, 28 August 2014 - 01:49 AM.
#26
Posted 28 August 2014 - 02:42 AM
I'd test it, but my AMD card is too weak to get a valid result.
#27
Posted 28 August 2014 - 06:00 AM
We have been playing variable fps games on fixed refresh rate monitors since the dawn of gaming.
Should Nvidia have used an open standard? Maybe.
But video cards/monitors have literally been "doing it wrong" since the start.
Fixed refresh rates are for movies which are also recorded at fixed frame rates, not dynamically rendered 3d graphics.
Whether it's the G-Sync brand name or an open standard that catches on in the next few years is anyone's guess... but dynamic refresh rates are the wave of the future.
Too bad the monitors that support it are super expensive. (G-Sync doesn't work without one.)
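The fixed-versus-adaptive refresh point above can be sketched with a toy timing model. This is a minimal illustration, not how any real display driver works; the frame times and the helper function are invented for the example:

```python
# Hypothetical sketch: when does a finished frame actually appear on screen?
# Compares a fixed 60 Hz monitor against an adaptive-refresh display for a
# game rendering at an uneven frame rate. All numbers are made up.

def display_times(frame_ready_ms, refresh_interval_ms=None):
    """Return the time (ms) each frame becomes visible.

    With a fixed refresh interval, a finished frame must wait for the next
    scanout tick; with adaptive sync (interval None), the monitor refreshes
    as soon as the frame is ready.
    """
    shown = []
    for t in frame_ready_ms:
        if refresh_interval_ms is None:           # adaptive sync: show immediately
            shown.append(t)
        else:                                     # fixed refresh: wait for next tick
            ticks = -(-t // refresh_interval_ms)  # ceiling division
            shown.append(ticks * refresh_interval_ms)
    return shown

frames = [22, 45, 70, 88, 115]                    # uneven render completion times (ms)
fixed = display_times(frames, refresh_interval_ms=1000 / 60)  # ~16.7 ms ticks
adaptive = display_times(frames)

print("fixed   :", [round(t, 1) for t in fixed])
print("adaptive:", [round(t, 1) for t in adaptive])
```

Every frame on the fixed display is delayed to the next 16.7 ms tick (and by a varying amount, which is the judder), while the adaptive display shows each frame the moment it is rendered.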
#28
Posted 28 August 2014 - 06:21 AM
Fire and Salt, on 28 August 2014 - 06:00 AM, said:
It is! Adaptive Sync / FreeSync can do the same, or at least achieve a very similar result, without the need for an extra $120 chip or module in the monitor.
And so far there are not many screens out there supporting either. Those that do come with a heavy price tag.
#29
Posted 28 August 2014 - 06:44 AM
Egomane, on 28 August 2014 - 06:21 AM, said:
And so far there are not many screens out there supporting either. Those that do come with a heavy price tag.
Only the ROG Swift is available; Adaptive Sync screens haven't even been announced yet.
The ROG Swift is quite expensive; so expensive, in fact, that G-Sync should be a relatively modest part of the cost. Anyway, as it's the only 2560x1440 144Hz screen, it's hard to compare it to anything else. Despite the price, it's sold out everywhere.
#30
Posted 28 August 2014 - 06:45 AM
I haven't been able to find any.
Maybe I need to look again. I would buy one if I could find one for under $400, as long as it was high-res.
#31
Posted 28 August 2014 - 07:28 AM
You go for where you get the most for your budget.
As long as you don't run an AMD FX-series CPU, the GPU choice is fairly simple.
#32
Posted 02 September 2014 - 04:33 AM
Could be bullcrap, but it's what I go by (hence I stick to Nvidia, because I use Intel chips).
Edited by Widowmaker1981, 02 September 2014 - 04:34 AM.
#33
Posted 02 September 2014 - 06:53 AM
Widowmaker1981, on 02 September 2014 - 04:33 AM, said:
Could be bullcrap, but it's what I go by (hence I stick to Nvidia, because I use Intel chips).
There's no evidence for that at all; AMD cards also run better with Intel CPUs. And in many games an AMD CPU will actually perform better with an Nvidia GPU, because the Nvidia driver runs more efficiently.
Like here:
http://techreport.co...battlefield-4/2