
Nvidia Gameworks Bad For Radeon?


26 replies to this topic

#21 Smokeyjedi

    Member

  • Liquid Metal
  • 1,040 posts
  • Location: Canada

Posted 31 May 2014 - 06:16 AM

Thanatos676, on 31 May 2014 - 04:51 AM, said:


This sums it up...

And then Nvidia rewrote some low-level code in their display driver, which cranked out tonnes of performance across a ton of GPUs, similar to Mantle... burning ring of fire for sure.

#22 Alreech

    Member

  • Little Helper
  • 1,649 posts

Posted 01 June 2014 - 07:08 AM

Thanatos676, on 31 May 2014 - 04:51 AM, said:


This sums it up...

No one cares about Mantle?
No one except DICE? :P

The guys who produce Battlefield and the new Star Wars Battlefront?
The Frostbite Engine used for those games will also be heavily used for other games like Need for Speed or Dragon Age.

#23 ApolloKaras

    Member

  • Ace Of Spades
  • 1,974 posts
  • Location: Seattle, Washington

Posted 01 June 2014 - 07:19 AM

The topic is broader than just being bad for AMD. It's bad for the general public. There are a few things that AMD excels at in the video card market. You don't see many coin miners going after Nvidia cards: AMD has command of the OpenCL portion and most of the open-source frameworks. As for Mantle, it's powerful. The question is whether you can get developers on board. If a few companies make a blockbuster game that really showcases Mantle, it won't be as overhyped as it is now.

However, for gaming, Nvidia has it right now, hence my GTX 670. What would be better for the general public is for AMD to put a little more R&D money into the driver department...

#24 Chrithu

    Member

  • Bad Company
  • 1,601 posts
  • Location: Germany

Posted 01 June 2014 - 07:47 AM

Alreech, on 01 June 2014 - 07:08 AM, said:

No one cares about Mantle?
No one except DICE? :P

The guys who produce Battlefield and the new Star Wars Battlefront?
The Frostbite Engine used for those games will also be heavily used for other games like Need for Speed or Dragon Age.


And these aren't the only ones. CryEngine and Unreal Engine 4 will adopt it as well, as far as I know. And a number of upcoming games using either engine have announced support for it as well.

The performance gains are factual. Any developer turning it down would be stupid. But in all seriousness, no one, not even AMD, realistically expected that games in the end phase of their production would adopt it, as that would involve serious engine rewrites. So it is no wonder that it takes some time until the first games using Mantle are released.

As far as Nvidia GameWorks and optimization bullshit in general go:

I don't care for the reasons. If a game runs shabby on my machine, I will not buy it. Period. And I sure as hell will not buy a new graphics card for every new hit game coming out. And I think I am not alone in that stance. In the end, this **** will always hurt the game publishers the most, in my view.

#25 Catamount

    Member

  • LIEUTENANT, JUNIOR GRADE
  • 3,305 posts
  • Location: Boone, NC

Posted 01 June 2014 - 09:05 AM

I find it amusing that people are still trying to trash the fact that at every price point, AMD makes the better cards (well, okay, the Litecoin boom put a kink in that, but only because they were, again, the better cards for Litecoin mining). Yes, people buy the performance they want, and the performance they want is the best they'll get for their budget, unless you have literally no budget and $6,000 on a pair of Titan Z GPUs is nothing.

Would everyone who has more than a $6,000 budget for GPUs please raise their hand? Yeah, that's what I thought. For the other 99.9999% of us (oh, us poor, poor plebs), performance per dollar is still king, within our price range. Get over it, fanboys; Nvidia will have their turn.
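To make the performance-per-dollar argument concrete, here's a minimal sketch of how you'd rank cards by value. The card names, prices, and frame rates are made-up placeholder numbers for illustration, not real 2014 benchmarks:

```python
# Rank hypothetical GPUs by performance per dollar.
# All names and numbers below are illustrative placeholders, not real data.
cards = [
    {"name": "Card A", "price_usd": 550.0, "avg_fps": 70.0},
    {"name": "Card B", "price_usd": 400.0, "avg_fps": 60.0},
    {"name": "Card C", "price_usd": 250.0, "avg_fps": 45.0},
]

def perf_per_dollar(card):
    """Average frames per second bought per dollar spent."""
    return card["avg_fps"] / card["price_usd"]

# Best value first: here the cheapest card wins despite the lowest raw fps.
ranked = sorted(cards, key=perf_per_dollar, reverse=True)
for card in ranked:
    print(f"{card['name']}: {perf_per_dollar(card):.3f} fps/$")
```

The point of the metric is exactly the one made above: unless your budget is unlimited, the raw-fastest card and the best buy are usually not the same card.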

They'll have their turn at Mantle, too, something that's seen an enormous amount of adoption considering how major an effort it is to implement and how short a time it's been out. This isn't hardware PhysX, which is just a little bit of fluff thrown onto the DirectX game that's already there, but oh hey, speaking of failures! ;)

Jason Parker, on 01 June 2014 - 07:47 AM, said:

I don't care for the reasons. If a game runs shabby on my machine, I will not buy it. Period. And I sure as hell will not buy a new graphics card for every new hit game coming out. And I think I am not alone in that stance. In the end, this **** will always hurt the game publishers the most, in my view.


This will depend on perceptions. It's possible developers who use GameWorks will be shooting themselves in the foot, because AMD users will get performance so subpar that they won't buy the games, though that would only persist in the short term, until everyone bought Nvidia GPUs. If GameWorks saw major implementation and the situation were that bad, I wouldn't stick with AMD for my next purchase, that's for sure. Dirty or not, Nvidia would have us over a barrel.

Even in the short term, it's equally possible that AMD GPUs will play the games just well enough that AMD users will buy them, because they're unique games, but will simply wish for Nvidia GPUs. That could stick and really influence the market.


I can't imagine either case avoiding an anti-trust suit.


There are, however, other ways this could play out. AMD could learn how to work with games made with GameWorks and eliminate the gap, maybe even quickly. AMD's got a pretty good driver team (both companies do).


GameWorks might also just end up seeing poor adoption, like hardware PhysX. Studios may decide it's not worth pursuing if it leaves their AMD users clamouring for answers about poor performance, or it just may not end up conferring any real advantages. AMD might start working more with these companies to offer alternative, but comparably improved, means of optimizing performance that are, as always, GPU-agnostic. We see this a fair bit. No one cared about PhysX, and the same thing is going to kill G-Sync, because a GPU-agnostic alternative is coming out.

Honestly, Nvidia's endeavors to corner the market with exclusive technologies have, with the exception of CUDA, more or less failed in every case. Either no one adopts them, or universal technologies replace them. Even CUDA's days are probably numbered. CUDA took hold in an age when Nvidia GPUs were the only ones that did GPGPU well, because they bought Ageia. Now that situation is reversed. AMD GPUs have consistently beaten the hell out of Nvidia GPUs in GPGPU, except for exorbitantly expensive Nvidia GPUs that offer effectively no more performance but cost twice as much (it shows how far ahead AMD is that Nvidia has to strip any notable GPGPU capabilities out altogether just to field a card of remotely comparable gaming value). The only thing likely allowing CUDA to hold on is saturation; it's riding on past success.


So yeah, GameWorks is eye-roll worthy, but a threat to the GPU market, the gaming industry, or consumers? Don't give Nvidia's business practices too much credit. They've been at these games forever; they've never worked, save one exception.

#26 Garou Wolfs Haven

    Member

  • Survivor
  • 23 posts
  • Location: North Central Indiana

Posted 02 June 2014 - 11:23 AM

For users and programmers, having compatibility standards is essential. Competition keeps the market affordable for users. As for developers, they couldn't care less about competition, other than that it forces them to advance faster, gaining them a few more sales; any of them would be quite content to have the whole market and force us to buy at their own pace.

#27 B0oN

    Member

  • 2,870 posts

Posted 02 June 2014 - 12:04 PM

Even though I'm a poor pauper (literally) right now, I would not fall for the "more punch per dollar" argument, which has run big-time in AMD's favor for quite some time now. That is down to the simple fact that Nvidia has tidier images (read: better image quality) overall, which shows very well on 120/144 Hz displays, and to the fact that even in some newer games such massive framerates can be achieved (be it through pure single-card power or SLI/CrossFire) for a desirable amount of time. Funnily, it's mostly Nvidia cards excelling in raw power delivery, which endears them very much to my ever so PC-performance-hungry heart.
Hell, if ATI were suddenly on par (or maybe even better?) in overall image quality, I'd totally try them again, not a biggie.

More or less, the balance between AMD/ATI and Nvidia for GPUs can be equated to the one between Intel and AMD for CPUs.
Intel/Nvidia give more raw performance, but it's going to cost you, whereas AMD/ATI are for the "power per price" people.
Nothing bad so far.
But...
This balance is more lopsided today than it was a decade ago, when the power/price balance between those blocks was very even, which made for one very, very important thing: competition, which furthermore brought forward the all-important innovation.

So: yes, it is good that AMD buckled down and developed Mantle and, most importantly, kept it open source, whereas Nvidia did pretty much nothing in this regard (and trust me, both will need this sort of tech to stay competitive, because you can't go on forever with die shrinks and architectures alone), except for announcing/introducing DX12 (with all the other partners), which could be a genuine timing coincidence (the two were announced pretty close to each other, time-wise) or just a "something-needed-to-be-put-out-to-show-we-are-proactive" move while trying to figure out an answer to Mantle.

Hopefully this means all the companies involved in the above will push harder than they did these last few years, because things were stagnating quite massively in the PC sector, simply because consoles were selling so much better and sat on a fixed hardware base, one that didn't change all the time and didn't come in hundreds of thousands of hardware variants, which made it easier to program for. Honestly said, that is a state not befitting the "glorious master race called PC", which should totally be on the leading edge of R&D labs around the world, just by virtue of having so many more usability possibilities than a console has.

Well, 2 cents and that ...




