
About min specs again


21 replies to this topic

#1 Click

    Member

  • PipPipPipPipPip
  • Shredder
  • 102 posts
  • Location: Portugal

Posted 23 June 2012 - 04:03 PM

OK, so this has been bugging me ever since the minimum specs were announced, and I haven't seen anyone asking *these specific questions*, so bear with me if it's a repost.

First of all, you can check my rig in my signature. Yes, it's almost 5 years old. No, I do NOT have the cash to upgrade, nor will I in the near future. Unfortunately.

Now, here's what's bugging me:

The MWO graphics card minimum spec will be either an NV 8800GT or an ATI 57**, right? OK, but riddle me this: the direct equivalent/rival of the 8800GT was the 3870, while the 5700 series was released nearly two years after the G92-based 8800GT and is rated 80% or so faster.
So why the discrepancy between GPU brands? Does this game have a problem with ATI cards? I own a 3870, and the thought that I might not be able to play this while someone with the other brand's direct equivalent (a 2-3% performance difference) can is... ridiculous, to put it mildly. Especially since I played Crysis 2 on high/very high settings.

What do you guys think?

#2 Phasics

    Member

  • PipPipPipPipPipPip
  • 273 posts

Posted 23 June 2012 - 04:08 PM

As you've said, you have no cash for new gear, and since it's F2P, just download it and see how it fares on your rig. If you can play it, great; if not, you've only wasted a few GB of your monthly download cap and some time. I think you're good either way.

#3 OpCentar

    Member

  • PipPipPipPipPipPipPip
  • 547 posts

Posted 23 June 2012 - 04:17 PM

@Click

It's the game engine - Crytek favours Nvidia over ATI.

But if you played Crysis 2 then you will have no problems with MWO.

#4 Fear Radick

    Member

  • PipPipPipPipPipPip
  • 238 posts

Posted 23 June 2012 - 04:19 PM

Click, on 23 June 2012 - 04:03 PM, said:

OK, so this has been bugging me ever since the minimum specs were announced, and I haven't seen anyone asking *these specific questions*, so bear with me if it's a repost.

First of all, you can check my rig in my signature. Yes, it's almost 5 years old. No, I do NOT have the cash to upgrade, nor will I in the near future. Unfortunately.

Now, here's what's bugging me:

The MWO graphics card minimum spec will be either an NV 8800GT or an ATI 57**, right? OK, but riddle me this: the direct equivalent/rival of the 8800GT was the 3870, while the 5700 series was released nearly two years after the G92-based 8800GT and is rated 80% or so faster.
So why the discrepancy between GPU brands? Does this game have a problem with ATI cards? I own a 3870, and the thought that I might not be able to play this while someone with the other brand's direct equivalent (a 2-3% performance difference) can is... ridiculous, to put it mildly. Especially since I played Crysis 2 on high/very high settings.

What do you guys think?


It could come down to firmware compatibility of those particular cards, and not so much their actual power.

#5 Orion Pirate

    Member

  • PipPipPipPipPipPip
  • 249 posts
  • Location: Norfolk, Virginia

Posted 23 June 2012 - 04:26 PM

Click, on 23 June 2012 - 04:03 PM, said:


What do you guys think?


I think the 8800GT was a very advanced card for its time and has aged better than other cards... But I don't have any facts on this, just what I think. :D

Edited by Orion Pirate, 23 June 2012 - 04:26 PM.


#6 Click

    Member

  • PipPipPipPipPip
  • Shredder
  • 102 posts
  • Location: Portugal

Posted 23 June 2012 - 04:58 PM

Thanks for all the input. I know Crytek's game engines have always run better on Nvidia cards, but nothing to the extent of warranting such a difference in minimum requirements (again, it's an 80% gap), at least to my knowledge of the area.

Radick, both cards support exactly the same features except for Nvidia's proprietary software (CUDA and PhysX), and AFAIK MWO doesn't make use of those, so what do you mean by firmware compatibility?

Orion Pirate, things don't work like that in the hardware market. Graphics cards, like any piece of hardware, are targeted at a certain performance/price segment (so as not to cannibalize upcoming generations), and the scores they get are the scores they get, plus or minus a few percent from driver updates. The G92 was a good chip, so good that Nvidia used it in a couple of rebrands, but nothing my card couldn't reach through overclocking, and definitely nothing in any way comparable to a 5770... so again, why?

I was hoping a mod could shed some light on this. I will download the game nonetheless; I think I can run it. But if ATI owners are at such a disadvantage here, the developers should just come out and state it clearly.

#7 Saevus

    Member

  • PipPipPipPipPipPip
  • 280 posts
  • Location: Right side of Upside down

Posted 23 June 2012 - 05:05 PM

I have used ATI since the debacle that was my experience with the 8800 series (they kept dying on me for no reason). That, and ATI used less power for a long time; I've been told that issue is largely a thing of the past in the last couple of series, but I digress. I'm going with an earlier post on this one: I think it's a firmware compatibility thing.

It's interesting you mention this; we've been discussing it at work. I'm at the rebuild point in my computer cycle, and it looks like my next rig will be about $750. The big sticking point for me is: do I go back to Nvidia because of this game, or press on with ATI since it's been so good to me over the last 4 or 5 years? As to your dilemma, I think you will be OK, but it would be interesting to get an official answer; I thought those two cards were oddly matched as well.

#8 Click

    Member

  • PipPipPipPipPip
  • Shredder
  • 102 posts
  • Location: Portugal

Posted 24 June 2012 - 02:21 PM

Hey, I re-read the devs' post about the min specs and saw that 5600 cards are mentioned too, which reminded me of something...

Maybe this "discrepancy" in the min specs between ATI and NV is because ATI dropped driver support for cards older than the 5000 series. Older cards with enough "firepower" will most likely still work, but maybe the devs can't officially commit to that because of ATI's decision? It's the only conclusion I can come to.

I'd really like confirmation from the devs on this. Some of us still own these cards and it's not nice to leave us hanging without a simple 'why'..

#9 Giverous

    Member

  • PipPipPipPipPipPip
  • 291 posts
  • Location: Brighton, UK

Posted 24 June 2012 - 02:39 PM

In the early Nvidia-ATI wars, Nvidia concentrated on gaming performance while ATI concentrated on all-round and video playback performance. While the difference has narrowed considerably since then, there were serious discrepancies where cards with the "same" specs performed very differently.

If memory serves, a lot of it has to do with how the cards are put together - where the memory is situated, how it interfaces with the processor, and so on - along with drivers. In the era you're talking about, it's not a stretch to believe that ATI was one or two generations behind Nvidia in terms of 3D performance.

I'll look for some benchmarks between the two.

#10 Click

    Member

  • PipPipPipPipPip
  • Shredder
  • 102 posts
  • Location: Portugal

Posted 24 June 2012 - 03:03 PM

Yeah, do that; you will see that the difference between an 8800GT and a 3870 is very small. I don't want to turn this topic into ATI vs NV, but I didn't buy this card out of the blue back then; I researched and informed myself to the best of my ability, as anyone tech-savvy would.

I knew all the trade-offs and trade-ons... it had a little less performance, but it was made on a smaller process (55nm vs 65nm for the 8000 series), so it drew less power, ran cooler, was more reliable (NV had solder problems at the time that caused what Saevus mentioned) and had bigger OC potential thanks to using GDDR4 instead of GDDR3.

And anyway, even if my card weren't enough, there's the twice-as-powerful 4870, of which the officially supported 5770 is a mere die shrink with an overclock, yet the 4870 *still isn't* supported...

That's why I don't think lack of performance is the answer to my question, and that's why it would be so important for the devs to say something here... the explanations being suggested I've already double-checked, and they don't add up.

Edited by Click, 24 June 2012 - 03:05 PM.


#11 silentD11

    Member

  • PipPipPipPipPipPipPip
  • Legendary Founder
  • 816 posts
  • Location: Washington DC

Posted 24 June 2012 - 04:13 PM

The Nvidia 8800/9800 (same GPU once the 8800GT hit) was way ahead of its time; it was far faster than the ATI 2900 series or the ATI 38x0 cards. Plus, the engine this game runs on (Crytek's) favors Nvidia cards. The selling point of the ATI 38x0 and even 48x0 was that they were cheaper, ran cooler on less power, and were less prone to failure than Nvidia through the DX10 era... but Nvidia was faster all through DX10. Also, ATI's support for those cards ran out before Nvidia dropped support for theirs.

This doesn't mean the cards won't run the game, though. You can often run games on hardware that isn't officially supported provided the performance is there; the drivers run the game, not the card. The minimum spec is just what they tested the game with and what they can firmly say it works on.

Also keep in mind that the AMD 5xxx series has been out for a LONG TIME. Nvidia was stuck on DX10 forever due to the fiasco that was Fermi, aka the 4x0 series cards. So it makes sense that the test cards for AMD would be a later series, as AMD released that generation first.

Another factor to keep in mind is that the Nvidia 8800GT/9800GT/GTS 250 and even one of the 4x0 GTS cards all used the same GPU! So that GPU has been floating around forever, just renamed into new Nvidia series.

#12 Elkarlo

    Member

  • PipPipPipPipPipPipPip
  • 911 posts
  • Location: Germany

Posted 24 June 2012 - 04:43 PM

I had the opportunity to benchmark two Gainward GS cards on the same system (Asus P5Q-Pro board, Core 2 Duo E8500):
a 9800GT GS and a 4850 GS (the only ATI card Gainward made).
Both had the same cooling system.
As they were GS samples, Gainward cheated a little: in reality the 9800GT was a full-spec card with all shaders enabled, essentially a 9800 GTX clocked at 818 MHz, out-speccing ALL OC models of the G92b.
The ATI 4850 was a full-spec 4870.
Gaming benchmark: I had to push the 9800 GTX to its limit just to match the ATI card in its worst test; normally the 4870 was 10-20% faster. (It was Crysis 1.)

A friend of mine had a 3870; it had roughly the same performance as a non-overclocked 9800 GTX, but on a quad-core system.

The Nvidia G92b card only supports DirectX 10.
The ATI cards support DirectX 10.1, which means a few more features; it sits somewhere between DirectX 10 and 11.

So the 3870 should work fine for MWO, BUT Crysis tends to favour Nvidia cards.

Biggest issue: those older ATI cards are out of support, so there are only bug fixes for them and no direct support; if you have issues with MWO, there is only a slight chance there will be a bug fix.

And the reason the G92b is still supported?
Because Nvidia still sells the chip with added DirectX support
(it's then called GF108, but it's basically a G92c at 40nm; that's all the 430 and 440 models).

Edited by Elkarlo, 24 June 2012 - 04:45 PM.


#13 silentD11

    Member

  • PipPipPipPipPipPipPip
  • Legendary Founder
  • 816 posts
  • Location: Washington DC

Posted 24 June 2012 - 05:11 PM

The ATI 4800 cards went up against the Nvidia GTX 260 and 285 at the time; the Nvidia cards were faster but ran hotter.

#14 Vashts1985

    Member

  • PipPipPipPipPipPipPipPip
  • 1,115 posts

Posted 24 June 2012 - 05:14 PM

My 8800GTX is sitting above my computer desk in a plastic stand.

It still worked as of when I wrenched it out of my computer to add my 570s.

Edited by Vashts1985, 24 June 2012 - 05:14 PM.


#15 silentD11

    Member

  • PipPipPipPipPipPipPip
  • Legendary Founder
  • 816 posts
  • Location: Washington DC

Posted 24 June 2012 - 05:16 PM

Vashts1985, on 24 June 2012 - 05:14 PM, said:

My 8800GTX is sitting above my computer desk in a plastic stand.

It still worked as of when I wrenched it out of my computer to add my 570s.


I had three 8800GTX cards and had to bake them in the oven a couple of times to reflow them. The VRAM cooling on the 8800GT/9800GT was pretty damn horrid and known for cooking itself...

I liked the cards, but oh lord oh lord.

#16 Vashts1985

    Member

  • PipPipPipPipPipPipPipPip
  • 1,115 posts

Posted 24 June 2012 - 05:21 PM

silentD11, on 24 June 2012 - 05:16 PM, said:


I had three 8800GTX cards and had to bake them in the oven a couple of times to reflow them. The VRAM cooling on the 8800GT/9800GT was pretty damn horrid and known for cooking itself...

I liked the cards, but oh lord oh lord.


I've heard from various sources that some of them shipped with memory problems, though it was hit or miss.

I must have lucked out; mine did overheat a few times before I upgraded, but it kept on chugging along.

Edited by Vashts1985, 24 June 2012 - 05:22 PM.


#17 jrock

    Member

  • PipPipPip
  • 81 posts
  • Location: Chicago

Posted 25 June 2012 - 04:25 AM

Click, on 23 June 2012 - 04:03 PM, said:

OK, so this has been bugging me ever since the minimum specs were announced, and I haven't seen anyone asking *these specific questions*, so bear with me if it's a repost.

First of all, you can check my rig in my signature. Yes, it's almost 5 years old. No, I do NOT have the cash to upgrade, nor will I in the near future. Unfortunately.

Now, here's what's bugging me:

The MWO graphics card minimum spec will be either an NV 8800GT or an ATI 57**, right? OK, but riddle me this: the direct equivalent/rival of the 8800GT was the 3870, while the 5700 series was released nearly two years after the G92-based 8800GT and is rated 80% or so faster.
So why the discrepancy between GPU brands? Does this game have a problem with ATI cards? I own a 3870, and the thought that I might not be able to play this while someone with the other brand's direct equivalent (a 2-3% performance difference) can is... ridiculous, to put it mildly. Especially since I played Crysis 2 on high/very high settings.

What do you guys think?


I made up a doc of all the CPUs and GPUs.

On the GPU page I filtered by GFLOPS to give a good idea of what's what.

From what I've heard and read, the basic requirement for any kind of quality experience is a quad-core CPU, 4-8 GB of RAM, and one of the GPUs on my list (basically 500-600 GFLOPS or higher). You can look up your card on Wikipedia to get its GFLOPS rating.

https://docs.google....A2enFxcGc#gid=0
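As a rough illustration of the cutoff described above, here is a minimal sketch in Python. The ~500 GFLOPS threshold is just the low end of the suggested range, and the card figures are only examples (the 3870 and 8800GT numbers are the stock figures quoted in the next post); look up your own card as suggested.

    # Minimal sketch of a GFLOPS cutoff check, assuming a ~500 GFLOPS threshold
    # (the low end of the "500-600 GFLOPS or higher" guideline above).
    # The card figures are examples only; look yours up on Wikipedia.

    MIN_GFLOPS = 500

    cards = {
        "Radeon HD 3870": 496,    # stock figure quoted in the next post
        "GeForce 8800 GT": 336,   # stock figure quoted in the next post
        "Radeon HD 5770": 1360,   # illustrative theoretical peak
    }

    for name, gflops in sorted(cards.items(), key=lambda kv: kv[1], reverse=True):
        verdict = "meets" if gflops >= MIN_GFLOPS else "falls below"
        print(f"{name}: {gflops} GFLOPS - {verdict} the ~{MIN_GFLOPS} GFLOPS guideline")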

#18 Click

    Member

  • PipPipPipPipPip
  • Shredder
  • 102 posts
  • Location: Portugal

Posted 25 June 2012 - 01:46 PM

Wow, that must have been a lot of work compiling all that info... but I don't think it'll help much =\

To be honest, GFLOPS are a terrible way to measure a card's performance; they don't take memory bandwidth or texture fillrate into account. As an example, the 3870 and 8800GT dish out 496 and 336 GFLOPS respectively at stock, yet the 8800GT is about 3% faster on average. These figures were doubled in the following generation.

If you really want to gauge a card's performance, you have to use a real benchmark, like 3DMark06. There, both cards score around 10k.
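For reference, those 496 and 336 figures follow from the usual shaders x shader clock x 2 FLOPs-per-cycle rule of thumb. The minimal sketch below, assuming the cards' stock reference clocks and shader counts, reproduces them while capturing none of the bandwidth or fillrate differences just mentioned.

    # Minimal sketch of where the 496/336 GFLOPS figures come from, assuming
    # the common "shaders x shader clock x 2 FLOPs per cycle" rule of thumb
    # and stock reference specs. Peak GFLOPS ignores memory bandwidth and
    # texture fillrate, which is why it mis-ranks these two cards.

    def theoretical_gflops(shaders: int, shader_clock_mhz: float) -> float:
        """Peak single-precision GFLOPS = shaders * clock in GHz * 2 ops/cycle."""
        return shaders * (shader_clock_mhz / 1000.0) * 2

    print(theoretical_gflops(320, 775))   # Radeon HD 3870  -> 496.0
    print(theoretical_gflops(112, 1500))  # GeForce 8800 GT -> 336.0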

Again, please don't turn this topic into a brand war. I just wanted an answer as to why my card isn't officially supported when one so similar is, and whether it's because the game has problems with ATI cards. But this topic has been around for a while and there's still no official answer... so never mind, I've already reached my own conclusions anyway.

#19 Grimarch

    Member

  • PipPipPipPipPip
  • Overlord
  • 151 posts
  • Location: Guildford

Posted 25 June 2012 - 01:50 PM

I assume an NVIDIA GeForce GTX 550 Ti will be fine?

#20 Gorith

    Member

  • PipPipPipPipPipPip
  • 476 posts

Posted 25 June 2012 - 02:17 PM

Someone correct me if I'm wrong about this, but isn't CryEngine 3 highly optimized for Nvidia but not really for ATI?




