
CryEngine 3 Hardware Requirements


54 replies to this topic

#1 Bad Karma 308

    Member

  • Legendary Founder
  • 411 posts

Posted 03 December 2012 - 12:22 PM

Looks like Crytek just posted their CryEngine 3 hardware requirements for Crysis 3.

As we are on the same engine, I wonder how close our final specs will be to theirs......


From MaximumPC: http://www.maximumpc...run_crysis_3134

Quote: "Brace yourself, this might hurt."

The minimum system requirements aren't too obscene, though they're going to leave some entry-level and older rigs on the sidelines. Here's how it shakes out:
  • Windows Vista, Windows 7, or Windows 8
  • DirectX 11 graphics card with 1GB video RAM
  • Dual-core CPU
  • 2GB memory (3GB on Vista, because apparently it's a pig of an OS)
  • Example 1 (Nvidia/Intel)

    - Nvidia GTS 450
    - Intel Core 2 Duo 2.4GHz (E6600)
  • Example 2 (AMD)

    - AMD Radeon HD 5770
    - AMD Athlon 64 X2 2.7GHz (5200+)
Is your system still in the running? Yes? Good, now let's jump up to the Recommended system requirements:
  • Windows Vista, Windows 7, or Windows 8
  • DirectX 11 graphics card with 1GB video RAM
  • Dual-core CPU
  • 4GB memory
  • Example 1 (Nvidia/Intel)

    - Nvidia GTX 560
    - Intel Core i5 750
  • Example 2 (AMD)

    - AMD Radeon HD 5870
    - AMD Athlon64 II X4 805
Congratulations if you're still able to play Crysis 3, you have a very good system. But is it a great setup? You'll need a fairly burly PC to pull off the High Performance specifications, which look like this:
  • Windows Vista, Windows 7, or Windows 8
  • DirectX 11 graphics card with 1GB video RAM
  • Latest quad-core CPU
  • SLI / CrossFire configuration will run even better
  • 8GB memory
  • Example 1 (Nvidia/Intel)

    - Nvidia GTX 680
    - Intel Core i7 2600k
  • Example 2 (AMD)

    - AMD Radeon HD 7970
    - AMD Bulldozer FX 4150
If you want to turn on all the eye candy, you're going to need a fast system that's particularly well equipped in the GPU department. Do you have a system that can manage it, and if not, do you plan on upgrading?

Edited by Bad Karma 308, 03 December 2012 - 12:25 PM.


#2 Egomane

    Member

  • 8,163 posts

Posted 03 December 2012 - 12:32 PM

Please note that while the minimum specs are quite low, they still require a DirectX 11 capable graphics card.
Everything below that will most likely not work for C3, no matter how powerful it might be.

I have seen a post in the C3 forums suggesting that, with everything turned up to max settings, even the high performance specs with a GTX 680 or HD 7970 are probably not enough for a carefree gaming experience.

#3 Bad Karma 308

    Member

  • Legendary Founder
  • 411 posts

Posted 03 December 2012 - 12:46 PM

I read an article a while back, I think from Anand or Tom's, saying that in discussion the Crytek developers were shooting for the same level of hardware-pushing complexity from CryEngine 3 that the original CryEngine had. They want to push the limits of hardware design, unlike the boondoggle that was Crysis 2. "But can it run Crysis?" may be coming back to haunt us.



If I can dig up the interview I'll post the link.

Edited by Bad Karma 308, 03 December 2012 - 12:50 PM.


#4 Vulpesveritas

    Member

  • 3,003 posts
  • Location: Wisconsin, USA

Posted 03 December 2012 - 12:50 PM

Bad Karma 308, on 03 December 2012 - 12:46 PM, said:

I read an article a while back, I think from Anand or Tom's, saying that in discussion the Crytek developers were shooting for the same level of hardware-pushing complexity from CryEngine 3 that the original CryEngine had. They want to push the limits of hardware design, unlike the boondoggle that was Crysis 2. "But can it run Crysis?" may be coming back to haunt us.



If I can dig up the interview I'll post the link.

I bet I know how they're doing it.
The answer: Tessellate EVERYTHING! TOP-SECRET TESSELLATED TOAD TECH!

Nah, if they end up just doing that, it will make me facepalm, but so far it's looking like they're stepping up on everything from textures to physics.

#5 Bad Karma 308

    Member

  • Legendary Founder
  • 411 posts

Posted 03 December 2012 - 01:21 PM

Did they get the FPS throttling system right this time? The faulty throttle system in the original CryEngine, when set to max (or just about any setting level), was notorious for trying to push near-unlimited I/O while simultaneously attempting full map renders. Just about every card at the time would choke, even the top tier.

I have several thousand K20Xs we're configuring and evaluating at work for a mid-tier supercomputer. I'm curious whether I could force the GK110s to accept DirectX 11(.1) and see what upper limits the engine is capable of.

Edited by Bad Karma 308, 03 December 2012 - 01:23 PM.


#6 fxrsniper

    Member

  • 234 posts
  • Location: East Coast

Posted 03 December 2012 - 03:29 PM

Windows Vista, Windows 7, or Windows 8
DirectX 11 graphics card with 1GB video RAM
Latest quad-core CPU
SLI / CrossFire configuration will run even better
8GB memory
Example 1 (Nvidia/Intel)

- Nvidia GTX 680
- Intel Core i7 2600k
Example 2 (AMD)

- AMD Radeon HD 7970
- AMD Bulldozer FX 4150
I disagree with most of this. You don't need all that to run this game. Any higher-end quad core (excluding the AMD FX series), such as any of the Phenom II Black quad cores from the 965 up (especially the 980 Black), with any GTX 400 series or higher, can run this game on max settings.
An i5 2500K or equivalent, with a GPU in the same range, can run this game.

#7 fxrsniper

    Member

  • 234 posts
  • Location: East Coast

Posted 03 December 2012 - 03:43 PM

Bad Karma 308, on 03 December 2012 - 01:21 PM, said:

Did they get the FPS throttling system right this time? The faulty throttle system in the original CryEngine, when set to max (or just about any setting level), was notorious for trying to push near-unlimited I/O while simultaneously attempting full map renders. Just about every card at the time would choke, even the top tier.

I have several thousand K20Xs we're configuring and evaluating at work for a mid-tier supercomputer. I'm curious whether I could force the GK110s to accept DirectX 11(.1) and see what upper limits the engine is capable of.

Don't know, but GK110s are not out yet. You could try it on the GK104/GK106/GK107 parts, which would be the GTX 650 (GK107), the GTX 660 Ti/670/680 (GK104), and the GTX 650 Ti (GK106).

Edited by fxrsniper, 03 December 2012 - 05:22 PM.


#8 Sir Roland MXIII

    Member

  • The Spear
  • 1,152 posts
  • Location: Idaho

Posted 03 December 2012 - 06:39 PM

Hi-frikkin-LARIOUS. My all-in-one A10 5800K Trinity APU is above the RECOMMENDED specs. I just about fell out of my chair laughing. It's difficult for people to hit recommended specs on CryE3? HOW? Seriously, HOW? I mean, my bloody cheap-as-hell APU / MB and two 4GB sticks of G.Skill Ripjaws X Series DDR3-1866 were only a little above 300 USD.

Which makes hitting recommended specs on CryE3 absolutely, mind-blowingly, INSANELY affordable. Brace myself? HA! They wish. This is serious budget computing <points at computer>, and this is serious budget computing beating rec' specs <points at OP>. LMFAO.

Bad Karma 308, on 03 December 2012 - 12:22 PM, said:

If you want to turn on all the eye candy, you're going to need a fast system that's particularly well equipped in the GPU department. Do you have a system that can manage it, and if not, do you plan on upgrading?


As for your question, if I am under top specs do I plan to upgrade: yes, I do. According to the rumor mill, the next gen of APUs will be out at some point next year, either as "Trinity 2.0" Richland in H1 or Kaveri in H2. Either by then, or at that time, I will upgrade not only my APU but also purchase a separate GPU. Hopefully something in the 7800 or 7900 series will be able to Crossfire with that APU, but I'm not expecting such. We'll see.

Of course, one would assume that PGI will allow us to use SLI / Crossfire by then, but... I'm increasingly uncertain whether they could find their keisters with two hands and a MAP. It'd be nice, but atm I find myself expecting an announcement in January telling the Clans they need to push back their invasion a couple of months so PGI has time to dig themselves out of the hole they're putting themselves in.

Edited by Sir Roland MXIII, 03 December 2012 - 06:47 PM.


#9 Bad Karma 308

    Member

  • Legendary Founder
  • 411 posts

Posted 03 December 2012 - 08:56 PM

fxrsniper, on 03 December 2012 - 03:43 PM, said:

Don't know, but GK110s are not out yet.


My company signed just over 6,000 (K20X derivative) GK110s into our inventory a few months back as part of an ongoing DoD development cycle. We've been running test and evaluation for our distributed simulation partners ever since. Some portions of the NDAs dropped off just a few weeks ago. And yeah, from what I can say, they are beasts.


Sir Roland MXIII, on 03 December 2012 - 06:39 PM, said:


Of course, one would assume that PGI will allow us to use SLI / Crossfire by then, but...


I too am impatiently waiting for SLI to kick in.

#10 Barbaric Soul

    Member

  • 887 posts

Posted 04 December 2012 - 03:20 AM

Bad Karma 308, on 03 December 2012 - 08:56 PM, said:


My company signed just over 6,000 (K20X derivative) GK110s into our inventory a few months back as part of an ongoing DoD development cycle. We've been running test and evaluation for our distributed simulation partners ever since. Some portions of the NDAs dropped off just a few weeks ago. And yeah, from what I can say, they are beasts.


You actually have GK110 GPUs at work? OH HELL. Yes, I am jealous.

#11 fxrsniper

    Member

  • 234 posts
  • Location: East Coast

Posted 04 December 2012 - 01:21 PM

Sir Roland MXIII, on 03 December 2012 - 06:39 PM, said:

Hi-frikkin-LARIOUS. My all-in-one A10 5800K Trinity APU is above the RECOMMENDED specs. I just about fell out of my chair laughing. It's difficult for people to hit recommended specs on CryE3? HOW? Seriously, HOW? I mean, my bloody cheap-as-hell APU / MB and two 4GB sticks of G.Skill Ripjaws X Series DDR3-1866 were only a little above 300 USD.

Which makes hitting recommended specs on CryE3 absolutely, mind-blowingly, INSANELY affordable. Brace myself? HA! They wish. This is serious budget computing <points at computer>, and this is serious budget computing beating rec' specs <points at OP>. LMFAO.

As for your question, if I am under top specs do I plan to upgrade: yes, I do. According to the rumor mill, the next gen of APUs will be out at some point next year, either as "Trinity 2.0" Richland in H1 or Kaveri in H2. Either by then, or at that time, I will upgrade not only my APU but also purchase a separate GPU. Hopefully something in the 7800 or 7900 series will be able to Crossfire with that APU, but I'm not expecting such. We'll see.

Of course, one would assume that PGI will allow us to use SLI / Crossfire by then, but... I'm increasingly uncertain whether they could find their keisters with two hands and a MAP. It'd be nice, but atm I find myself expecting an announcement in January telling the Clans they need to push back their invasion a couple of months so PGI has time to dig themselves out of the hole they're putting themselves in.

AMD is not cutting it like they used to. I switched five months ago after using AMD for 18 years. The Piledriver series might change that some, but the APUs are OK for their price, not high end by any means.

#12 Sir Roland MXIII

    Member

  • The Spear
  • 1,152 posts
  • Location: Idaho

Posted 05 December 2012 - 10:28 AM

fxrsniper, on 04 December 2012 - 01:21 PM, said:

AMD is not cutting it like they used to. I switched five months ago after using AMD for 18 years. The Piledriver series might change that some, but the APUs are OK for their price, not high end by any means.


Of course they aren't high end - THAT is why I laughed so hard at the specs!

Oh, and Trinity APUs use Piledriver cores. And while I have yet to bench my 5800K Trinity APU, I plan to do that once FedEx and UPS come by today and my SpinQ VT heatsink is installed.

EDIT: I should also mention that the Vishera CPU line is on Piledriver cores as well.

Edited by Sir Roland MXIII, 05 December 2012 - 10:31 AM.


#13 fxrsniper

    Member

  • 234 posts
  • Location: East Coast

Posted 06 December 2012 - 11:07 AM

Sir Roland MXIII, on 05 December 2012 - 10:28 AM, said:


Of course they aren't high end - THAT is why I laughed so hard at the specs!

Oh, and Trinity APUs use Piledriver cores. And while I have yet to bench my 5800K Trinity APU, I plan to do that once FedEx and UPS come by today and my SpinQ VT heatsink is installed.

EDIT: I should also mention that the Vishera CPU line is on Piledriver cores as well.

No, I agree. I hope they get it all worked out. I was with them for 18 years, with all my builds and for my business.

#14 silentD11

    Member

  • Legendary Founder
  • 816 posts
  • Location: Washington, DC

Posted 06 December 2012 - 11:54 AM

It's worth keeping in mind that "minimum" specs in PC games have often meant "will run at low resolution with everything on low, and by run we mean a single-digit slideshow, but it will run". Recommended has often meant "will get reasonable frame rates at reasonable settings".

Minimum has very rarely meant "will run well" and recommended has very rarely meant "can play at high details with good frame rates".

CryEngine games have always been offenders in this area.

#15 Sassafras

    Rookie

  • Overlord
  • 7 posts

Posted 06 December 2012 - 12:19 PM

Sir Roland MXIII, on 03 December 2012 - 06:39 PM, said:

Hi-frikkin-LARIOUS. My all-in-one A10 5800K Trinity APU is above the RECOMMENDED specs.



How do you figure that? The A10-5800K has a Radeon HD 7660D, while the minimum GPU spec is a Radeon HD 5770, which is slightly faster than the HD 7750. The 7660 and the 7750 both run at the same 800 MHz frequency, but the 7660 has only 75% of the shaders (384 vs 512), and 50% of the ROPs (8 vs 16) of the 7750, so you'd be looking at somewhere around 2/3 of the speed of the minimum 5770 and nowhere near the same class as the recommended 5870.
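That shader/ROP comparison can be sketched as a crude back-of-the-envelope calculation. To be clear about assumptions: the "shaders × clock" model below is my own illustration (shader counts across different GPU architectures aren't strictly comparable), and the spec numbers are approximate published figures, not anything from the thread.

```python
# Crude relative-throughput model: peak shader rate = shader count x core clock (MHz).
# Illustration only; it ignores ROPs, bandwidth, drivers, and per-shader efficiency.

def peak_rate(shaders, clock_mhz):
    """Peak shader throughput in arbitrary units."""
    return shaders * clock_mhz

hd7660d = peak_rate(384, 800)  # A10-5800K's integrated GPU
hd7750 = peak_rate(512, 800)   # entry-level discrete card
hd5770 = peak_rate(800, 850)   # the stated minimum-spec card

# At the same 800 MHz clock, the ratio is just the 384/512 shader count ratio.
print(f"7660D vs 7750: {hd7660d / hd7750:.0%}")
print(f"7660D vs 5770: {hd7660d / hd5770:.0%}")
```

By this crude model the integrated part lands at 75% of a 7750 and under half of a 5770 — somewhat below the two-thirds figure argued above, since the model ignores per-shader efficiency differences between architectures.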

#16 Sir Roland MXIII

    Member

  • The Spear
  • 1,152 posts
  • Location: Idaho

Posted 07 December 2012 - 03:00 PM

Sassafras, on 06 December 2012 - 12:19 PM, said:


How do you figure that? The A10-5800K has a Radeon HD 7660D, while the minimum GPU spec is a Radeon HD 5770, which is slightly faster than the HD 7750. The 7660 and the 7750 both run at the same 800 MHz frequency, but the 7660 has only 75% of the shaders (384 vs 512), and 50% of the ROPs (8 vs 16) of the 7750, so you'd be looking at somewhere around 2/3 of the speed of the minimum 5770 and nowhere near the same class as the recommended 5870.


Well, all I'm going to say is, your GPU mathematics are... off. Two-thirds the performance of a 5770 on my APU isn't even in the ballpark's parking lot.

#17 Diemos

    Member

  • 22 posts

Posted 07 December 2012 - 03:19 PM

Sir Roland MXIII, on 03 December 2012 - 06:39 PM, said:

Hi-frikkin-LARIOUS. My all-in-one A10 5800K Trinity APU is above the RECOMMENDED specs. I just about fell out of my chair laughing. It's difficult for people to hit recommended specs on CryE3? HOW? Seriously, HOW? I mean, my bloody cheap-as-hell APU / MB and two 4GB sticks of G.Skill Ripjaws X Series DDR3-1866 were only a little above 300 USD.

Which makes hitting recommended specs on CryE3 absolutely, mind-blowingly, INSANELY affordable. Brace myself? HA! They wish. This is serious budget computing <points at computer>, and this is serious budget computing beating rec' specs <points at OP>. LMFAO.

As for your question, if I am under top specs do I plan to upgrade: yes, I do. According to the rumor mill, the next gen of APUs will be out at some point next year, either as "Trinity 2.0" Richland in H1 or Kaveri in H2. Either by then, or at that time, I will upgrade not only my APU but also purchase a separate GPU. Hopefully something in the 7800 or 7900 series will be able to Crossfire with that APU, but I'm not expecting such. We'll see.

Of course, one would assume that PGI will allow us to use SLI / Crossfire by then, but... I'm increasingly uncertain whether they could find their keisters with two hands and a MAP. It'd be nice, but atm I find myself expecting an announcement in January telling the Clans they need to push back their invasion a couple of months so PGI has time to dig themselves out of the hole they're putting themselves in.



HA! I just built an A10-based system and was playing this on Low settings with the onboard GPU. I installed my 7770 GHz Ed. a couple of days ago and can now play everything on High or Ultra, yet I can't seem to eliminate the fricken mouse lag in Skyrim. ****.

#18 Kurayami

    Member

  • Stone Cold
  • 916 posts
  • Location: Sochi

Posted 07 December 2012 - 03:33 PM

Something tells me this is BS, especially the "DX11 support".

#19 Sir Roland MXIII

    Member

  • The Spear
  • 1,152 posts
  • Location: Idaho

Posted 07 December 2012 - 04:31 PM

Diemos, on 07 December 2012 - 03:19 PM, said:



HA! I just built an A10-based system and was playing this on Low settings with the onboard GPU. I installed my 7770 GHz Ed. a couple of days ago and can now play everything on High or Ultra


Good to hear. What resolution are you playing at?

#20 Bad Karma 308

    Member

  • Legendary Founder
  • 411 posts

Posted 07 December 2012 - 11:10 PM

A direct comparison of on-chip/integrated graphics to a discrete card isn't really easy to do. For one, most discrete cards tend to use GDDR5, while on-board graphics have to use a portion of the DDR2/3 system RAM, which is far slower with less bandwidth, and which also denies the CPU the use of that RAM.
Integrated parts are also far more limited, throttled by power and heat constraints.
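The bandwidth gap is easy to put numbers on with a peak-bandwidth calculation (effective transfer rate × bus width). The specific clocks and bus widths below are typical 2012-era figures I'm assuming for illustration, not specs of any particular card:

```python
# Peak memory bandwidth in GB/s = transfers/s x bus width in bytes.
# Numbers are illustrative assumptions, not any specific card's spec sheet.

def peak_bandwidth_gbs(mtransfers_per_s, bus_width_bits):
    """Peak bandwidth in GB/s from effective transfer rate (MT/s) and bus width."""
    return mtransfers_per_s * 1e6 * (bus_width_bits / 8) / 1e9

system_ddr3 = peak_bandwidth_gbs(1866, 128)     # dual-channel DDR3-1866, shared with the CPU
discrete_gddr5 = peak_bandwidth_gbs(4800, 256)  # 1.2 GHz (4.8 GT/s) GDDR5 on a 256-bit bus

print(f"DDR3 (shared with CPU): {system_ddr3:.1f} GB/s")   # ~29.9 GB/s
print(f"GDDR5 (dedicated):      {discrete_gddr5:.1f} GB/s") # ~153.6 GB/s
```

Roughly a 5x gap even before the APU shares that DDR3 pool with the CPU, which is the point about integrated graphics being bandwidth-starved.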

If you want a decent hierarchy chart of where the regular cards stack up by performance: http://www.tomshardw...iew,3107-7.html Whole article: http://www.tomshardw...clock,3106.html

If you want to see the performance charts on your on-die APUs then: http://www.tomshardw...5400k,3224.html

And I hate to say it, but when you're talking about comparing on-die against discrete cards and then adding in DX11, it is like picking your favorite chihuahua to compete against a rabid pit-bull.

Edited by Bad Karma 308, 07 December 2012 - 11:12 PM.





