
AMD To Intel Expectations


94 replies to this topic

#81 Flapdrol

    Member

  • PipPipPipPipPipPipPipPip
  • The 1 Percent
  • 1,986 posts

Posted 29 December 2014 - 03:53 AM

Kuritaclan, on 29 December 2014 - 03:30 AM, said:

@graphics cards: Maxwell is a better Kepler. The power saving on the new architecture is little to nothing when the card is heavily stressed. Only when the card doesn't have much to do is Maxwell superior to Kepler in saving energy.

Power efficiency depends very much on the values the graphics card manufacturers set in the GPU BIOS. The card will boost up until the set TDP is reached; higher boost clocks mean higher voltages, and efficiency goes down as voltage rises.
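
To put rough numbers on that: dynamic power scales roughly with frequency times voltage squared, so the last slice of boost clock costs disproportionate power. A minimal sketch of the ratio math in Python, with made-up clocks and voltages for illustration only (not measured from any real card):

# Rough dynamic-power scaling: P ~ f * V^2 (the capacitance term cancels in the ratio).
# The base/OC clocks and voltages below are hypothetical illustration values.
base_clock, base_volt = 1126, 1.06   # MHz, V
oc_clock, oc_volt = 1400, 1.21       # MHz, V

power_ratio = (oc_clock / base_clock) * (oc_volt / base_volt) ** 2
perf_ratio = oc_clock / base_clock   # assumes performance scales with clock

print(f"~{perf_ratio:.0%} clock for ~{power_ratio:.0%} power")
# -> ~124% clock for ~162% power: perf/watt falls as the voltage climbs.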

The 980 is a pretty impressive piece of tech: the chip size is between the 280X and the 290X, it has only a 256-bit memory interface, and it has a lower power requirement. I guess Nvidia makes a lot of money on them, as they should be significantly cheaper to make while selling for more than the competition.

Edited by Flapdrol, 29 December 2014 - 03:55 AM.


#82 Kuritaclan

    Member

  • PipPipPipPipPipPipPipPip
  • Ace Of Spades
  • 1,838 posts
  • LocationGermany

Posted 29 December 2014 - 04:06 AM

@Flapdrol

Yes, the Maxwell cards (970 and 980) are impressive, as is their OC potential. But the advertised power saving isn't that high under really stressful workloads; there, Maxwell behaves like Kepler (Maxwell is an advanced Kepler, or Kepler 2.0). If you have a card that can be tweaked to over 1.5 GHz, you would do it, since there is a performance gain of up to 20% (depending on the OEM settings). A 970 that, overclocked, performs like a graphics card costing nearly $150 or $200 more should be tweaked. Besides, when else will you use that headroom in 5 years? By then there will be newer generations of graphics cards, and I'd guess they'll perform twice as well with even more power saving.

#83 MechWarrior4172571

    Member

  • PipPipPipPipPipPip
  • Bridesmaid
  • 251 posts

Posted 29 December 2014 - 07:44 AM

Kuritaclan, on 29 December 2014 - 04:06 AM, said:

@Flapdrol

Besides, when else will you use that headroom in 5 years?

A 5-year cycle for a video card is way, WAY too long. 4K monitors are knocking on the door, and the GTX 980 barely manages a single 4K monitor at 60+ fps; you need to SLI it to run a 2- or 3-monitor 4K setup. Obviously it's already too little, too late. Most people should plan on 2 to 3 years of use from a new video card (provided it's not a bargain outgoing model). Personally, I barely survived with a GTX 465 until I was recently able to upgrade to a GTX 970, and that 465 struggled at the end with lackluster performance; I had to keep it overclocked to the max just to keep up.

#84 Kuritaclan

    Member

  • PipPipPipPipPipPipPipPip
  • Ace Of Spades
  • 1,838 posts
  • LocationGermany

Posted 29 December 2014 - 09:09 AM

I don't think the GTX 970/980 is capable of 4K in modern games. Yes, they can produce good FPS in older games, and near 30 FPS in modern ones. The Maxwell cards are good "2K" (2560x1440) cards.

I upgraded my 260 to a 770. Now I have a "good" card for Full HD in all games, and I'll reduce graphics details as FPS drops below 30 in upcoming games. Somebody who gets a 970/980 now will maybe pair it with a 2560x1440 monitor and stay with that for 5 years as well. If you want more, then for sure you have to shorten the cycle, and you'll get a new card after 2 or 3 years.

Unless you accept microstutter with SLI/CF, you won't get >60 FPS at 4K in modern games even with the newest graphics cards.

#85 MechWarrior4172571

    Member

  • PipPipPipPipPipPip
  • Bridesmaid
  • 251 posts

Posted 29 December 2014 - 09:20 AM

Kuritaclan, on 29 December 2014 - 09:09 AM, said:

I don't think the GTX 970/980 is capable of 4K in modern games. Yes, they can produce good FPS in older games, and near 30 FPS in modern ones. The Maxwell cards are good "2K" (2560x1440) cards.

I upgraded my 260 to a 770. Now I have a "good" card for Full HD in all games, and I'll reduce graphics details as FPS drops below 30 in upcoming games. Somebody who gets a 970/980 now will maybe pair it with a 2560x1440 monitor and stay with that for 5 years as well. If you want more, then for sure you have to shorten the cycle, and you'll get a new card after 2 or 3 years.

Unless you accept microstutter with SLI/CF, you won't get >60 FPS at 4K in modern games even with the newest graphics cards.


You are correct. Modern hardware is behind modern games for ultra settings at the maximum 'home-available' monitor resolutions, practically every time. Here is an example where the GTX 980 (the latest and greatest, right?) only manages 42 fps at 4K: http://www.anandtech...x-980-review/13 Some titles are OK, but some are obviously already pwning the new hardware, like when Crysis came out; 30 fps here at ultra: http://www.anandtech...x-980-review/14

#86 Exarch Levin

    Member

  • PipPipPipPipPip
  • Moderate Giver
  • 118 posts

Posted 29 December 2014 - 08:11 PM

Quote

And as mentioned before, wait for Skylake with DDR4, coming in 2015, if you wanna build a new rig.

I can see that, as hopefully the new chips won't have the Intel heat gimp issue, but DDR4 is prohibitively expensive, and with DDR3 still being very pricey I'm not holding my breath waiting for prices to drop like a rock.

Quote

Modern hardware is behind modern games for ultra settings at the maximum 'home-available' monitor resolutions, practically every time.

I'd say it's more that modern gaming software is lagging behind gaming hardware yet again: the hardware is rarely utilized properly by contemporary software, so we have to throw excessive amounts of hardware power at a software problem to solve it.

BTW, I'm not sure what Anand was trying to get at by running the reference-cooler 290X (and telling us that "uber mode" on it equals aftermarket-cooler performance!), but it is still nice to see that the dated AMD tech was able to hold its own against Nvidia's newest, even though Anand ran a gimped version of said dated tech.

Anand says that, in their experience, Battlefield 4 multiplayer matches drop to about half the FPS observed in similar single-player scenes. Sounds a lot like MWO :) ;)

#87 -Pilgrim-

    Rookie

  • The Patron Saint
  • 7 posts

Posted 10 January 2017 - 05:37 PM

These posts I'm reading are old and honestly very conflicting based on my experience and the MWO gamers on Twitch. I have an FX-8350 at 4.4 GHz with a GTX 1070 8 GB, and for my son an i5-4440 with an RX 480 8 GB; both PCs have 12 GB of RAM. Anyway, Intel is far better, but it's not a consistent 60 fps, and I have not found a good reason to upgrade my FX-8350, as I find it hard to believe all these people posting that they get 60 fps ALL the time. I mean, I'm getting a solid 57-70 fps 95% of the time, but not ALL the time. I wish you could actually record a video of your game screen before making suggestions that end up losing me money. I don't mind spending on an i7 6600K if you can help me decide on it. Honestly, the help I got by email from tech support led to a bad decision: I could have moved from AMD to Intel instead of buying this $420 GTX 1070. Yes, it helped, but not to my satisfaction.

#88 xWiredx

    Member

  • PipPipPipPipPipPipPipPip
  • Elite Founder
  • 1,805 posts

Posted 10 January 2017 - 07:20 PM

Pilgrim42, on 10 January 2017 - 05:37 PM, said:

These posts I'm reading are old and honestly very conflicting based on my experience and the MWO gamers on Twitch. I have an FX-8350 at 4.4 GHz with a GTX 1070 8 GB, and for my son an i5-4440 with an RX 480 8 GB; both PCs have 12 GB of RAM. Anyway, Intel is far better, but it's not a consistent 60 fps, and I have not found a good reason to upgrade my FX-8350, as I find it hard to believe all these people posting that they get 60 fps ALL the time. I mean, I'm getting a solid 57-70 fps 95% of the time, but not ALL the time. I wish you could actually record a video of your game screen before making suggestions that end up losing me money. I don't mind spending on an i7 6600K if you can help me decide on it. Honestly, the help I got by email from tech support led to a bad decision: I could have moved from AMD to Intel instead of buying this $420 GTX 1070. Yes, it helped, but not to my satisfaction.


1) Why the hell did you bother necro-ing a post that is over 2 years old?

2) Intel's IPC advantage is pretty vast compared to currently available AMD chips (until Ryzen is commercially available)

3) We have already documented the performance difference that proves Intel chips are better for MWO - if you've bothered necro-ing a 2-year-old thread but can't be bothered to read any other threads regarding performance, that is your own fault

4) For your reference, a 4.5 GHz Haswell CPU is where you no longer run into large dips in framerate with the settings at Very High on a 1080p monitor, so lesser chips will definitely need the settings turned down

5) Extrapolating from the above, a 6600K is definitely about what you need if you want that kind of experience

#89 Peter2k

    Member

  • PipPipPipPipPipPipPipPipPip
  • Elite Founder
  • 2,032 posts
  • LocationGermany

Posted 11 January 2017 - 12:29 AM

Pilgrim42, on 10 January 2017 - 05:37 PM, said:

These posts I'm reading are old and honestly very conflicting based on my experience and the MWO gamers on Twitch. I have an FX-8350 at 4.4 GHz with a GTX 1070 8 GB, and for my son an i5-4440 with an RX 480 8 GB; both PCs have 12 GB of RAM. Anyway, Intel is far better, but it's not a consistent 60 fps, and I have not found a good reason to upgrade my FX-8350, as I find it hard to believe all these people posting that they get 60 fps ALL the time. I mean, I'm getting a solid 57-70 fps 95% of the time, but not ALL the time. I wish you could actually record a video of your game screen before making suggestions that end up losing me money. I don't mind spending on an i7 6600K if you can help me decide on it. Honestly, the help I got by email from tech support led to a bad decision: I could have moved from AMD to Intel instead of buying this $420 GTX 1070. Yes, it helped, but not to my satisfaction.


The GTX 1070 will help you get more fps in every other game ever made.
Wish MWO had a benchmark, as it is the most CPU-limited game ;-)


Why not get yourself a 7600K then?

I've gotten mine stable at 5 GHz under air (actually also at 5.2, but reverted to 5 for now; an AiO is coming, installing on the weekend).

There are definitely no more bad dips.

edit:
Also, a 6600K is an i5; the 6700K is the i7.

i5: 7600K
i7: 7700K

Hope you know the difference.

Edited by Peter2k, 11 January 2017 - 12:35 AM.


#90 darqsyde

    Member

  • PipPipPipPipPipPip
  • The Blood Bound
  • 348 posts
  • Facebook: Link
  • Twitter: Link
  • Twitch: Link
  • LocationFar Beyond The Black Horizon

Posted 11 January 2017 - 06:24 PM

You might also want to wait until we get some idea of the performance of the i3-7350K. (It looks like it might be a decent price performer.)

Also...Necromancy is evil!

#91 McHoshi

    Member

  • PipPipPipPipPipPipPipPip
  • 1,163 posts
  • Facebook: Link
  • LocationGermany

Posted 11 January 2017 - 10:23 PM

I expect Ryzen to be a good price performer... so just wait and relax

#92 darqsyde

    Member

  • PipPipPipPipPipPip
  • The Blood Bound
  • 348 posts
  • Facebook: Link
  • Twitter: Link
  • Twitch: Link
  • LocationFar Beyond The Black Horizon

Posted 12 January 2017 - 09:00 PM

We are all waiting on the Ryzen reviews... please, AMD, don't mess this up.

#93 Smokeyjedi

    Member

  • PipPipPipPipPipPipPipPip
  • Liquid Metal
  • 1,040 posts
  • LocationCanada

Posted 16 January 2017 - 05:28 PM

The Potatoe Whisperer, on 22 November 2014 - 02:14 PM, said:

Hmm... I loved my 8320. There were just these little "things" that made me want to give Intel a shot. The biggest one was the hesitation in core activation: if a core maxed out, it would take a fraction of a second to switch an inactive core (or cores) on, and games would just lag while it was doing that. It drove me crazy. I have a feeling it was an indication of a failing motherboard, because it was getting progressively worse. I'm gonna make a backup rig with the FX chip and see how this Intel does.

A core-parking utility is your best friend on the FX platform; a literal godsend for that issue.
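
For anyone who would rather do it by hand than with a utility: Windows exposes the core-parking floor through powercfg. A minimal sketch, assuming a Windows 7/8-era power plan (CPMINCORES is the documented alias for the "processor performance core parking min cores" setting; run from an elevated prompt, and 100 means "keep 100% of cores unparked"):

:: Unhide the core-parking setting in the power options UI.
powercfg -attributes SUB_PROCESSOR CPMINCORES -ATTRIB_HIDE
:: Force all cores to stay unparked on AC power, then re-apply the scheme.
powercfg -setacvalueindex SCHEME_CURRENT SUB_PROCESSOR CPMINCORES 100
powercfg -setactive SCHEME_CURRENT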

#94 Smokeyjedi

    Member

  • PipPipPipPipPipPipPipPip
  • Liquid Metal
  • 1,040 posts
  • LocationCanada

Posted 16 January 2017 - 05:34 PM

MechWarrior4172571, on 02 December 2014 - 03:47 PM, said:

It would be nice if, in the future, you would post pics for such occasions showing not only the idle scan but also the maximums, maxed out by some heavy workload like a benchmark or a FurMark burn-in, so that would be reflected in the recorded "max" values... just a thought. It would make it a lot more interesting to look at.

Exactly. I have just discovered that MWO runs my 6-core Phenom II 1090T Thuban with one core maxed out, hogging that one core to death, and SOOOO it wants higher and higher frequency from that core in order to process gameplay faster. Doh. Sure, I realize Intel is fast at single-thread and beats AMD, and that Phenoms don't have hyperthreading, but... this is ridiculous. My GTX 970 runs MWO at 50% utilization, and only 1 CPU core (out of 6) is fully utilized (the others are at 10% utilization or less).

Wow, I guess I can post my FX config files; not sure they'd even work at this point in the game, but goddamn, they were helpful during those 3-4 weeks a few years back when I tinkered meticulously with thread assignments. I'll dig up a link.

#95 Smokeyjedi

    Member

  • PipPipPipPipPipPipPipPip
  • Liquid Metal
  • Liquid Metal
  • 1,040 posts
  • LocationCanada

Posted 16 January 2017 - 05:50 PM

https://mwomercs.com...ost__p__4042918

Yup, that used to uncork the **** outta MWO on the FX platform. I can see only 1 dip below 60 FPS on that graph. Hot damn, I was onto something. I don't play now due to the weight I have to carry and the skill level I must maintain after 5 years of between 10-40 drops a day. Goddamn, I was sick with MWO fever.

After seeing how the config files have changed slightly, here is a newer, revised offering.

-- Visual options
gp_option_ShowCockpitGlass = 0   -- 0 = off, 1 = on
r_DepthOfField = 0               -- 0 = off, 1 = on
r_HDRGrainAmount = 0.0           -- film grain amount
r_motionBlur = 0
r_MultiThreaded = 1
cl_fov = 70                      -- default is 75

sys_MaxFPS = 144
d3d9_TripleBuffering = 1
d3d10_TripleBuffering = 1
d3d11_TripleBuffering = 1

e_GsmCache = 1
r_FogShader = 0

r_silhouettePOM = 0
r_UsePOM = 0

-- Streaming budgets, sized for 2000 MHz 10-10-10 2T memory on Vishera
sys_budget_streamingthroughput = 21250048
sys_LocalMemoryGeometryStreamingSpeedLimit = 20752
sys_LocalMemoryTextureStreamingSpeedLimit = 20752
sys_streaming_max_bandwidth = 20752

r_WaterUpdateThread = 0

sys_streaming_CPU = 2
sys_budget_soundCPU = 0

sys_budget_videomem = 4196       -- set this to your card's VRAM in MB

-- Thread-to-core assignments for an 8-core FX
-- (originally posted as console dumps, "type: int current: N"; cleaned up to plain cvar syntax)
ca_thread0Affinity = 5
ca_thread1Affinity = 3
r_WaterUpdateThread = 5          -- overrides the 0 set above; the last value read wins
sys_main_CPU = 0
sys_physics_CPU = 1
sys_streaming_CPU = 1            -- likewise overrides the 2 set above
sys_TaskThread0_CPU = 3
sys_TaskThread1_CPU = 7
sys_TaskThread2_CPU = 4
sys_TaskThread3_CPU = 6
sys_TaskThread4_CPU = 2
sys_TaskThread5_CPU = 1
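
For anyone trying these: in CryEngine titles like MWO, cvar overrides of this sort normally go in a user.cfg in the game's install root. That's standard CryEngine behavior, though I haven't re-verified it against the current patch, so back up whatever you change first. The affinity values pin engine threads to specific cores on an 8-core FX; on a CPU with a different core count you'd have to redo the assignments.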

Edited by Smokeyjedi, 19 January 2017 - 08:23 AM.





