
Wired Asked For It, So Here It Is: Ryzen 1700 FPS in MWO


105 replies to this topic

#81 NocturnalBeast

    Member

  • PipPipPipPipPipPipPipPipPip
  • Shredder
  • 3,685 posts
  • Location: Dusting off my Mechs.

Posted 08 October 2017 - 09:28 PM

View PostNARC BAIT, on 08 October 2017 - 06:09 PM, said:

yeah ... but ... almost all new Intel CPUs from the last few years have required a new motherboard to go ... Coffee Lake is barely a tick past Kaby ... other than bonus heat, I'd expect them to have more trouble with the platform for Ice Lake ... again, another stupid name ... at least you're not likely to drown in a coffee lake, it kills you in other ways ... like not sleeping ...


According to Intel, the 8th-gen CPUs can perform 30% more IPC than the 7th gen, but I have read that people who have gotten to test the sample CPUs are reporting that it is actually about 15% more, not 30%. That is still a decent increase, but not enough for me to have waited. Anyway, my CPU/GPU perform significantly better than my old setup.

#82 Bill Lumbar

    Member

  • PipPipPipPipPipPipPipPipPip
  • Death Star
  • 2,073 posts

Posted 09 October 2017 - 01:41 AM

View PostEd Steele, on 08 October 2017 - 09:28 PM, said:


According to Intel, the 8th-gen CPUs can perform 30% more IPC than the 7th gen, but I have read that people who have gotten to test the sample CPUs are reporting that it is actually about 15% more, not 30%. That is still a decent increase, but not enough for me to have waited. Anyway, my CPU/GPU perform significantly better than my old setup.

So the expected or claimed gains from Intel this time around are similar to AMD's FX lineup from the past? Say it ain't so......

#83 ninjitsu

    Member

  • PipPipPipPipPipPip
  • FP Veteran - Beta 2
  • 402 posts

Posted 09 October 2017 - 08:20 AM

View PostEd Steele, on 08 October 2017 - 09:28 PM, said:


According to Intel, the 8th-gen CPUs can perform 30% more IPC than the 7th gen, but I have read that people who have gotten to test the sample CPUs are reporting that it is actually about 15% more, not 30%. That is still a decent increase, but not enough for me to have waited. Anyway, my CPU/GPU perform significantly better than my old setup.

My understanding is that it's a ~5% IPC improvement. The value is in the core count, not IPC.

#84 NocturnalBeast

    Member

  • PipPipPipPipPipPipPipPipPip
  • Shredder
  • 3,685 posts
  • Location: Dusting off my Mechs.

Posted 09 October 2017 - 10:00 AM

View Postninjitsu, on 09 October 2017 - 08:20 AM, said:

My understanding is that it's a ~5% IPC improvement. The value is in the core count, not IPC.


What makes the Intel chips better for most modern games is the power of each individual core (IPC), not the number of cores. For things like video/audio encoding or heavy multitasking, the extra cores make a huge difference, and that is where the Ryzen chips catch up to or even surpass the current Intel chips. I am pleased with my new i7; that, plus the fast memory and SSD, has my computer booting faster than my cellphone.
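The cores-versus-IPC trade-off described above can be sketched with Amdahl's law. Both "chips" and all numbers below are hypothetical placeholders chosen only to illustrate why extra cores help encoding far more than most games — they are not benchmarks of any real CPU.

```python
# Amdahl's law: a workload that is only partly parallel stops scaling
# with core count once its serial portion dominates.

def effective_speed(per_core: float, cores: int, parallel_fraction: float) -> float:
    """Overall throughput relative to one reference core."""
    serial = 1.0 - parallel_fraction
    return per_core / (serial + parallel_fraction / cores)

quad_fast = dict(per_core=1.2, cores=4)   # fewer, faster cores
octo_wide = dict(per_core=1.0, cores=8)   # more, slower cores

for name, p in [("game-like (50% parallel)", 0.50),
                ("encoder-like (95% parallel)", 0.95)]:
    a = effective_speed(parallel_fraction=p, **quad_fast)
    b = effective_speed(parallel_fraction=p, **octo_wide)
    print(f"{name}: quad_fast={a:.2f}, octo_wide={b:.2f}")
```

With these made-up numbers the fast quad wins the game-like case while the wider eight-core wins the encoder-like case — the same pattern the posts above describe.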

#85 gaIaxor

    Member

  • PipPip
  • 24 posts

Posted 09 October 2017 - 02:34 PM

Don't be fooled by stealth-OC reviews and Intel PR. IPC (instructions per cycle) is the same as Skylake/Kaby Lake. Only the frequencies have been (covertly) increased. Intel keeps the turbo frequencies a secret, and the motherboard manufacturers try to fool reviewers and customers with default overclocking (~10% more performance from auto max turbo on all cores and +1 GHz L3 cache overclocking).
http://translate.goo...mainboard-test/

Edited by gaIaxor, 09 October 2017 - 02:35 PM.


#86 NocturnalBeast

    Member

  • PipPipPipPipPipPipPipPipPip
  • Shredder
  • 3,685 posts
  • Location: Dusting off my Mechs.

Posted 09 October 2017 - 04:20 PM

View PostgaIaxor, on 09 October 2017 - 02:34 PM, said:

Don't be fooled by stealth-OC reviews and Intel PR. IPC (instructions per cycle) is the same as Skylake/Kaby Lake. Only the frequencies have been (covertly) increased. Intel keeps the turbo frequencies a secret, and the motherboard manufacturers try to fool reviewers and customers with default overclocking (~10% more performance from auto max turbo on all cores and +1 GHz L3 cache overclocking).
http://translate.goo...mainboard-test/


You do realize that the clock frequency determines how many cycles there are per second, and IPC is the number of instructions the processor can execute per cycle? Therefore, high clock speed + high IPC is great.
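That relationship can be written as a one-line model: single-thread performance ≈ IPC × clock. The IPC and clock figures below are made-up placeholders, not measurements of any real chip — the point is only that a 15% IPC uplift and a ~10% clock bump move the same number.

```python
# Toy single-thread performance model: throughput ~ IPC * clock.

def st_perf(ipc: float, clock_ghz: float) -> float:
    """Approximate single-thread throughput (billions of instructions/sec)."""
    return ipc * clock_ghz

base = st_perf(ipc=1.00, clock_ghz=4.2)        # hypothetical baseline chip
clock_bump = st_perf(ipc=1.00, clock_ghz=4.6)  # same IPC, higher clock
ipc_bump = st_perf(ipc=1.15, clock_ghz=4.2)    # higher IPC, same clock

print(f"clock bump: {clock_bump / base:.3f}x")  # 1.095x
print(f"IPC bump:   {ipc_bump / base:.3f}x")    # 1.150x
```

Either lever raises performance; the disagreement in the thread is only about which lever Coffee Lake actually pulled.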

#87 gaIaxor

    Member

  • PipPip
  • 24 posts

Posted 10 October 2017 - 02:56 AM

You wrote:

View PostEd Steele, on 08 October 2017 - 09:28 PM, said:

According to Intel, the 8th-gen CPUs can perform 30% more IPC than the 7th gen, but I have read that people who have gotten to test the sample CPUs are reporting that it is actually about 15% more, not 30%.


That is wrong. Please check the meaning of "IPC".

Until Skylake, Intel was able to increase IPC between generations (of the Core architecture). But now it seems the old Core architecture is maxed out and IPC cannot be increased further, or doing so is too expensive. (Even with Skylake it was difficult for Intel to increase IPC; they had to keep the same base clock and lower the turbo clock compared to Haswell.) So now Intel relies only on higher clock speeds and a better process to increase single-threaded performance. Reviewers need to make that transparent and not be fooled by stealth OC / "reviewer BIOS".

I doubt that customers will get the same BIOS with default overclocking, because of higher temperatures, instability, lower lifespan and higher RMA costs. Don't be surprised when the 8700K is slower than expected and does not reach, for example, >1500 in Cinebench or the same fps as in the Coffee Lake reviews.

Edited by gaIaxor, 10 October 2017 - 03:49 AM.


#88 xWiredx

    Member

  • PipPipPipPipPipPipPipPip
  • Elite Founder
  • 1,805 posts

Posted 10 October 2017 - 05:02 AM

View PostgaIaxor, on 10 October 2017 - 02:56 AM, said:

You wrote:



That is wrong. Please check the meaning of "IPC".

Until Skylake, Intel was able to increase IPC between generations (of the Core architecture). But now it seems the old Core architecture is maxed out and IPC cannot be increased further, or doing so is too expensive. (Even with Skylake it was difficult for Intel to increase IPC; they had to keep the same base clock and lower the turbo clock compared to Haswell.) So now Intel relies only on higher clock speeds and a better process to increase single-threaded performance. Reviewers need to make that transparent and not be fooled by stealth OC / "reviewer BIOS".

I doubt that customers will get the same BIOS with default overclocking, because of higher temperatures, instability, lower lifespan and higher RMA costs. Don't be surprised when the 8700K is slower than expected and does not reach, for example, >1500 in Cinebench or the same fps as in the Coffee Lake reviews.

After spending hours comparing reviews, methodologies, numbers, etc., I came up with a 1% increase in single-threaded performance between the 7700K and 8700K, which can almost certainly be attributed to the cache. It also disappears the higher you go with clock speed, because I'm relatively certain the thermals are not very good with this chip (multiple reviews have stated as much, so it's pretty likely).

The key with Coffee Lake will be to keep it cool. I will probably use a 360mm radiator if I build one (that's a big 'if' since Ice Lake rumors for 2H2018 have already started, though I don't believe them at this stage) and I'll probably use Thermal Grizzly paste. Probably looking at a new, more open case so I can keep power delivery cooler, too, as a precaution.

It's nice to see that what Ryzen got right with its architecture is now troubling Intel enough to get them to be competitive. :)

#89 EvangelX

    Member

  • PipPip
  • FP Veteran - Beta 1
  • 38 posts
  • Location: AU

Posted 10 October 2017 - 05:07 AM

Just did an upgrade to a Ryzen 5 1600, B350 Tomahawk, and 16 GB of 2400 MHz RAM (with an R9 380 4 GB), and it kicks arse! I am really impressed with the performance; even BF1 plays smooth @ 1080.

#90 NocturnalBeast

    Member

  • PipPipPipPipPipPipPipPipPip
  • Shredder
  • 3,685 posts
  • Location: Dusting off my Mechs.

Posted 10 October 2017 - 07:36 AM

Well, I guess arguing about 8th-gen Intel CPUs is kind of pointless, since I already bought my 7700K and will not be in a position to upgrade again for quite a while. My 7700K is working fine, although my case does feel like a space heater after an hour or so of gaming. I really need to hurry up and get the rest of my case fans in there (I can fit six 120 mm fans in the case).

#91 NARC BAIT

    Member

  • PipPipPipPipPipPipPip
  • Ace Of Spades
  • 518 posts
  • Twitch: Link
  • Location: Australia

Posted 10 October 2017 - 10:32 AM

watched a Linus vid earlier in the day where he was showing off a delidding tool from der8auer ... looked much better than most of the hacked vids I've seen for it, only cringing when Linus drops the tool ... what else would you expect though ... I think one of the things I really love about my Ryzen build is the lower power consumption compared to my last FX, and it's clearly not dissipating as much heat ...

built a budget Ryzen 3 system today, for someone not interested in games ... and wow, that el cheapo memory really makes a throughput difference ... still though, it's quick enough for the user's needs, and god damn was it a cheap build ... I've got it for another day or so, but the cheapest available GPU doesn't cut the mustard for MWO (or anything besides web browsing / video) ... a GT 710, about as powerful as a GTX 460 / 560M ...

#92 NocturnalBeast

    Member

  • PipPipPipPipPipPipPipPipPip
  • Shredder
  • 3,685 posts
  • Location: Dusting off my Mechs.

Posted 10 October 2017 - 05:47 PM

View PostNARC BAIT, on 10 October 2017 - 10:32 AM, said:

watched a Linus vid earlier in the day where he was showing off a delidding tool from der8auer ... looked much better than most of the hacked vids I've seen for it, only cringing when Linus drops the tool ... what else would you expect though ... I think one of the things I really love about my Ryzen build is the lower power consumption compared to my last FX, and it's clearly not dissipating as much heat ...

built a budget Ryzen 3 system today, for someone not interested in games ... and wow, that el cheapo memory really makes a throughput difference ... still though, it's quick enough for the user's needs, and god damn was it a cheap build ... I've got it for another day or so, but the cheapest available GPU doesn't cut the mustard for MWO (or anything besides web browsing / video) ... a GT 710, about as powerful as a GTX 460 / 560M ...


The Ryzen 3 is probably good on the low end, although I am not sure how it compares to a 7th-gen i3.


#93 NocturnalBeast

    Member

  • PipPipPipPipPipPipPipPipPip
  • Shredder
  • 3,685 posts
  • Location: Dusting off my Mechs.

Posted 10 October 2017 - 11:41 PM

G-SYNC was a new technology to me, but I have learned quite a bit about it over these couple of days. I now believe that, after a few adjustments, I am getting the best performance I can (without overclocking) out of my hardware. I assume that FreeSync works in a similar manner, if you have an AMD GPU and a FreeSync monitor. I also concede that FreeSync monitors are available for roughly half the price of a G-SYNC monitor (although Nvidia cards are powerhouses).

Edited by Ed Steele, 11 October 2017 - 11:13 AM.


#94 NARC BAIT

    Member

  • PipPipPipPipPipPipPip
  • Ace Of Spades
  • 518 posts
  • Twitch: Link
  • Location: Australia

Posted 11 October 2017 - 02:53 PM

both technologies have adaptive sync at their base, and I'd expect that you'd be able to leverage that aspect, despite both companies having software locks to try to ensure that you go with their tech ... is G-SYNC/FreeSync better than plain adaptive sync? slightly ... varying depending on what model you're looking at ... FreeSync is open, and more monitors support it, because it didn't come with a blowout in costs ... well ... most end users won't notice the differences either way ...

I wouldn't buy a G-SYNC monitor that isn't at least 120 Hz ... I had a 120 Hz monitor when the 3D stuff was trying to build momentum ... it ended up just being a high-refresh monitor, because 3D was pretty much junk ... and then it didn't last much past the warranty either ... I don't mean that as a reflection of current options on the market ... just another example of the first wave of people paying through the nose for another half-baked tech ... but hey ... we paved the way for VR, and I'm in no hurry to sign up to fund R'n'D again ...

#95 Bill Lumbar

    Member

  • PipPipPipPipPipPipPipPipPip
  • Death Star
  • 2,073 posts

Posted 11 October 2017 - 04:08 PM

Totally cool with you guys sharing your hardware experiences here ... blow it up. I don't have a lot of time right now; I'm rebuilding my 2005 Altima SE-R engine ... first time I have ever taken on something this big. I will post stuff as I have time.

#96 NocturnalBeast

    Member

  • PipPipPipPipPipPipPipPipPip
  • Shredder
  • 3,685 posts
  • Location: Dusting off my Mechs.

Posted 11 October 2017 - 04:34 PM

View PostBill Lumbar, on 11 October 2017 - 04:08 PM, said:

Totally cool with you guys sharing your hardware experiences here ... blow it up. I don't have a lot of time right now; I'm rebuilding my 2005 Altima SE-R engine ... first time I have ever taken on something this big. I will post stuff as I have time.


Working on cars is for people with lives ;-)

#97 NocturnalBeast

    Member

  • PipPipPipPipPipPipPipPipPip
  • Shredder
  • 3,685 posts
  • Location: Dusting off my Mechs.

Posted 11 October 2017 - 04:37 PM

View PostNARC BAIT, on 11 October 2017 - 02:53 PM, said:

both technologies have adaptive sync at their base, and I'd expect that you'd be able to leverage that aspect, despite both companies having software locks to try to ensure that you go with their tech ... is G-SYNC/FreeSync better than plain adaptive sync? slightly ... varying depending on what model you're looking at ... FreeSync is open, and more monitors support it, because it didn't come with a blowout in costs ... well ... most end users won't notice the differences either way ...

I wouldn't buy a G-SYNC monitor that isn't at least 120 Hz ... I had a 120 Hz monitor when the 3D stuff was trying to build momentum ... it ended up just being a high-refresh monitor, because 3D was pretty much junk ... and then it didn't last much past the warranty either ... I don't mean that as a reflection of current options on the market ... just another example of the first wave of people paying through the nose for another half-baked tech ... but hey ... we paved the way for VR, and I'm in no hurry to sign up to fund R'n'D again ...


My monitor is 144 Hz, but what I found is that when using G-SYNC you need to use in-game settings to limit your fps to 3 fps below your monitor's max refresh rate; otherwise the fps can exceed the max refresh rate and cause tearing and fps drops.

Edited by Ed Steele, 11 October 2017 - 04:37 PM.


#98 NARC BAIT

    Member

  • PipPipPipPipPipPipPip
  • Ace Of Spades
  • 518 posts
  • Twitch: Link
  • Location: Australia

Posted 12 October 2017 - 06:19 AM

I can sit right on my 60 Hz limit, but 60.1 goes badly ... in my experience, the in-game limiter isn't very reliable, and it actually gives me a worse experience than a tuned vsync configuration ... if you're going to try that, make sure you set the vsync mode to 'Fast' in the NVIDIA control panel, or 'Adaptive' might work better depending on the actual panel ... if you set 'pre-rendered frames' to 1, you shouldn't 'feel' any input lag at 140 fps ... my screen actually pauses on frames ... but only in MWO ... I can't replicate the behaviour in anything else ... every other program happily goes over 60 fps ...

#99 NocturnalBeast

    Member

  • PipPipPipPipPipPipPipPipPip
  • Shredder
  • 3,685 posts
  • Location: Dusting off my Mechs.

Posted 13 October 2017 - 09:29 PM

View PostNARC BAIT, on 12 October 2017 - 06:19 AM, said:

I can sit right on my 60 Hz limit, but 60.1 goes badly ... in my experience, the in-game limiter isn't very reliable, and it actually gives me a worse experience than a tuned vsync configuration ... if you're going to try that, make sure you set the vsync mode to 'Fast' in the NVIDIA control panel, or 'Adaptive' might work better depending on the actual panel ... if you set 'pre-rendered frames' to 1, you shouldn't 'feel' any input lag at 140 fps ... my screen actually pauses on frames ... but only in MWO ... I can't replicate the behaviour in anything else ... every other program happily goes over 60 fps ...


MWO doesn't have an fps limit in the controls; you can only set the refresh rate of the monitor. I believe you can limit the fps through the user.cfg file, so I will have to try that.
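For what it's worth, MWO is CryEngine-based, and CryEngine builds generally expose a frame-rate cap cvar that can be placed in user.cfg. A minimal sketch, assuming MWO's build honors the stock `sys_MaxFPS` cvar (unverified for this game), with 141 chosen as 3 fps below a 144 Hz panel per the G-SYNC advice above:

```
sys_MaxFPS = 141
```

If the engine ignores the cvar, an external limiter such as RivaTuner Statistics Server is the usual fallback.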

#100 NARC BAIT

    Member

  • PipPipPipPipPipPipPip
  • Ace Of Spades
  • 518 posts
  • Twitch: Link
  • Location: Australia

Posted 15 October 2017 - 04:04 AM

View PostEd Steele, on 13 October 2017 - 09:29 PM, said:

I believe that you can limit the fps through the user.cfg file, so I will have to try that.
yeah, you could try that, but it's unreliable as hell ... you'll set one value ... and sometimes it will hold it ... other times it will be +/- 10 frames in that second ... which is pretty nasty when you're only locking in 60 fps ...




