
Request for Benchmarks with Phenom II X6



#1 Narcissistic Martyr

    Member

  • Veteran Founder
  • 4,242 posts
  • Location: Louisville, KY

Posted 13 August 2013 - 08:13 AM

I want to test a theory of mine that MWO doesn't currently work well with FX CPUs because it can only effectively use one core of each module.

So, could someone log total CPU utilization, per-core CPU utilization, GPU utilization, and FPS on a system with a Phenom II X6 and a GPU that won't bottleneck the system? It'd be even better if you could run a number of tests at different clock rates too, but I'll take what I can get.
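To be concrete, here's roughly the kind of log I'm after - a minimal sketch in Python using the third-party psutil module (just one way to do it; Windows' built-in typeperf would work too, and GPU % would need a separate tool like GPU-Z since psutil doesn't report it):

import csv
import time

import psutil  # third-party: pip install psutil

ncores = psutil.cpu_count()
with open("mwo_cpu_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["time_s", "total_pct"] + ["core%d_pct" % i for i in range(ncores)])
    start = time.time()
    while True:  # stop with Ctrl+C when the match ends
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)  # blocks ~1 s per sample
        total = sum(per_core) / len(per_core)
        writer.writerow([round(time.time() - start, 1), round(total, 1)] + per_core)
        f.flush()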

Thanks in advance.

#2 Byzan

    Member

  • 111 posts

Posted 15 August 2013 - 11:18 AM

FX CPUs are just bad. But technically each module has two integer cores that share a single floating-point unit, which is why they are not truly 8-core CPUs. "4 module, 8 thread" sounds like a silly way to describe a CPU, but I guess that is how Intel HT CPUs are described.

I have a Phenom II 550 BE PC sitting around; that would be the best I could do.

IMO AMD really screwed up dropping the Phenom architecture for Bulldozer/Piledriver.

By now we could surely have had 6-, 7-, or 8-core, 22 nm or smaller, 4 GHz+ Phenom-based CPUs that would perform better and be more power efficient than the crap they have today, if they had just kept to the path they were on with the Phenom II.

The real worry is that even their latest CPU release is still on a 32 nm process; AMD seems to be stuck at 32 nm manufacturing, while Intel has been producing at 22 nm for a while, and their roadmap suggests they are going to go much smaller very quickly.

AMD seems to be struggling through it, and has to drop prices right down to make their CPUs attractive enough to sell.

#3 Odins Fist

    Member

  • Ace Of Spades
  • 3,111 posts
  • Location: The North

Posted 15 August 2013 - 01:30 PM

Byzan, on 15 August 2013 - 11:18 AM, said:

FX CPU's are just bad.


Depending on what you are doing this is a fairly accurate statement. The FX chips are not complete junk, though; they simply aren't worth the upgrade cost if you have a Phenom II Black Edition Thuban CPU, even though I have watched the prices drop on the FX series.

I'm running a Phenom II X6 1100T Thuban at 4.27 GHz on water. I "WAS" going to go FX after I bought one of the two 990FX Asus mobos I have, but the single-threaded performance of the FX-6300 is worse clock for clock than my old Phenom II 965 Deneb chip. In fact, to get about the same single-threaded performance as a Phenom II chip, an FX has to be clocked 450-600 MHz above it, depending on which CPUs you are comparing.

That alone, plus the FX-6300 being beaten by the Phenom II X4 965, the Phenom II 960T, and the 1100T Thuban in single-threaded apps, meant I was not going to upgrade. I have a friend with the FX-8350 who was at 4.8 GHz on water, but he had to bring it back down because he can't afford a much bigger power supply to power his cards and the OC. The FX series gets power hungry (thirsty) when overclocking, and compared to the OCs I've seen on an i7-3820 and an i7-4770 Haswell, I do not plan on going with AMD on my next build. I was an AMD fan for years, but they just tanked with the FX.. Don't even get me started on the FX-9590; it is probably the worst deal ever at $899.99 for a factory-overclocked chip, and trying to overclock the already-OC'd chip pushes it way past the 220 watts it already draws at stock... a $900.00 failure.
In terms of power usage and performance, the FX series loses to Intel. For the money, if you are building brand new, AMD is still decent on price versus performance, but if you are thinking of just upgrading your chip, don't bother if you still have a good Thuban chip..

One example for you: running an older game, Sins of a Solar Empire (original and Rebellion), I went head to head with my 1100T Thuban at 4.27 GHz against a socket 2011 i7-3820 quad core clocked lower.. The i7-3820 beat my Phenom II X6 by 20% on CPU usage (hundreds of strike craft and other units), with my Phenom II pegged at 100% usage (the game only uses one core) versus 80% for the i7-3820..

Yep, AMD won't be on my next build, whatever the cost; I was over the FX series a long time ago.
http://odinswolves.enjin.com/home

Edited by Odins Fist, 16 August 2013 - 09:05 AM.


#4 Byzan

    Member

  • 111 posts

Posted 15 August 2013 - 03:36 PM

Yeah, here in NZ the FX-9590 is $1500.

The most crazy BS waste of money I think I've ever seen, not to mention the fact that a 220 W CPU will register quite significantly on your power bill.

You can get an i7-4770K for $480.

Just a straight-up better CPU, and it's 84 W.

Or an i7-3930K for $830, and it will wipe the floor with any AMD CPU while using almost half the power.

However, if you look down the price brackets, it's true the FX series does have a role to play.

FX-6300 for $170 - that's down in the i3 price bracket, and you would expect it to perform better than an i3, maybe not per core but at least in threaded applications. It does still hog a lot of power for what it is compared to the i3, which is a significant invisible added cost: FX-6300 95 W vs i3 54 W.

It is an option for a new build, but I don't see any reason to upgrade if you already have a 3-6 core later-model Phenom II.

I feel like AMD killed the golden goose in getting rid of the Phenom II for this crap.

Even my 550 BE, even though it's only dual core, I can run at 4 GHz, so it still does the job gaming, even though I don't use it any more. I am curious to see how it goes in MWO though; I think I'll fire it up this weekend.

#5 Narcissistic Martyr

    Member

  • Veteran Founder
  • 4,242 posts
  • Location: Louisville, KY

Posted 15 August 2013 - 03:55 PM

Byzan, on 15 August 2013 - 03:36 PM, said:

It does still hog a lot of power for what it is compared to the i3, which is a significant invisible added cost: FX-6300 95 W vs i3 54 W.


In the US power averages $0.08/kWh. That's a $3.28 difference per 1000 hours of operation at full power, and 1000 hours is about 41 days at full power. So considering the FX-6300 is a heck of a lot more chip for the money than an i3 in the multithreaded world, you're talking maybe a $20-40 total power-cost difference over the lifetime of the chip, even if you run it flat out 24/7.
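Here's my arithmetic as a quick sketch anyone can re-run with their own rate and wattages (Python; the wattages and rate are the ones from this thread):

# Extra electricity cost of chip A over chip B at full load.
def power_cost_diff(watts_a, watts_b, usd_per_kwh, hours):
    kwh_diff = (watts_a - watts_b) * hours / 1000.0  # watt-hours -> kWh
    return kwh_diff * usd_per_kwh

print(power_cost_diff(95, 54, 0.08, 1000))      # FX-6300 vs i3: ~$3.28 per 1000 h
print(power_cost_diff(95, 54, 0.08, 24 * 365))  # ~$28.73 per year running flat out 24/7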

#6 Odins Fist

    Member

  • Ace Of Spades
  • 3,111 posts
  • Location: The North

Posted 16 August 2013 - 08:54 AM

Narcissistic Martyr, on 15 August 2013 - 03:55 PM, said:

the FX-6300 is a heck of a lot more chip for the money than an i3.


FX-6300 or FX-6350 versus an i3..??? What?? (facepalm) Forget about cost for a minute.


Here we go again... Price completely aside, there is "NO WAY" anyone should be comparing the i3 to AMD's newest series of FX CPUs; that is a sad and tired old argument..

Here is an "EXAMPLE" of how the matching should go if you want to compare newer to newest releases against each other in terms of competition.

#1. FX-6300 or 6350 versus i7-3820 Sandy Bridge socket 2011
#2. FX-8320 or 8350 versus i7-4770K Haswell up to the i7-3960X Extreme
#3. FX-9370 or FX-9590 versus i7-3960X Extreme and i7-3970X
#4. FX-4100 to FX-4350 versus i7-3770K and i7-4770K Haswell

It's AMD's top of the line versus Intel's top of the line; you can't compare a 1984 Chevette to a 1984 Corvette, come on now, that's not a fair comparison. Best versus best or G-T-F-O.

Cherry-picking Intel CPUs to put against the newest that AMD has to offer is just lame.
LEAVE the cost out of it, and put newer model against newer model..

In single-threaded apps it's 20% or better performance with Intel, depending on which chips you put against each other, both in core usage and in resources still left available. I have tested this, I have watched it being done, and that is part of the reason I'm not going AMD on my next build; for what I do it doesn't make sense to go AMD anymore.. Not to mention that some Intel CPUs have pretty decent integrated graphics for everyday users (not gamers).
Yeah, for some chip-versus-chip comparisons it won't matter much, but here it seems that gaming is the issue, and by that I mean gaming on an un-optimized (single-core-using) program that is also highly GPU dependent as well.

If you're going to compare, then be fair: put newer model versus newer model, and don't be fooled by the sad and tired old approach to comparing Intel's CPUs against AMD's.. I have seen this way too often, and for as long as it has been going on, you would think people would know better by now. If you're trying to use i3 versus FX as an argument on performance (never mind price), then it's just cherry-picking at its best, and possibly FAANN BOOOY-ism at its worst. I used to be a hardcore AMD fan, but no longer..
http://odinswolves.enjin.com/home

Edited by Odins Fist, 16 August 2013 - 10:17 AM.


#7 LennStar

    Member

  • Ace Of Spades
  • 476 posts

Posted 16 August 2013 - 09:15 AM

If an APU is good enough for your purposes :P - I have written this somewhere in the patch comments already. Before 12v12 I could play without problems with 2 threads running (BOINC), but not with 4; that decreased the FPS. (Before that, with a Phenom II X4, I could run 4 BOINC threads without problems and play.)

With 12v12 I switch BOINC off completely by hand, because otherwise I get random "spikes" of low FPS (how many and how bad seems to depend on the work unit). So at least in my case: if something is using the second integer core, MWO has a problem, even if the integer core it runs on is idle. From my experience the breaking point is definitely the FPU. You CAN run 3 or even 4 integer-dependent programs on 4 cores, but MWO needs at least one FPU for itself.

(That said, apart from the MWO problems I am quite satisfied with my 130 W peak PC with a 130€ CPU+APU "core". It's "good enough", to use AMD speak, and it was 200€ cheaper to buy and uses less energy than an Intel+GPU system. Energy is an issue if you pay 0.26€/kWh.)
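If someone wants to test the FPU-sharing theory themselves, here's a crude sketch (Python with psutil; the core numbers are an ASSUMPTION - it presumes the game sits on core 0 and core 1 is the second integer core of the same module, which you must verify on your own system):

import math

import psutil  # third-party: pip install psutil

# ASSUMPTION: core 1 is the module sibling of the core the game runs on.
psutil.Process().cpu_affinity([1])  # pin this load generator to core 1

x = 1.0
while True:  # stop with Ctrl+C
    x = math.sin(x) + 1.0001  # floating-point work that exercises the shared FPU

Run it once as-is and once with the sin() swapped for integer-only work; if the shared FPU really is the bottleneck, the float version should hurt the FPS noticeably more.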

Edited by LennStar, 16 August 2013 - 09:20 AM.


#8 Odins Fist

    Member

  • Ace Of Spades
  • 3,111 posts
  • Location: The North

Posted 16 August 2013 - 09:26 AM

http://www.overclock...re-new-to-boinc

That should help you a bit... :P

Edited by Odins Fist, 16 August 2013 - 09:27 AM.


#9 Narcissistic Martyr

    Member

  • Veteran Founder
  • 4,242 posts
  • Location: Louisville, KY

Posted 16 August 2013 - 10:43 AM

Odin's Fist,

My comment was not intended to show that AMD has a superior architecture; Intel wins that contest hands down. For the average consumer, total cost of ownership is a concern, so my comment was intended to show that the added electricity cost of the higher-TDP AMD chips should not receive much weight when deciding which chip to purchase for a gaming computer. (If one is buying desktops for an entire office, on the other hand, efficiency is extremely important due to the increased cooling costs from hundreds or thousands of CPUs.)

So, while I feel the FX-6300 is superior to the i3-3220, with games becoming increasingly multithreaded and LGA1155 being a dead socket (I will of course reconsider this position when the Haswell i3 chips come out, as the upgrade potential will be there again), each individual consumer will ultimately have to make their choice based on their intended usage. For example, as a relatively poor grad student, at $180 the FX-8320 was a superior choice for me over the i5-3350P, because I do Photoshop work and CAD rendering on my computer for fun, school, and profit, and it plays most games well enough (even MWO, although I like to complain). As I'll be less poor next time I build a computer, I'll probably go for an octocore Haswell-E chip (or 4) and a very powerful professional GPU for better rendering performance.

Edited by Narcissistic Martyr, 16 August 2013 - 10:51 AM.


#10 Odins Fist

    Member

  • Ace Of Spades
  • 3,111 posts
  • Location: The North

Posted 16 August 2013 - 10:50 AM

Narcissistic Martyr, on 16 August 2013 - 10:43 AM, said:

Odin's Fist,

My comment was not intended to show that AMD has a superior architecture; Intel wins that contest hands down. For the average consumer, total cost of ownership is a concern, so my comment was intended to show that the added electricity cost of the higher-TDP AMD chips should not receive much weight when deciding which chip to purchase for a gaming computer. (If one is buying desktops for an entire office, on the other hand, efficiency is extremely important due to the increased cooling costs from hundreds or thousands of CPUs.)

So, while I feel the FX-6300 is superior to the i3-3220, with games becoming increasingly multithreaded and LGA1155 being a dead socket (I will of course reconsider this position when the Haswell i3 chips come out, as the upgrade potential will be there again), each individual consumer will ultimately have to make their choice based on their intended usage. For example, as a relatively poor grad student, at $180 the FX-8320 was a superior choice for me over the i5-3350P, because I do Photoshop work and CAD rendering on my computer for fun, school, and profit, and it plays most games well enough (even MWO, although I like to complain). As I'll be less poor next time I build a computer, I'll probably go for an octocore Haswell-E chip (or 4) and a very powerful professional GPU for better rendering performance.


I completely understand the budget issue, and sympathize with you.. I hope you get exactly what you need, and that it fits in your budget.

I just get sick of people cherry picking what Intel CPU to put up against AMDs latest offering, it has been going on for a long time.
http://odinswolves.enjin.com/home

Edited by Odins Fist, 16 August 2013 - 11:16 AM.


#11 MercJ

    Member

  • Galaxy Commander III
  • 184 posts

Posted 16 August 2013 - 11:57 AM

Love how there was a request for benchmarks, and the response is "but which chip is better??" :lol: Anyway, I don't have access to an X6, just an X2 unlocked to quad (B55 @ 3.8); I would be interested to see Thuban results as well.

FWIW, I don't think people should compare FX to Core chips; there ISN'T really a fair comparison, whether you factor in price, purpose, or whatever - it's becoming obvious that they are beginning to branch in different directions. Intel focuses on x86 performance, while AMD is looking at the eventual limitations and taking a different path to improve performance. Will it pay off? Guess we'll find out eventually. APUs will have some pretty compelling advantages if HSA takes off. At what point do you stop calling a component a CPU? Just wanted to add some thoughts most people probably don't think about when directly comparing one CPU to another.

Perhaps more relevant: I really can't tell the difference between my 2500K/7970 platform and my FX-8320 @ 4.6 GHz/7950 platform when playing MWO. I notice a couple more "spikes" of dropped frames on a Phenom II B55 @ 3.8 GHz in 12v12; not sure yet what causes it. Too many variables at this point.

EDIT: Also, to the OP: do you have a motherboard that can switch off the second integer core of each module, making it 1:1 integer cores to FPUs? You may be able to answer your question that way as well - just something to try!
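And if the BIOS doesn't offer that switch, a rough software approximation is pinning the game to one core per module via CPU affinity - a sketch in Python with psutil (the even/odd pairing is an ASSUMPTION about how Windows numbers the module siblings, and "MWOClient" is just a guess at the process name; check both on your own system):

import psutil  # third-party: pip install psutil

def pin_one_core_per_module(pid):
    # ASSUMPTION: logical CPUs 0/1, 2/3, ... are siblings of the same module.
    proc = psutil.Process(pid)
    proc.cpu_affinity(list(range(0, psutil.cpu_count(), 2)))  # cores 0, 2, 4, ...
    return proc.cpu_affinity()

for p in psutil.process_iter(["name"]):  # find the game by (assumed) name
    if p.info["name"] and "MWOClient" in p.info["name"]:
        print(p.pid, pin_one_core_per_module(p.pid))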

Edited by MercJ, 16 August 2013 - 12:20 PM.


#12 awdwikisi

    Member

  • 36 posts

Posted 16 August 2013 - 01:01 PM

@ Narcissistic Martyr
In an attempt to get the highest frames possible in Eyefinity 3/6, I have kind of played this out; I won't go into 6-screen though.
I have tested in both Windows 7 AND Windows 8:
Windows 7 because it's mainstream, Windows 8 because of its more advanced core parking/memory utilization.
This game is not running in DX11, which is a MASSIVE factor in the low framerates.
The 6970/7950 run at about 45-55% in 1080p mode and 80+% in Eyefinity.
The 7970 runs 40-50% in 1080p and 75% in Eyefinity.
Processor utilization is always around 40%, with either Intel or AMD.

All CPUs were OC'd to 4.0 GHz with no Turbo Core/Core Boost/Hyper-Threading.
I have tested this with these processors/motherboards/GPUs:
1045T Thuban, FX-8120, i7-3820
990FX UD3 / 990FX Crosshair V / MSI Big Bang XPower II
(6970) (2x 7950) (3x 7970 Lightnings)
Crossfire does not work as I have tested it; it MAY work for others.
Windows 7:
8120 + 7950 = 20-35 FPS, med-high settings @ 1080p; 15-30, med @ 3456x2048
8120 + 6970 = 25-40 FPS, med-high settings @ 1080p; 20-30, med @ 3456x2048
Windows 8:
8120 + 7950 = 25-45 FPS, med-high settings @ 1080p; 15-35, med @ 3456x2048
At this point I rage quit the 8120, as OCing it to 5 GHz did nothing to help frames.
Windows 8 and 7:
1045T + 7950 (880/1250): 22-27 FPS @ very high settings @ 3456x2048
1045T + 7950 (900/1250): 24-32 FPS @ very high settings @ 3456x2048
1045T + 7950 (880/1250): 22-27 FPS @ very high settings @ 3240x1920
1045T + 7950 (900/1250): 24-32 FPS @ very high settings @ 3240x1920
1045T + 7970 (1050/1500): 27-45 FPS @ very high settings @ 3240x1920
Windows 8:
i7-3820 + 7950: 35-55 FPS @ very high settings @ 3240x1920
i7-3820 + 7970: 45-60 FPS @ very high settings @ 3240x1920
I have also tested a 3650 APU OC'd to 3.4 GHz; it barely runs 800x600 @ 20 FPS.

Findings: the Thuban is marginally better than the FX-8120 for now, and the i7/i5 are best.

#13 Narcissistic Martyr

    Member

  • Veteran Founder
  • 4,242 posts
  • Location: Louisville, KY

Posted 16 August 2013 - 04:26 PM

MercJ, on 16 August 2013 - 11:57 AM, said:

Love how there was a request for benchmarks, and the response is "but which chip is better??" :lol: Anyway, I don't have access to an X6, just an X2 unlocked to quad (B55 @ 3.8); I would be interested to see Thuban results as well.

EDIT: Also, to the OP: do you have a motherboard that can switch off the second integer core of each module, making it 1:1 integer cores to FPUs? You may be able to answer your question that way as well - just something to try!


To be honest I was expecting this, since my request was a bit odd and very specific. I have already turned off half my cores and it helped a lot with my averages, but in the end the improved clock speed I could get mattered more. This is why I wanted CPU utilization logs: to see if MWO just can't deal with the two-cores-per-module architecture.

Fightin the 3rd, on 16 August 2013 - 01:01 PM, said:

Windows 7:
8120 + 7950 = 20-35 FPS, med-high settings @ 1080p; 15-30, med @ 3456x2048
8120 + 6970 = 25-40 FPS, med-high settings @ 1080p; 20-30, med @ 3456x2048
Windows 8:
8120 + 7950 = 25-45 FPS, med-high settings @ 1080p; 15-35, med @ 3456x2048
At this point I rage quit the 8120, as OCing it to 5 GHz did nothing to help frames.
Windows 8 and 7:
1045T + 7950 (880/1250): 22-27 FPS @ very high settings @ 3456x2048
1045T + 7950 (900/1250): 24-32 FPS @ very high settings @ 3456x2048
1045T + 7950 (880/1250): 22-27 FPS @ very high settings @ 3240x1920
1045T + 7950 (900/1250): 24-32 FPS @ very high settings @ 3240x1920
1045T + 7970 (1050/1500): 27-45 FPS @ very high settings @ 3240x1920

Findings: the Thuban is marginally better than the FX-8120 for now, and the i7/i5 are best.


Thanks for the data, it was helpful. I don't suppose you also took core utilization data in your logs? No need to run extra tests if you didn't, but if you did, feel free to upload the raw data somewhere so I can compare it to my data.

#14 Odins Fist

    Member

  • Ace Of Spades
  • 3,111 posts
  • Location: The North

Posted 16 August 2013 - 05:08 PM

MercJ, on 16 August 2013 - 11:57 AM, said:

Love how there was a request for benchmarks, and the response is "but which chip is better??"


Exactly... The better chip has better performance... Question answered.. End of Story :lol:

#15 Havok1978

    Member

  • Fury
  • 371 posts
  • Location: Texaz!!

Posted 16 August 2013 - 05:45 PM

What a load of crap; I have an FX-6300 and it screams like a banshee....
The entire idea of using AMD is utilizing its overclockability. Note that mine is a Vishera core, not a Zambezi.

Also, stating that a program uses only this and that, thus the game runs no faster, etc., is misinformation. That line of single-minded thinking would be accurate for an arcade machine, but a PC is NOT an arcade machine.

In any PC you always have background tasks etc. running, and those things continue to run during gaming... so the other cores can run them, thereby freeing up resources and allowing your CPU to dedicate resources to the primary task.

I will look into obtaining benchmark scores for my rig, but currently I run 60 FPS on every game I play with no dip in framerate, except on Skyrim it drops to 46 or so due to the mods I run. Also of note, my system runs at around 46 Celsius under full load.

Also, it was said earlier, but an FX-6300 is NOT $170... it's $110.

Edited by Havok1978, 16 August 2013 - 05:56 PM.


#16 Havok1978

    Member

  • Fury
  • 371 posts
  • Location: Texaz!!

Posted 16 August 2013 - 07:07 PM

OK, so I just ran Sandra benchmarks, and my rig butchered an i7 gaming rig... You can do better than my rig of course, but my rig only cost $750, soooo... ya. So anywho, I will gladly post the results... as soon as I get a handle on how to get them on here.

#17 Narcissistic Martyr

    Member

  • Veteran Founder
  • 4,242 posts
  • Location: Louisville, KY

Posted 16 August 2013 - 08:01 PM

Havok1978, on 16 August 2013 - 07:07 PM, said:

OK, so I just ran Sandra benchmarks, and my rig butchered an i7 gaming rig... You can do better than my rig of course, but my rig only cost $750, soooo... ya. So anywho, I will gladly post the results... as soon as I get a handle on how to get them on here.


MWO benchies instead, please? Preferably with CPU utilization, both total % and per-core %, GPU % utilization, and FPS. My current understanding is that the FX-6300 only uses 3 of its 6 cores in MWO.

Also, if you have a Dropbox you can upload your data logs there. If you don't, Dropbox is free, so get one.

Also, the $170 is in NZ$

*Edit* Please stop threadjacking me. Please? Pretty pretty pretty please with bourbon on top?

Edited by Narcissistic Martyr, 16 August 2013 - 08:03 PM.


#18 awdwikisi

    Member

  • 36 posts

Posted 16 August 2013 - 09:27 PM

I do not have any screenshots for Bulldozer; I can get some later for the Thuban/i7.

"I want to test a theory that I have that mwo doesn't currently work well with fx CPUs due to only being able to effectively use one core of each module."-narcissistic martyr

I gave up on having anything to do with Bulldozer after 20+ hours of talking and dozens of emails to various Windows 7/8 devs, only to get the exact same answer: "un-optimized".
That is the answer: if Windows 7/8 does not know how to correctly assign cores and half-cores and utilize them, how can PGI effectively take advantage of them?

I refer to Bulldozer as having cores and half-cores; that is the terminology I use.
I have attempted to set cores manually for Bulldozer, which resulted in two experiences:
1. When set to full cores: stable frame rates (not higher).
2. When set to half-cores: the computer would crash.
I will not claim to know why this happened.
I did NOT test turning off the Bulldozer half-cores in the BIOS; I was too frustrated at that point.

I can tell you there was NO usable difference between 4 GHz all the way to 5 GHz in terms of frame rate improvement for MWO. What gave a small improvement was changing to Windows 8, then later on overclocking the GPU a bit.

This is in 1080p, 2560x1600, 3240x1920, 3600x1920, and 5760x2160.
The only thing affected by larger resolutions is the GPU, whose utilization went up:
At 1x 1080p the 6970/7950 peaked at 55% and the 7970 at 50% utilization.
The 6970/7950 hit 100% utilization at 3240x1920.
The 7970 hit 100% utilization at 5760x2160.
I say "peaked" because as GPU utilization went up, CPU utilization went down or stayed level.
However, on certain maps I see more CPU utilization vs GPU, but not more than what is stated below.

During loading:
the Thuban uses 4 cores / the i7-3820 uses 3 cores /
Bulldozer uses 2 cores and 2 half-cores, and has also used 3 cores and 3 half-cores.
^ I caveat this because Windows 7 sees no difference between Bulldozer cores, while Windows 8 does.
However, it's a crapshoot: sometimes it's 4 cores and no half-cores, other times 2 cores/2 half-cores. It has NEVER loaded on all half-cores unless I forced it to.

During gameplay:
the Thuban uses 2 cores / the i7-3820 uses 2 cores /
Bulldozer uses 2 cores and 1 half-core (ditto the caveat).

Something I did not note in my previous post:
i7-3820 + 7950 (7970): 55-60 FPS @ very high settings @ 1080p
i7-3820 + 7970: 45-60 FPS @ very high settings @ 3240x1920 and 3600x1920
i7-3820 + 7970: 25-45 FPS @ very high settings @ 5760x2160
8120 + 7970: 15-25 FPS @ very high settings @ 5760x2160
1045T + 7970: 10-20 FPS @ very high settings @ 5760x2160


Thoughts:
It's not a finished game; it has no DX11 or the other features found in Crysis 3, and Windows 7 does NOT understand how to efficiently utilize the unique "Zambezi"- and "Vishera"-based CPUs. Wait and see what Windows 8.1 improves.
I am neither for nor against AMD or Intel, and I'd add that AMD has a better idea for CPUs, but the software isn't there yet.

#19 Narcissistic Martyr

    Member

  • Veteran Founder
  • 4,242 posts
  • Location: Louisville, KY

Posted 16 August 2013 - 11:08 PM

Fightin the 3rd, on 16 August 2013 - 09:27 PM, said:

Thoughts:
It's not a finished game; it has no DX11 or the other features found in Crysis 3, and Windows 7 does NOT understand how to efficiently utilize the unique "Zambezi"- and "Vishera"-based CPUs. Wait and see what Windows 8.1 improves.
I am neither for nor against AMD or Intel, and I'd add that AMD has a better idea for CPUs, but the software isn't there yet.


Thank you for confirming my hypothesis. Still, it's weird that MWO only used 2 of 6 cores during play on Thuban chips when it'll use 4 cores on my FX-8320 (in quad-core mode).
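For anyone else eyeballing logs, here's the quick-and-dirty way I'd count "active" cores from a per-core CSV like the one I sketched in my first post (the 50% threshold is arbitrary; adjust to taste):

import csv

def active_cores(path, threshold=50.0):
    # Expects the column layout from the logging sketch in post #1.
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    core_cols = [c for c in rows[0] if c.startswith("core")]
    return [c for c in core_cols
            if sum(float(r[c]) for r in rows) / len(rows) >= threshold]

print(active_cores("mwo_cpu_log.csv"))  # e.g. ['core0_pct', 'core2_pct']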

#20 Odins Fist

    Member

  • Ace Of Spades
  • 3,111 posts
  • Location: The North

Posted 16 August 2013 - 11:12 PM

Havok1978, on 16 August 2013 - 05:45 PM, said:

What a load of crap; I have an FX-6300 and it screams like a banshee....
The entire idea of using AMD is utilizing its overclockability.
..........................................................................................................
OK, so I just ran Sandra benchmarks, and my rig butchered an i7 gaming rig... You can do better than my rig of course, but my rig only cost $750, soooo...



Intel Core i7-3770K advantages over FX-6300




  • In single-threaded programs, the Intel Core i7-3770K CPU has 67% better performance.
  • In multi-threaded tasks, the Intel i7-3770K is 56% faster.
  • Memory performance of the Intel Core i7-3770K is better.
  • In graphics applications, the Intel Core i7-3770K is 4% faster.
  • The Intel i7-3770K features an integrated HD 4000 GPU. While this graphics processing unit is not fast enough to run the latest games at full resolution, it can be used for casual gaming and 3D apps.
  • Power consumption of the i7-3770K is lower.
[CPU Mark chart: relative ratings of the top 10 common CPUs; see the PassMark link below]
^ I debate some of those numbers, I think they are a little high myself.




http://www.cpubenchm...X-6300+Six-Core

As of 17 August 2013 - higher results represent better performance.


AMD FX-6300 Six-Core @ 3.5 GHz: 6,383 (PassMark Software © 2008-2013)




"butchered an i7"..???? What..??? Doubt it highly...

#1. Not all games are multi-thread optimized, and calling it an upgrade to go to an FX-6300 from a Phenom II 1100T clocked at 4.27-4.3 GHz on water (cool as ice) is pretty funny. It's not worth upgrading to that chip from the 1100T Thuban, sorry.
Clock for clock in single-threaded apps my Phenom II 965 edged out the FX-6300 slightly, and my Phenom II 960T (with a better memory controller than the Denebs) beat it clock for clock in single-threaded apps. You want to talk about overclockability?? Let's go there for a second: to reach what my 960T was doing in single-threaded apps, the FX-6300 had to be overclocked 400 MHz higher than my 960T, and since I had the 960T (stock 3.0 GHz) at 4.2 GHz 24/7 stable, that means the FX-6300 (3.5 GHz stock) "HAS" to be clocked to at least 4.6 GHz to reach the same single-threaded performance. How is that an upside??? Really..?? Once you start overclocking the FX series chips they start gobbling power. "FACT".
The FX series was supposed to be a huge improvement, but it isn't; they "DON'T" stomp Intel i7 CPUs, which should be their direct competition, sorry... Cherry-pick the oldest Intel i7 and you might have a leg up, but put the socket 2011 i7-3820, or the socket 1150 i7-4770K Haswell with a GTX 680 4 GB, up against it and see how it does; then the story changes, huh..
Price isn't the issue here, not even an issue... If you want to stack Intel against AMD, don't cherry-pick old models; put the best of the newest from each company against each other, then we can talk..

#2. "By the time the FX series has a huge impact on multi-threaded games" is not an argument; that time isn't here, and by the time it might come, they won't be as fast as Intel, and I doubt AMD is going to be in the high-end CPU business when it does...
"OH WAIT," AMD already said they won't be in the high-end CPU market in the future (THEIR WORDS), not mine.... If you have an older bottom-of-the-barrel AMD Phenom II chip, then upgrading to an FX series CPU might be a good upgrade, maybe.

#3. Are AMD CPUs worth buying for gaming..?? "YES," especially if you are on a budget (you still need a good video card).
Are AMD's the best CPUs out there..?? "NO," so put on the brakes...

#4. Cherry-picking which i7 to put up against the FX-6300 is fail...

Narcissistic Martyr, on 16 August 2013 - 11:08 PM, said:


Thank you for confirming my hypothesis. Still, it's weird that MWO only used 2 of 6 cores during play on Thuban chips when it'll use 4 cores on my FX-8320 (in quad-core mode).


You do realize what you're seeing in core usage, right..??
Piledriver is still based on the same basic design as Bulldozer, with the "8-core" chip containing four Piledriver modules, each of which contains a pair of integer cores... While AMD markets these as individual CPU cores, each module's pair of integer cores shares a number of resources, including the fetch and decode units, the floating-point unit (FPU) and its scheduler, and 2 MB of L2 cache.... This is part of AMD's design philosophy of focusing on multi-threaded performance, with each module able to process two threads simultaneously. This comes at the cost of single-threaded performance, and with the downside that relatively few applications are able to make use of four cores in multi-threaded workloads, let alone eight!
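In other words, a toy sketch of that pairing (ASSUMING the OS numbers the paired integer cores adjacently, 0/1, 2/3, and so on - an assumption worth verifying, not gospel):

# Toy helpers for the module layout described above.
# ASSUMPTION: sibling integer cores are enumerated adjacently (0/1, 2/3, ...).
def module_of(core):
    return core // 2   # cores 0-1 -> module 0, cores 2-3 -> module 1, ...

def sibling_of(core):
    return core ^ 1    # flip the low bit: 0 <-> 1, 2 <-> 3, ...

assert module_of(5) == 2 and sibling_of(4) == 5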

The debate on whether or not it is really an 8-core has been raging for a while; maybe that's what you're seeing in core usage..??

Edited by Odins Fist, 18 August 2013 - 06:26 PM.





