
[GUIDE] Hardware Mythbusters - An In-Depth Hardware Guide




#81 Henchman 24

    Member

  • The Determined
  • 529 posts
  • Location: Rhode Island

Posted 24 May 2012 - 12:45 PM

Iron Harlequin, on 17 May 2012 - 11:28 AM, said:

gaming on a laptop? that's laughable.

yes, I know that sounds elitist; it was meant to.


Perhaps you missed these guys for the last 8 years or so...
http://www.sagernotebook.com/

not so laughable

#82 cipher

    Member

  • 660 posts
  • Location: State College, PA

Posted 24 May 2012 - 12:49 PM

Indeed. Gaming on a laptop is fine, provided one has the right laptop. ;)

#83 TheRulesLawyer

    Member

  • 1,415 posts
  • Location: Chicagoland

Posted 24 May 2012 - 01:30 PM

Vulpesveritas, on 24 May 2012 - 11:55 AM, said:

@TheRulesLawyer

The thing is that in a relatively cut-and-clean system AMD doesn't have as much of an advantage; with multitasking, bloatware, and an antivirus, the extra cores make real-world usage much smoother. Seeing as I've been able to push my Phenom II X4 to full usage with Google Chrome + McAfee + NetTalk + VLC media player, I would say things are fairly balanced in a PC for a normal consumer. Then there's the FX-4170, which is 75% as fast as an i5-2500K while the i5 costs 50% more. Not that bad of a deal if you ask me, at least for gaming, as that's nearly a $100 difference that can instead go to a better graphics card in most situations.



Sure, which is part of why I'd tell everyone to stick to quad-core or better systems these days; there's much less chance of frame-rate glitches with extra cores. However, most games barely use two cores effectively, let alone four or more. If we weren't talking about games, I'd say AMD has the advantage in performance per dollar for general use. As for stepping down to an FX-4170... I can't say whether that's a great choice or not. The GPU is clearly more important than the CPU in games, but some games do hammer the CPU pretty hard as well. Oddly, World of Tanks is a pretty CPU-heavy game in my experience: my old E8500 got horrible lag, but moving to my new i7-2600K, it was perfect with the same GPU. I suspect it heavily depends on the game.

The one thing I'll say about CPUs is that they tend to be much more expensive to change out; a lot of the time it means a new motherboard and RAM. I tend to buy a bit better CPU than I think I need and have it last through 2-3 GPU changes. Plus it's a whole lot easier to upgrade without having to do a nuke-and-pave with a motherboard change. I've rarely found a CPU upgrade on the same socket worthwhile unless you bought really low-end to begin with. I suppose that is more an upgrade-strategy issue than a real value issue. Still, as long as we're talking about a gaming rig, I don't think AMD is quite the clear-cut winner you make it out to be.

#84 Vulpesveritas

    Member

  • 3,003 posts
  • Location: Wisconsin, USA

Posted 24 May 2012 - 10:08 PM

TheRulesLawyer, on 24 May 2012 - 01:30 PM, said:


Sure, which is part of why I'd tell everyone to stick to quad-core or better systems these days. […] Still, as long as we're talking about a gaming rig, I don't think AMD is quite the clear-cut winner you make it out to be.

The game does matter quite a bit, as do the instruction sets being used. AMD is an absolute winner in my mind in a sub-$1000 gaming computer, and is competitive up to about $1250-1500, at which point you've probably maxed out your GPU; past that, yes, Intel is the better choice for most people. The simple rule is that, generally, the more physics involved, the more CPU-heavy a game is going to be. Until, of course, DirectX 11 is used, at which point the GPU takes over physics calculations if the developer takes advantage of it. You can see this in CryENGINE 3: Crysis 2 is very CPU-heavy in DirectX 9 mode, whereas in DirectX 11 it is almost completely GPU-bound.
Another point in AMD's favor, following what you just said about upgrade costs, is that AMD supports its motherboards for longer. Look at how long AM3 has lasted, with support for AM2+ CPUs (except for the six-core processors in most cases, though Thuban came later, of course). Similarly, high-end AM3 chipset boards support Bulldozer AM3+ chips with a BIOS update. That gives AMD an edge on two- to four-year upgrade cycles, unlike Intel, which looks to be on a continuous two-year motherboard change cycle.

#85 Catamount

    Member

  • LIEUTENANT, JUNIOR GRADE
  • 3,305 posts
  • Location: Boone, NC

Posted 25 May 2012 - 01:46 PM

I wouldn't label Bulldozer an unequivocal winner or loser for gaming at this point; I think I'd instead simply call it the more interesting choice. For general CPU-intensive computing it is an unequivocal winner, being generally faster than its Sandy Bridge contemporaries at any given price point by virtue of being a cheap octa-core CPU (the same way Thuban was always very fast for its price in such applications).

Generally speaking, the FX-8150 almost always beats the modestly more expensive Core i5-2500K, and is often hot on the heels of the vastly more expensive i7-2600K (http://www.guru3d.co...cessor-review/1). Ivy Bridge, being only a few percent faster than Sandy Bridge, didn't do much to change that.


Games are not really multithreaded, precisely because they're generally not very intensive (a couple of exceptions aside). Even then, note the types of gaming situations where faster CPUs make any kind of difference at these price points: irrelevant ones. Advantages crop up for faster CPUs when playing games at absurdly low resolutions and settings, like that Anandtech article which got such an advantage in SC2 by playing it at 1024x768 at middling settings (really? 1024x768? Is someone with a new system really going to play at that resolution?), or in older DX9 titles, usually in cases where the difference is a meaningless distinction between one extraneously high framerate and another (let's face it, no one cares whether they get 160 fps or 130 fps).


So you end up with a situation where Bulldozer really is generally the superior CPU at any price point it hits, and in the one exceptional case, namely gaming, it isn't tangibly inferior most of the time.


What makes it interesting is that if games do start to become more CPU-intensive, they'll likely do it by taking advantage of more cores rather than by anything else. How do I know? Simple: that's what basically every other sector of software has done. So if games ever do become CPU-intensive, that move will favor Bulldozer/Piledriver more than SB/IB.


Also keep in mind that Windows 8 gives a performance advantage to Bulldozer and its successors, so those chips are actually somewhat faster than they currently appear (which means they're even further ahead of equivalently priced Intel chips, and even closer to the more expensive ones); Windows just hasn't been able to utilize them properly until now.

As Vulpes correctly points out, you generally also get a better upgrade path with AMD anyways. Funny how the socket situation completely reversed after LGA775 and 939...

Edited by Catamount, 25 May 2012 - 01:47 PM.


#86 Ironheart

    Member

  • 35 posts

Posted 26 May 2012 - 10:28 AM

Considering its rather poor efficiency, Bulldozer does not rank as high in price/performance as you make it look. Depending on your usage, where you live, etc., you are looking at ~20 bucks a year extra, at least.

Also those comparison charts are kinda laughable.

#87 Vulpesveritas

    Member

  • 3,003 posts
  • Location: Wisconsin, USA

Posted 26 May 2012 - 10:35 AM

Ironheart, on 26 May 2012 - 10:28 AM, said:

Considering its rather poor efficiency, Bulldozer does not rank as high in price/performance as you make it look. Depending on your usage, where you live, etc., you are looking at ~20 bucks a year extra, at least.

Also those comparison charts are kinda laughable.

Really... so the difference in power consumption between an i5 and a 125 W FX CPU is about 30 watts under load, correct? The most expensive power in the USA is 15 cents per kilowatt-hour, and a 30 W difference takes about 33 hours of load to add up to one extra kilowatt-hour, i.e. 15 cents more for every 33 or so hours of load. Supposing your PC is loaded six hours a day, that's roughly 80 cents more a month, or about $10 a year. Even if it's loaded 24/7, that's about $40 a year more on your electricity bill for the AMD CPU (this is, of course, without overclocking; with overclocking the typical-usage figure does get closer to that $20 mark, though none of the values stated previously were based on overclocking). Gotta love people who do their research.
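To make that back-of-the-envelope arithmetic easy to check, here is a minimal Python sketch, assuming the same 30 W load delta and 15 cents/kWh used above:

    # Extra yearly electricity cost of a CPU drawing `delta_w` more watts.
    def extra_cost_per_year(delta_w, cents_per_kwh, hours_per_day):
        kwh_per_year = delta_w / 1000 * hours_per_day * 365
        return kwh_per_year * cents_per_kwh / 100  # in dollars

    print(extra_cost_per_year(30, 15, 6))   # ~9.9  -> about $10/year at 6 h/day
    print(extra_cost_per_year(30, 15, 24))  # ~39.4 -> about $40/year at 24/7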
[Chart: average price of electricity in the USA, ca. 2003, in cents per kWh (http://www.eia.gov/e....cfm?t=epmt_5_3); the current US average is 11 cents/kWh.]
And you consider them laughable why exactly, if I may ask?

Edited by Vulpesveritas, 02 March 2013 - 12:39 PM.


#88 Ironheart

    Member

  • 35 posts

Posted 26 May 2012 - 02:26 PM

That's a pretty limited perspective. Not everyone is lucky enough to have US energy prices. Gotta love people who think the US is the center of the world. Also gotta love people who can read: I specifically said it depends on usage and where you live. Of course, if you let it idle beneath your desk you are not going to see anything on your bill.

I'm pretty sure you are using the TDP for your numbers, which has (almost) nothing to do with the real power consumption of the CPU; it's a figure for planning your cooling system.
But since you prefer numbers, here are some.

Right now in my region of Europe we are paying 26 cents/kWh. An AMD FX-4100 (95 W TDP) draws about 77 W under load, while an i3-2120 draws about 36 W. That's a difference of roughly 41 W, or a bit over 1 cent per hour, which comes to roughly 90€ a year at 24/7 load.
Of course, since almost no one does that, divide it by four: a bit more than 20€.
Though I'm a special case: with all the simulation and calculation I do, I pretty much leave mine under load 24/7. And that's just one PC.
But still, this is mostly a concern for gamers; for office work you could get away with almost any CPU (like my old Athlon 64 3000+).
It's still not as negligible as you make it out to be, except perhaps if you live in the US.
Especially the FX-8120 @ 125 W TDP, which draws almost 80 W more than the i5-2500K; the FX-8120 @ 95 W TDP sits in between at about 40 W more.
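Plugging my numbers into the same sort of quick calculation (≈41 W delta at 26 cents/kWh):

    # 41 W extra at 26 cents/kWh
    print(0.041 * 24 * 365 * 26 / 100)  # ~93 -> roughly 90€/year at 24/7 load
    print(0.041 * 6 * 365 * 26 / 100)   # ~23 -> a bit more than 20€ at 6 h/day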

Also, a table with basically two benchmarks is not a comparison. You could have posted anything from your various hours of research that holds more information than this. I might have missed a link within the page that displays more benchmarks; that's entirely possible, though even after a second look I couldn't find any. Maybe "laughable" was a bit rude, but it's at least not representative.
Since I'm already on it I guess I can continue.

If we look at more benchmarks, the picture pretty much evens out. I hope you don't mind if I don't post every benchmark, just the summary.

Across all applications (audio, video, photo, synthetic benchmarks, rendering, encryption):
  • 2500K - 153%
  • FX 8150 - 137%
  • 2300 - 131%
  • FX 6100 - 112%
  • 2120 - 110%
  • FX 4100 - 100%
Based on the benchmarks by ht4u.net, both multi- and single-threaded.
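For what it's worth, an index like that is normally built by normalizing every chip's scores to a baseline and averaging. A minimal sketch of the idea; the score values below are made-up placeholders, not ht4u.net's data:

    # Normalize each CPU's benchmark scores to a baseline chip, then average.
    # Scores are illustrative placeholders, NOT ht4u.net's actual numbers.
    scores = {
        "FX 4100": [100, 100, 100],
        "2120":    [118, 96, 116],
        "FX 8150": [160, 110, 141],
    }
    baseline = scores["FX 4100"]
    for cpu, vals in scores.items():
        rel = [v / b for v, b in zip(vals, baseline)]
        print(cpu, round(100 * sum(rel) / len(rel)))  # arithmetic mean of ratios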




For Gaming:

According to Tom's Hardware at 1920x1080:

[Tom's Hardware gaming benchmark chart]


Prices for the US according to Newegg: if you don't plan on overclocking you have many more options, e.g. the i5-3550 for $209, which is 5% faster on average in gaming and consumes ~16-17 W less than the i5-2500K, or the i5-2400, which is marginally slower than the 2500K.


Now, if I had to recommend an AMD, I wouldn't know to whom.
Certainly not to gamers or anyone who does simple office work.
And even then, no one knows what Intel will bring to the table by then.
For everything science-related you are going to use GPU computing anyway.
If you have a last-generation CPU it doesn't matter either way; the performance increase does not warrant a new CPU.
Sure, maybe AMD will support Bulldozer for another generation after Piledriver, but they have said the desktop market is not their main focus anymore and that they will shift their know-how to the mobile market. And by the time current PCs are outdated, it will be at least another 3-4 years.


In Europe an Ivy Bridge CPU is actually even cheaper than a Sandy Bridge, at least where I live.



PS: I didn't mean to insult anyone in the process of writing this.


Edit: Also, your top line demanding 4 credible sources as proof is kinda laughable, considering you yourself usually give ... none.



Edited by Ironheart, 26 May 2012 - 02:30 PM.


#89 Thorqemada

    Member

  • 6,389 posts

Posted 26 May 2012 - 02:57 PM

On price/performance, AMD is still top.
On upgradeability, AMD is still top.

So everybody looking for a less expensive, good-performing system has a good choice with AMD.

The second graph is price/performance:
http://www.computerb...-ivy-bridge/25/

#90 Vulpesveritas

    Member

  • 3,003 posts
  • Location: Wisconsin, USA

Posted 26 May 2012 - 03:06 PM

I used the USA for two reasons, mainly: first, because I live there, and second, because 40-50% of the people on this forum live in North America, with the other 50-60% scattered across the globe.

One thing I must say, however: while yes, Intel is more energy-efficient, most people don't have their PC under load all day. Also, let's look at the FX-4170, shall we? By your own benchmarks, with its roughly 17% higher clocks (4.2 GHz versus the FX-4100's 3.6 GHz), it should land between the i3 and the i5s.
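As a naive sanity check on that estimate, assuming performance scales linearly with clock speed (optimistic, but fine for a rough bound):

    # Rough clock-scaling estimate: FX-4170 vs FX-4100 (same architecture).
    fx4100_index = 100                    # baseline from the list above
    fx4100_ghz, fx4170_ghz = 3.6, 4.2
    print(fx4100_index * fx4170_ghz / fx4100_ghz)
    # ~117, i.e. between the i3-2120 (110%) and the i5-2300 (131%) in that index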

Also, unfortunately, the world doesn't revolve around Europe either, and for the most part Ivy Bridge is more expensive. Largely, however, the price/performance ratio still swings in AMD's favor, especially if you factor in the longer-lasting motherboard for CPU upgrades. (Sorry for the earlier grammatical, spelling, and context errors; that post was originally typed on my phone.)

As far as why I don't list sources with the majority of posts I make, and why I included that requirement in the first place, the reasons are as follows:
1. I've been building PCs for a good number of years. My grandfather had me put together electronics all the time, including disassembling and reassembling our first PC when I was 6; by 9 I was collecting "junk" PCs from people in the neighborhood and seeing what worked with what, and by 14 I was building PCs for friends and doing repairs for people in the neighborhood. I'm not exactly new to this, and I largely taught myself hands-on.
2. I read benchmarks and reviews an average of 1-4 hours a day. I know modern hardware fairly well, despite my current financial situation keeping me from testing things firsthand as much as I would like.
3. I understand real-world values. Let's face it: benchmarks are usually run on a "clean" system with nothing but the operating system, which hides one of the biggest reasons to have more processor cores, namely multitasking.
4. I understand I can be wrong; however, given my confidence in my own knowledge, reasoning, and experience, I would like to see enough reviews to raise a level of doubt about my assertion on a topic before I retract it.

Edited by Vulpesveritas, 26 May 2012 - 10:59 PM.


#91 chaz706

    Member

  • 263 posts
  • Location: Somewhere in Utah

Posted 26 May 2012 - 10:36 PM

As a person speaking from 12 years of experience building computers, I have to agree with everything you said, Vulpesveritas (which I think is a truly fitting name, BTW).

When it comes to building a custom rig, though, I just tend to go with what I know works well. Ultimately you have to work with what suits you, and for some people even practical solutions are expensive, especially for laptop gamers. There are fanboys on both sides of the aisle who like to tout certain stats to prove a point, but in the end your system is your system.

#92 Stahlseele

    Member

  • 775 posts
  • Location: Hamburg, Germany

Posted 27 May 2012 - 03:16 AM

Tell me about it... my 15.6" gaming laptop cost me 2200€ (about $2754).
If I had been willing to go for a 17.3" or 18.6", I could have had the same hardware for about half that price.

#93 PerryRaptor

    Member

  • 80 posts
  • Location: New Mexico

Posted 27 May 2012 - 03:46 AM

I've read all five pages of posts... only a few are relevant to the very first one.

I'd like to read more about SSD + hard drive configurations versus 2- or 3-drive RAID setups. What are the performance and reliability myths with regard to gaming?

I'd also like to read about a PCI-E 3.0 video card plugged into a PCI-E 2.0 or 2.1 motherboard slot. Will it work? Will there be gaming performance drawbacks?

#94 Stahlseele

    Member

  • 775 posts
  • Location: Hamburg, Germany

Posted 27 May 2012 - 03:49 AM

An SSD is good for silence and for saving energy in portable devices, and it's more shock-resistant than any HDD.
In a desktop it mainly helps Windows boot faster, and maybe with program start times; for in-game performance there is no meaningful difference, seeing how most of what a game does happens in RAM anyway.

A PCI-E 3.0 card in a PCI-E 2.0 slot should work, and no, there's not much of a performance difference either.
Wanna know something funny?
PCI-E 2.0 x16 versus x8 is a difference of less than 5%.
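For context on why that gap is so small, here's a little sketch of the theoretical one-way bandwidth per slot, using the standard per-lane figures after encoding overhead (roughly 250 MB/s for PCIe 1.x, 500 MB/s for 2.0, 985 MB/s for 3.0); games rarely come close to saturating even the x8 numbers:

    # Theoretical one-way PCIe slot bandwidth: per-lane MB/s times lane count.
    per_lane = {"1.x": 250, "2.0": 500, "3.0": 985}  # MB/s after encoding overhead
    for gen, mbps in per_lane.items():
        for lanes in (8, 16):
            print(f"PCIe {gen} x{lanes}: {mbps * lanes / 1000:.1f} GB/s")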

#95 Cochise

    Member

  • The 1 Percent
  • 642 posts
  • Location: Austin, Texas

Posted 27 May 2012 - 06:42 AM

Nice stuff Vulp,

As an AMD fanboy, I hate to say it, but the Intel i5-2500K is probably the best bang for the buck in gaming right now and bests the new AMD FX-8150 in most areas. They are also priced almost exactly the same.

I say "right now" because you will probably get more life out of a 990FX motherboard for AMD and who knows what windows 8 will bring to the FX-8150 but throwing all the "what ifs" out the window, it would probably be the i5. I would say that the AMD is more "future proof" than the i5 if you do more than just gaming. If not, then no.

Yet another FX-8150 comparison review:
http://www.bjorn3d.c...25&pageID=11066

It will be interesting to see what Piledriver brings, because I think with this iteration AMD was just laying out the architecture for what the future "will be"; with the FX-8150 it has not quite arrived yet.

Because I am a fanboy, I will probably get the AMD, knowing I will be able to use the next-generation CPU in my motherboard and not have to reinvest in all that again.

Food for thought.

Edited by Cochise, 27 May 2012 - 06:43 AM.


#96 Catamount

    Member

  • LIEUTENANT, JUNIOR GRADE
  • 3,305 posts
  • Location: Boone, NC

Posted 27 May 2012 - 06:48 AM

Ironheart, your power estimates go far beyond absurd, easily to the point of being, to use your favorite word, laughable.

Not only would one have to run their computer 24/7 to see those numbers; they'd have to run it at full load 24/7. Basically, our hypothetical person would have to run Prime95 every second of every day, every week of every month, for a year.

To use bend-over-backward liberal estimates that are at least slightly less "laughable", let's take your most extreme 80 W difference and assume full load for five hours a day. That is really bending over backward for any gamer, since games will only utilize half of an octa-core chip at most (so I guess we can assume they're doing something other than gaming?), and since that's quite a lot of play time to average for anyone much older than 12 (who has a job or college to think about).

Five hours a day is 1825 hours a year: 1825 h × 3600 s/h × 80 J/s = 525,600,000 J ≈ 525.6 MJ.

Congrats: you've used a bit over 500 additional megajoules. One kilowatt-hour is 3.6 MJ, so at 26 cents each you're looking at a hair under $38/year, or basically the cost of a decent cup of coffee once a month.
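The same figure, as a quick check (same assumptions: 80 W delta, 5 h/day, 26 cents/kWh):

    # Verify the joule arithmetic above.
    joules = 1825 * 3600 * 80   # 525,600,000 J over the year
    kwh = joules / 3.6e6        # 1 kWh = 3.6 MJ -> 146 kWh
    print(kwh * 0.26)           # ~37.96 -> a hair under $38/year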

For a gamer actually running only 2-4 cores, and even then only modestly in a modern DX11 title, the real figure is likely to be vastly less.


As for the "laughable" sites presented on performance, I'm afraid yours is by far the least substantial. There is basically no information given on methodology by you, nor did you actually cite a verifiable source (just a general website, in which these alleged tests could be anywhere), getting results that don't agree with those of the bigger, more reputable review sites.

We can't tell anything from the numbers the site you're "citing" gets, because we have no information on how each chip performed in any specific test, on the margin by which any given chip led any other in any particular test (which in turn would tell us how much properly threaded software that batch of tests used; nearly a full quarter of them are file compression tests, for some reason), or even on what each program was doing.


Going back to sources where we can actually see exactly what happened:

The Guru3D review I linked focused on the i7-2600K, so the i5-2500K was only included as additional backdrop in most of the tests; yet where both were included, the 8150 won 9 of those tests, effectively tied in 2 (within about 3% at most), and lost 3. I'm too tired to average it all out, but looking at the margins by which the 8150 tends to win against the 2500K, the result is a foregone conclusion.

Tom's Hardware got mixed results, but still concluded that Bulldozer is handily faster than the 2500K in heavily threaded software (as we've been saying all along), which CPU-intensive software generally and increasingly is.

Anandtech gets results like Guru3D's. The FX-8150 beats the 2500K in almost every program, except when they artificially cripple Cinebench to use a single core (artificially, since Cinebench is absolutely multithreaded). Note how that particular detail is exactly the kind of thing missing from the alleged "tests" at your "source".


The only sector of software in which the 2500K shows any consistent edge is gaming, and since most games aren't CPU-intensive, that edge is generally limited to tests that artificially bottleneck the game on the CPU with absurdly low resolutions, to older DX9 titles, and/or to cases where the CPU is the bottleneck only because the FPS is already extraneously high.

If games generally do become CPU-intensive, there's no reason to think they won't follow the pattern of all other software and take advantage of CPU speed primarily by using more cores, which would favor Bulldozer immensely. Yet, despite its importance, your post fails to even mention either the increasing role of multithreading in software or the fact that, again, gaming performance rarely varies except in meaningless cases.


Of course, I'm not saying Bulldozer is unequivocally the better chip to buy. There are many cases where I would recommend Intel chips, and I have an Intel chip in one of my two gaming machines (a Phenom II in the other). The right decision is contingent on your particular software focus and your assumptions about the future. But unlike you, I haven't tried as hard as possible to produce the most biased analysis (if you can call it that) possible: leaving out every fact that doesn't further the point (you didn't even ADDRESS upgradeability), bending every assumption over backward to favor Intel, and avoiding citations with links we can actually follow to scrutinize methodology. And I certainly wouldn't pretend, as you do, that there is no case where a Bulldozer chip makes sense, given the broad selection of software in which it handily outperforms its Intel contemporaries.

Maybe that wouldn't irk me so much had you not also made it a point to call far more substantial posts and follow-ups by others "laughable".

Cochise, on 27 May 2012 - 06:42 AM, said:

Nice stuff Vulp,

As an AMD fanboy, I hate to say it, but the Intel i5-2500K is probably the best bang for the buck in gaming right now and bests the new AMD FX-8150 in most areas. […]


Well, that review isn't really clear-cut, since the 8150 handily wins in Sandra, TrueCrypt, and half the Cinebench tests. It basically ends up as a draw there.

Edited by Catamount, 27 May 2012 - 06:59 AM.


#97 Catamount

    Member

  • LIEUTENANT, JUNIOR GRADE
  • 3,305 posts
  • Location: Boone, NC

Posted 27 May 2012 - 07:05 AM

It should also be noted that in terms of value, the FX-8120 is a better buy in most ways than the 8150: it's nearly as fast, and more or less trades blows with the 2500K, for $169.
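A rough way to put numbers on that value argument is to divide a performance index by price. A hedged sketch: the index figures echo the summary quoted earlier in the thread (the 8120's ~130 is my interpolation from "nearly as fast as the 8150"), and apart from the $169 stated above the prices are assumed ballpark street prices, not quoted figures:

    # Crude performance-per-dollar comparison (higher is better).
    # Index: from the ht4u summary quoted earlier; FX 8120 interpolated.
    # Prices: $169 as stated above; the other two are assumed street prices.
    chips = {"FX 8120": (130, 169), "FX 8150": (137, 189), "i5-2500K": (153, 220)}
    for name, (index, price) in chips.items():
        print(name, round(index / price, 2))  # index points per dollar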

In a lower budget build, I'd have no trouble recommending that chip, at least as an option.

Edited by Catamount, 27 May 2012 - 07:05 AM.


#98 Oderint dum Metuant

    Member

  • Ace Of Spades
  • 4,758 posts
  • Location: United Kingdom

Posted 27 May 2012 - 07:09 AM

I would agree with the 8120 recommendation, but I don't think I could ever recommend an 8150.

#99 Oderint dum Metuant

    Member

  • Ace Of Spades
  • 4,758 posts
  • Location: United Kingdom

Posted 27 May 2012 - 07:40 AM

@Mopar

If you had actually read the entire post, you would see that the point you're arguing (the 2500K being better at gaming) was already covered; in fact, Catamount expressly says that it is.
The part where the AMD chips surpass it is heavily threaded applications (read: not games).

#100 Catamount

    Member

  • LIEUTENANT, JUNIOR GRADE
  • 3,305 posts
  • Location: Boone, NC

Posted 27 May 2012 - 08:09 AM

DV McKenna, on 27 May 2012 - 07:09 AM, said:

I would agree with the 8120 recommendation, but I don't think I could ever recommend an 8150.


That's about where I stand. I would recommend an 8150 to someone doing heavy non-gaming work outside of Photoshop (which hates Bulldozer for some strange reason), but I don't think I would for gaming. The 8150 might be a chip that pays off if, in a year or two or three, we start seeing more multithreaded games, but that's hardly guaranteed.


The 8120, on the other hand, I might recommend for a real budget build for almost any use (might; it depends).

It's too bad we haven't seen the Windows 8 performance improvements materialize for Bulldozer yet, at least not in the Developer Preview :)

Edited by Catamount, 27 May 2012 - 08:11 AM.





