
How Can I Beef Up My Fps?


54 replies to this topic

#21 Az0r

    Member

  • 343 posts

Posted 26 December 2012 - 05:16 AM

Vulpesveritas, on 25 December 2012 - 11:07 AM, said:


Most Intel processors are having trouble too, as the game has poor coding. Pretty much only SB/SB-E/IB i5s/i7s are getting minimum FPS above 35-40 consistently at stock. Older Intel CPUs and i3s are hurting as well.

As for the "converting you forever": does that mean that if AMD were to bring out a superior product out of the blue, like they did with the Athlon 64, you wouldn't use another AMD CPU? You don't have to have used nothing but Intel to sound like a fanboy, since fanboyism comes down to ignoring a superior product / person for the sake of a brand or ideal.

Like I'm an ethics fanboy.

As far as his system goes, from what I've been reading in his thread, he is averaging 40-50 FPS with dips below 30.


I feel that if you're going to use personal reasons like ethics to justify your hardware advice then you probably shouldn't be giving advice in the first place. Currently there is no reason to buy an AMD chip for gaming at almost any price point, especially at the $200+ levels.

As for the OP, you have a few options, and I'll list the pros and cons of each:
A. Purchase an aftermarket cooler and overclock the stink out of your current processor. This is probably your most cost-effective option, as you get to keep your current CPU/mobo. You can pick up a decent air cooler for as little as $35 (http://www.newegg.co...N82E16835103099) which would allow a decent overclock.
Pros: Cheap, cost-effective, keeps current hardware intact.
Cons: Some technical experience required, only provides a small performance boost, voids the warranty, higher power consumption.

B. Purchase an Intel processor such as the i5-3570 and an appropriate motherboard.
Pros: Bigger performance gains than option A, lower power usage, better clock-for-clock performance in all games.
Cons: Expensive, requires a fresh Windows install, leaves you with excess hardware, not overclockable.

C. Purchase an i5-3570K or i7-3770K, an appropriate Z77 motherboard, and a CPU cooler for overclocking.
Pros: This will give you the best performance currently available short of spending $1500 on a Socket 2011 platform.
Cons: All the cons of B, plus a voided warranty, more power consumption, and more technical knowledge required.

Note that all three of these options will require you, or someone you know, to disassemble and reassemble your PC.

As for the software side, some users have reported FPS gains by doing a fresh install of the game.
Also, HOPEFULLY in the near future we'll be getting DX11, which should give you a huge FPS boost considering you have an extremely powerful DX11-capable video card.

Edited by Az0r, 26 December 2012 - 05:18 AM.


#22 Sen

    Member

  • 757 posts
  • Location: Texas

Posted 26 December 2012 - 07:04 AM

Quote

I feel that if you're going to use personal reasons like ethics to justify your hardware advice then you probably shouldn't be giving advice in the first place. Currently there is no reason to buy an AMD chip for gaming at almost any price point, especially at the $200+ levels.


Seems to me there are only a few AMD processors available at the $200 price point. Considering the general price of the i7 is $100+ more, this alone makes the 8350 viable in enthusiast situations where someone needs to stretch the value of their dollar to complete a build. It may or may not game better than an i5, but an i5 should handle this game with no problems, so why should the 8350 be any different?


Ultimately though, I think Vulp communicated it to me best back in November:


Quote

As it is, through my analysis of what is out, Vishera is a better performance/value option overall at this time, both for current applications and looking ahead to applications in 2-3 years, given that we're hitting 6-thread utilization right now in a few high-end games in 2012.
Aside from the value standpoint, there are a couple of other points behind my recommending them. First off, despite everything, Intel still has a track record of criminal activity. As such, from a moral standpoint I am hard-pressed to actively recommend them when there is a viable alternative with competitive levels of performance for the price. While AMD falls behind in single-thread, they are superior in multi-thread, and I look to recommend systems which are still viable in 3-4 years, when someone is going to start considering an upgrade, and as such AMD ends up being my go-to recommendation. Even if AMD isn't in the enthusiast business in 5 years, that doesn't matter for what I recommend today, as I would much rather recommend a product that will still be at least somewhat viable near the end of those 5 years.
And finally comes my loyalty to innovation, which AMD has shown more of than Intel, despite being a company 1/70th the size of Intel. From x64 to multicore, AMD has shown that they are where people who care about where their money goes should probably invest if they're looking for progress. Heck, AMD's compiler is faster for Intel CPUs than Intel's own C++ compiler. Bulldozer could have and should have been pulled off better. The idea behind it was sound: an SMT-style arrangement to allow for more parallel processing while using less die space, hence saving transistors and theoretically improving energy efficiency. Obviously that didn't go as planned, and AMD failed rather heavily on the design. The front end is too small, the cache has too high a latency, and leakage was horrible. Piledriver fixed the leakage issue to at least some extent and improved the floating-point cores, but the front end and cache still hold it back from what it should be.


#23 NinetyProof

    Member

  • 547 posts
  • Location: San Diego, CA

Posted 26 December 2012 - 04:18 PM

Sen, on 26 December 2012 - 07:04 AM, said:


Seems to me there are only a few AMD processors available at the $200 price point. Considering the general price of the i7 is $100+ more, this alone makes the 8350 viable in enthusiast situations where someone needs to stretch the value of their dollar to complete a build. It may or may not game better than an i5, but an i5 should handle this game with no problems, so why should the 8350 be any different?


Debating is for esoteric thoughts and political viewpoints ... when it comes to "clock cycles" there are no debates ... only facts. Basically Tom's is the de facto resource for "metrics" and how things perform. Below is the *equivalency* list of AMD to Intel.

http://www.tomshardw...ock,3106-5.html

Many suggestions have been given to the OP on how to increase his/her framerate ... I am merely providing the resource so that the Intel/AMD nonsense will end. Tom's is pretty much the de facto source for all things CPU/performance.

Long story short? If you want to support a company because you like them, or because you think it's good for the industry to have robust competition, etc., etc. ... that's great ... jolly good show, man! Just don't try to persuade people based upon a "false narrative".

#24 Youngblood

    Member

  • 604 posts
  • Location: GMT -6

Posted 26 December 2012 - 08:53 PM

Was there a reason you all had to bring this into the thread? There didn't need to be anything more than "get an i5-3570K system". Not that it even matters, since the game itself is hard-capping the performance of almost everyone's machines anyway.

If you would like to provide examples of your builds that can miraculously run this game at 1920x1080 with High or Very High settings without a minimum dipping below 50 FPS -AND- support them with video recordings, you might be able to contribute to the point of the thread.

#25 M E X

    Member

  • The Named
  • 381 posts
  • Location: g-town, Vienna, Austria, EU.

Posted 27 December 2012 - 05:29 AM

XxDRxDEATHxX, on 25 December 2012 - 07:48 AM, said:

In order to avoid turning this thread into an Intel vs AMD thread: I understand that, technology-wise, Intel is the way to go. But budget-wise, I chose to go the AMD route so that I could start playing last month instead of waiting until January or February 2013 to play.
I chose to get a cheap computer for playing MWO in November too :D
In my case this was an old used business computer for 50€ with a P4 630, 1GB RAM & 80GB HDD ... plus a bigger & better spare HDD I had at home from upgrading my external USB HDD to 2TB, and I immediately spent another 50€ for 2 x 2GB RAM modules.
Sadly I couldn't afford a GPU last month, so I had to wait until I got my cash in December before I could buy an MSI HD 7750 GPU. Together with the GPU I upgraded the CPU to an Intel Pentium D 820, which I found for 15€ on willhaben.at.
But I am not satisfied with ~10 FPS (4-25 in battles, up to 30 in the mechlab, sometimes even slightly above 40 when logged off).
Because of this I intend to upgrade to an Intel Pentium D 945, 950, ... 960 - depending on what I can get for a few euros here in Vienna :D

XxDRxDEATHxX, on 25 December 2012 - 07:48 AM, said:

As for my GPU, do I really need to overclock it? It's already a 7950, pre-overclocked from the factory and running 3GB of GDDR5 RAM. Here's the link
Have you checked that the settings of your GPU are OPTIMIZED for MWO?
I have heard stories about NVIDIA GPUs running at full game performance while the computer is idle ... and only giving IDLE performance while playing MWO :P

As you have an OEM version of Win 7, I would suggest a clean install of the 64-bit Windows 8 RELEASE PREVIEW on another HDD with at least 40 GB, so that you can find out whether the BLOATWARE Windows you got with your computer is the source of your problems!
The Win 8 RP could be downloaded for FREE over the summer from the M$ website, and it has a product key which is valid until mid-January, which also allows you to get a cheaper full version of Win 8 ONLINE :D

Best regards, MEX

Edited by M E X, 27 December 2012 - 05:35 AM.


#26 Sen

    Member

  • 757 posts
  • Location: Texas

Posted 27 December 2012 - 08:14 AM

Quote

Debating is for esoteric thoughts and political viewpoints ... when it comes to "clock cycles" there are no debates ... only facts. Basically Tom's is the de facto resource for "metrics" and how things perform. Below is the *equivalency* list of AMD to Intel.


http://www.tomshardw...iew,3328-6.html

http://www.tomshardw...ew,3328-13.html

http://www.tomshardw...ew,3328-14.html

http://www.tomshardw...ew,3328-15.html

Alright, let's talk facts. Now, granted, these benchmarks all come from the same review, and the overall conclusion was that Vishera was generations better than Bulldozer, but still not "up" with Intel, especially when factoring in power draw for the performance. But wait, we're talking gaming and FPS here, and why the OP is not getting higher frame rates. You blame the processor, yet when you look at the actual real-world GAMING BENCHMARKS AT MAX SETTINGS [from your "end all, be all" tech source] ... you'll see that the 8350 is within 3 FPS or so of the i7 every time.

Even at lower settings, however, FPS for the 8350 is over 60 anyway, which suggests that the issue here is NOT the processor.

Now, explain to me why the OP, with a relatively brand-new system, needs to go out and blow another $300-$500 on a brand-new processor and motherboard just to play MWO?

#27 Thorqemada

    Member

  • 6,396 posts

Posted 27 December 2012 - 08:39 AM

For comparison:
FX-8350 (non-OC, and I don't plan to).
ATI HD6950 2GB (Cat 12.11 Beta - standard application controlled settings)
8 GB DDR3 1600
X-Fi Titanium
Win7 x64
Run the game at high or very high settings in 1920x1200.

The FPS ranges from 27 to 60 and I guess the median is somewhere around 40.
The CPU goes up to 68° Celsius air-cooled (it would probably run a tad cooler once the fan in the case front is fixed).
Recently the patches have made the game a tad slower, and I play on high to avoid too-low minimum FPS situations.

Edited by Thorqemada, 27 December 2012 - 08:40 AM.


#28 Vulpesveritas

    Member

  • 3,003 posts
  • Location: Wisconsin, USA

Posted 27 December 2012 - 10:33 AM

NinetyProof, on 26 December 2012 - 04:18 PM, said:


Debating is for esoteric thoughts and political viewpoints ... when it comes to "clock cycles" there are no debates ... only facts. Basically Tom's is the de facto resource for "metrics" and how things perform. Below is the *equivalency* list of AMD to Intel.

http://www.tomshardw...ock,3106-5.html

Many suggestions have been given to the OP on how to increase his/her framerate ... I am merely providing the resource so that the Intel/AMD nonsense will end. Tom's is pretty much the de facto source for all things CPU/performance.

Long story short? If you want to support a company because you like them, or because you think it's good for the industry to have robust competition, etc., etc. ... that's great ... jolly good show, man! Just don't try to persuade people based upon a "false narrative".

How about we look at facts from across the board instead of going off on a side debate? I will provide a quick look at all the benchmarks from the yearly overview.
Spoiler


So no, the CPU is not the problem here; it's the game's coding. In worst-case, single-thread, floating-point-heavy scenarios, an FX-8350 is as fast as a 1st-gen Intel i7, or around 30% slower than an Ivy Bridge i5. Most of the time it performs on par, and quite often it offers performance closer to an i7 CPU.

Edited by Vulpesveritas, 27 December 2012 - 10:33 AM.


#29 Odins Fist

    Member

  • Ace Of Spades
  • 3,111 posts
  • Location: The North

Posted 27 December 2012 - 11:58 AM

XxDRxDEATHxX, on 24 December 2012 - 07:35 PM, said:

My WORST FPS rates are in the upper 20s (as shown in the above video). My FPS rates are shown in the upper-left corner of the screen.

.
Believe it or not, people "TURNING DOWN" their in-game graphics options on lower-spec systems have seen a drop in frame rates. You do "NOT" have a lower-spec system, but try this...
One solution (believe it or not) was to turn just about everything "UP"...
I can't remember what AA or particles were set at, but there are a few posts where turning most graphics options up actually increased FPS...
.
EDIT: my very good friend bought an FX-8350 (clocked at 4.6 GHz now) and is having no issues (other than single-threaded performance on some things), but his settings were set to high on almost everything "BEFORE" he borked his ATI 6870, which coincidentally was an RMA replacement for a 6850 that was still under warranty; they sent him a bigger card as a replacement, and that one borked too..
.
Over 50% failure rate now with his ATI cards... YIKES..!!!

Edited by Odins Fist, 27 December 2012 - 12:04 PM.


#30 Vulpesveritas

    Member

  • 3,003 posts
  • Location: Wisconsin, USA

Posted 27 December 2012 - 12:23 PM

Odins Fist, on 27 December 2012 - 11:58 AM, said:


Over 50% failure rate now with his ATI cards... YIKES..!!!


Sounds like he needs to stop smashing mirrors and walking under ladders. And/or buy from more reputable AIB partners like Sapphire, HIS, and Asus if he isn't.

Care to explain what failed on the cards before you go on an Nvidia fanboy rant again: was it the VRM, the heatsink mounting, the memory, the chip, what?... Because all but one of the many things that can kill a GPU card are actually going to be the AIB partner's fault, not AMD's quality control, and even then the blame could well be passed on to TSMC rather than the AMD/ATI design team.

Edited by Vulpesveritas, 27 December 2012 - 12:29 PM.


#31 Oderint dum Metuant

    Member

  • Ace Of Spades
  • 4,758 posts
  • Location: United Kingdom

Posted 27 December 2012 - 12:41 PM

Vulpesveritas, on 27 December 2012 - 10:33 AM, said:

How about we look at facts from across the board instead of going off on a side debate? I will provide a quick look at all the benchmarks from the yearly overview.

So no, the CPU is not the problem here; it's the game's coding. In worst-case, single-thread, floating-point-heavy scenarios, an FX-8350 is as fast as a 1st-gen Intel i7, or around 30% slower than an Ivy Bridge i5. Most of the time it performs on par, and quite often it offers performance closer to an i7 CPU.



Slightly misleading here Vulp, not cool.
Even given the benchmarks you post for Crysis 2, the most relevant to playing MWO (people should note that it is at low resolution and low graphical settings), the 8350 @ 4GHz does not equal or surpass the venerable i5-2500K, let alone approach Ivy Bridge. Whilst it does surpass the first generation of i-series chips, that's hardly a selling point for AMD; congratulations on making a new chip that's as fast as chips from three generations ago...
What we need to remember is that, in terms of mainstream enthusiasts, the K-series chips are the standard line; it's no use comparing new AMD chips to Intel chips from three generations ago, that is a bad comparison.

Whilst I agree the 8350 is superior in the unimportant encryption and general multitasking scenarios, the AMD chips as a grand total are not superior for gaming.
You also need to note that those results are done with varying clock speeds, and the AMD chips appear to have all been tested with faster RAM.

So in grand summary, let's remember this is a gaming forum and we are here in this thread talking about FPS.

The AMD chips do not perform equal to Intel chips in gaming; they are inferior. However, and sadly, those benchmarks only show lower settings; I would suspect that in the grand scheme of things the FPS difference between the top-end chips of each competitor is not that far apart (+/- 15 FPS).

Edited by DV McKenna, 27 December 2012 - 12:47 PM.


#32 Odins Fist

    Member

  • Ace Of Spades
  • 3,111 posts
  • Location: The North

Posted 27 December 2012 - 01:02 PM

Vulpesveritas, on 27 December 2012 - 12:23 PM, said:


Sounds like he needs to stop smashing mirrors and walking under ladders. And/or buy from more reputable AIB partners like Sapphire, HIS, and Asus if he isn't.

Care to explain what failed on the cards before you go on an Nvidia fanboy rant again: was it the VRM, the heatsink mounting, the memory, the chip, what?... Because all but one of the many things that can kill a GPU card are actually going to be the AIB partner's fault, not AMD's quality control, and even then the blame could well be passed on to TSMC rather than the AMD/ATI design team.

.
He has bought "ONLY" HIS and Sapphire video cards... The only one that hasn't borked was the XFX 7770 in his wife's computer..
.
He sent his cards back; one was a brown-out, the others all started artifacting... He has been running 850-watt PSUs since his first brown-out... I haven't had (1) Nvidia card die on me personally, using (3) different brands...
.
Nvidia fanboy rant, huh... I don't care what "YOU" think in the least... Go back to your ASRock motherboards and the ethics problems you have with Intel.. BTW, I don't use Intel yet...

#33 Vulpesveritas

    Member

  • 3,003 posts
  • Location: Wisconsin, USA

Posted 27 December 2012 - 01:04 PM

DV McKenna, on 27 December 2012 - 12:41 PM, said:



Slightly misleading here Vulp, not cool.
Even given the benchmarks you post for Crysis 2, the most relevant to playing MWO (people should note that it is at low resolution and low graphical settings), the 8350 @ 4GHz does not equal or surpass the venerable i5-2500K, let alone approach Ivy Bridge. Whilst it does surpass the first generation of i-series chips, that's hardly a selling point for AMD; congratulations on making a new chip that's as fast as chips from three generations ago...
What we need to remember is that, in terms of mainstream enthusiasts, the K-series chips are the standard line; it's no use comparing new AMD chips to Intel chips from three generations ago, that is a bad comparison.

Whilst I agree the 8350 is superior in the unimportant encryption and general multitasking scenarios, the AMD chips as a grand total are not superior for gaming.
You also need to note that those results are done with varying clock speeds, and the AMD chips appear to have all been tested with faster RAM.

So in grand summary, let's remember this is a gaming forum and we are here in this thread talking about FPS.

The AMD chips do not perform equal to Intel chips in gaming; they are inferior. However, and sadly, those benchmarks only show lower settings; I would suspect that in the grand scheme of things the FPS difference between the top-end chips of each competitor is not that far apart (+/- 15 FPS).

I was speaking about the overall picture, given the full set of benchmarks done by Tom's Hardware; in gaming tests there is little difference at high graphics settings in modern games, and it is still more than enough to get 60+ FPS in most any game out there.

The reason for comparing to 1st-gen i7s is that the FX-8350 still outshines i3s in most every benchmark while falling behind the i5s, which makes the 1st-gen i7s the closest comparison. And this is still in worst-case scenarios.
Beats SB/IB i5 processors: 14 benchmarks, one gaming
Acts the same as SB/IB i5 processors: 14 benchmarks, one gaming
Performs worse than i5 processors: 6 benchmarks, one gaming

And note, all processors in that benchmark were tested with the fastest RAM the processor is rated for, in the channel density it is rated for: 1866 for FX/A-series processors, 1600 for SB/IB/Thuban, etc.


Odins Fist, on 27 December 2012 - 01:02 PM, said:

.
He has bought "ONLY" HIS and Sapphire video cards... The only one that hasn't borked was the XFX 7770 in his wife's computer..
.
He sent his cards back; one was a brown-out, the others all started artifacting... He has been running 850-watt PSUs since his first brown-out... I haven't had (1) Nvidia card die on me personally, using (3) different brands...
.
Nvidia fanboy rant, huh... I don't care what "YOU" think in the least... Go back to your ASRock motherboards and the ethics problems you have with Intel.. BTW, I don't use Intel yet...


One is out the door; the others could be from various problems, though I'd guess the VRM. As far as the other bits go, oh well. I guess you'll stay in the past when better-value products are available.

Edited by Vulpesveritas, 27 December 2012 - 01:09 PM.


#34 Oderint dum Metuant

    Member

  • Ace Of Spades
  • 4,758 posts
  • Location: United Kingdom

Posted 27 December 2012 - 01:14 PM

Vulpesveritas, on 27 December 2012 - 01:04 PM, said:

I was speaking about the overall picture, given the full set of benchmarks done by Tom's Hardware; in gaming tests there is little difference at high graphics settings in modern games, and it is still more than enough to get 60+ FPS in most any game out there.

The reason for comparing to 1st-gen i7s is that the FX-8350 still outshines i3s in most every benchmark while falling behind the i5s, which makes the 1st-gen i7s the closest comparison. And this is still in worst-case scenarios.
Beats SB/IB i5 processors: 14 benchmarks, one gaming
Acts the same as SB/IB i5 processors: 14 benchmarks, one gaming
Performs worse than i5 processors: 6 benchmarks, one gaming

And note, all processors in that benchmark were tested with the fastest dual-channel RAM the processor is rated for.


In terms of gaming (and unless told otherwise by the OP, that is all that matters), sometimes I think you forget that the vast majority of questions here are directed towards gaming, and specifically MWO (Crysis 2 is the closest we have).
Now, without benchmarks of varying graphical setups at my fingertips, I'm still willing to go on record and say that the 8350 here, at the highest settings, is only going to be as good as Sandy Bridge, with Ivy Bridge leading the way.

And I will go on record: there is a real lack of 8350 Crysis 2 benchmarks.

#35 THOR HAMMER

    Member

  • Elite Founder
  • 106 posts
  • Location: NEVADA

Posted 27 December 2012 - 01:21 PM

XxDRxDEATHxX, on 24 December 2012 - 07:35 PM, said:



My CPU average temp during gameplay is in the upper 40s (Celsius) and my max temp is 55°C. A map where it's snowing is where I get my WORST FPS rates, which are in the upper 20s (as shown in the above video). My FPS rates are shown in the upper-left corner of the screen.

First of all, I'm not complaining, because in my opinion my gameplay and my FPS feel very smooth and enjoyable.

I've been asking around and thought it'd be a good idea to post this here:

Is there anything that I can do to bring my FPS rate up to the upper 40s? I know that the low FPS problem that I have, and everybody else has, is because this is beta. But I want to MAKE SURE that I've done everything that I can, hardware- and software-wise, to beef up my FPS and to avoid lag.

Have I done everything I can from my end?

Below are my specs:
CPU: FX-8350 8-core, 4.0GHz
RAM: 16GB Corsair Vengeance DDR3, 240-pin
GPU: Radeon 7950 OC edition, GDDR5
Motherboard: ASUS Sabertooth 990FX R2.0
PSU: Rosewill 80+ Platinum, 750 watt
OS: Windows 7 Ultimate, 64-bit, OEM

Edit: I would also like to know something else. How can I tell what "percentage load" my GPU and CPU are under? I noticed that some people on the forums here say that MWO uses the CPU more than the GPU. I would like to see for myself if this is the case for my PC.


It sounds like driver issues; that setup should be getting 60 to 70 FPS. I'm running an old AMD quad-core Phenom II at 3.0 GHz with an EVGA GeForce GTX 670 FTW and I'm getting FPS in the high 40s to low 50s range. My brother has the same CPU as you, running a GeForce GTX 660 SuperClocked, and he's getting FPS in the high 50s to low 60s. It's AMD's video drivers; they're known for doing that. You may also want to run malware and antivirus scans to make sure you're not infected. And to the dorks that think they need $500 Intel chips: save your money, buy AMD, and use that extra cash for a good video card.
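
To actually answer the "percentage load" question quoted above: the Windows Task Manager Performance tab shows per-core CPU load, and a vendor tool such as GPU-Z or MSI Afterburner will show GPU load. For anyone who prefers to log it while playing, below is a minimal sketch assuming you have Python with the third-party psutil package installed; the sample count and interval are arbitrary choices, and GPU utilization is left to a vendor tool since it is driver-specific.

    # Minimal sketch: log overall and per-core CPU load while MWO runs.
    # Assumes Python with the psutil package ("pip install psutil").
    # GPU load is driver-specific, so use GPU-Z or MSI Afterburner for the video card.
    import psutil

    def watch_cpu(samples=30, interval=1.0):
        # cpu_percent() blocks for `interval` seconds and reports usage over that window.
        for _ in range(samples):
            per_core = psutil.cpu_percent(interval=interval, percpu=True)
            total = sum(per_core) / len(per_core)
            cores = "  ".join("%5.1f%%" % p for p in per_core)
            print("total %5.1f%% | per core: %s" % (total, cores))

    if __name__ == "__main__":
        watch_cpu()

If one or two cores sit near 100% while the rest stay low during a match, that points at the game's threading rather than raw CPU speed, which is consistent with what several posters above are arguing.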

#36 Vulpesveritas

    Member

  • 3,003 posts
  • Location: Wisconsin, USA

Posted 27 December 2012 - 01:25 PM

DV McKenna, on 27 December 2012 - 01:14 PM, said:


In terms of gaming (and unless told otherwise by the OP, that is all that matters), sometimes I think you forget that the vast majority of questions here are directed towards gaming, and specifically MWO (Crysis 2 is the closest we have).
Now, without benchmarks of varying graphical setups at my fingertips, I'm still willing to go on record and say that the 8350 here, at the highest settings, is only going to be as good as Sandy Bridge, with Ivy Bridge leading the way.

And I will go on record: there is a real lack of 8350 Crysis 2 benchmarks.

There is a definite lack of benchmarks; perhaps we will see some with Crysis 3.

As far as performance goes, in the benchmark we have there is less than a 5% difference between an FX-8350 and the fastest IB processor, and I am still in the habit of recommending what I believe to be a better system overall, as even among enthusiast users there isn't much of a point in building a machine just for gaming, especially in scenarios where there isn't any truly visible difference in performance to the end user for said game. Benchmarkers, of course, will likely buy Intel, but they're looking at rigs far out of the price range we normally talk about. Overclockers may prefer Intel or AMD, depending on whether they're looking for the Intel numbers or the pure GHz on the FX series. Gamers will likely see higher performance in games 2-4 years down the line on the FX-8350 vs the i5, given the multithread performance, by the time these systems start to fall behind and most people start looking at upgrading.

#37 THOR HAMMER

    Member

  • Elite Founder
  • 106 posts
  • Location: NEVADA

Posted 27 December 2012 - 01:28 PM

You can also run Crysis 2 in developer mode; it will show you your FPS in the console, since it's the same engine. I run at 1400x900 resolution and get an average in the high 40s to low 50s FPS, depending on the action.

#38 Odins Fist

    Member

  • Ace Of Spades
  • 3,111 posts
  • Location: The North

Posted 27 December 2012 - 02:47 PM

Vulpesveritas, on 27 December 2012 - 01:04 PM, said:

I guess you'll stay in the past when better-value products are available.

.
LOL, where did you come up with that..??
.
I'm dumping AMD CPUs for Intel, because Piledriver "BARELY" made any improvement over Bulldozer... Check yourself..
.
I did the smart thing and "SKIPPED" the original Phenom, waited for Phenom II, then skipped Bulldozer, and now Piledriver, and I'm jumping the sinking ship that is AMD CPUs for Intel... Anyone who's anyone in the industry (or a business owner like I am) knows that Bulldozer and Piledriver were simply too little, too late... Oh, did I mention that Piledriver uses "INSANE" amounts of power when OC'd, just like Bulldozer? My buddy with the 8350 knows all about it...
.
I would say I'm sorry, but I'm not, and I know better...

Edited by Odins Fist, 27 December 2012 - 02:48 PM.


#39 BR0WN_H0RN3T

    Member

  • Philanthropist
  • 701 posts
  • Location: Elysium

Posted 27 December 2012 - 03:26 PM

I'm running Razer Game Booster now when I play. It may improve your FPS by cutting down unnecessary background program load. Give it a go if you like. I believe I'm getting better "stable" FPS using it. Fewer peaks and troughs, I'm quite sure of it.

#40 Thunder Fist

    Member

  • The Jaws
  • 19 posts

Posted 27 December 2012 - 03:49 PM

Odins Fist, on 27 December 2012 - 02:47 PM, said:

.
LOL, where did you come up with that..??
.
I'm dumping AMD CPUs for Intel, because Piledriver "BARELY" made any improvement over Bulldozer... Check yourself..
.
I did the smart thing and "SKIPPED" the original Phenom, waited for Phenom II, then skipped Bulldozer, and now Piledriver, and I'm jumping the sinking ship that is AMD CPUs for Intel... Anyone who's anyone in the industry (or a business owner like I am) knows that Bulldozer and Piledriver were simply too little, too late... Oh, did I mention that Piledriver uses "INSANE" amounts of power when OC'd, just like Bulldozer? My buddy with the 8350 knows all about it...
.
I would say I'm sorry, but I'm not, and I know better...


Yes, Intel rocks!!!!! ...your wallet. I just checked Newegg: their top-of-the-line CPU, the Intel Core i7-3930K Sandy Bridge-E 3.2GHz (3.8GHz Turbo) LGA 2011 130W Six-Core Desktop Processor, is $569. That's the same price as an AMD 8-core at $149 plus a high-end video card like the EVGA 02G-P4-2680-KR GeForce GTX 680 2GB 256-bit GDDR5 PCI Express at $399, with about $20 to spare. LMAO, yes, Intel rocks all the way to the bank.
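
For anyone checking the arithmetic in that comparison, here is a trivial sketch using only the prices quoted in the post above (these are the poster's figures, not current pricing):

    # Price check using the numbers quoted in the post above.
    i7_3930k = 569        # Intel Core i7-3930K, as quoted
    fx_8core = 149        # AMD 8-core FX, as quoted
    gtx_680 = 399         # EVGA GeForce GTX 680 2GB, as quoted
    print(i7_3930k - (fx_8core + gtx_680))   # prints 21, i.e. roughly the "$20 to spare" claimed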




