[GUIDE] Hardware Mythbusters - An In-Depth Hardware Guide




#1001 PierceElliot

    Member

  • Survivor
  • 12 posts

Posted 23 October 2012 - 07:34 PM

Is there any truth to something I recently observed? I watched two people playing the same games co-op on similarly built computers, across a small variety of titles. They were torn over the ATI/NVIDIA thing so, of course, they chose different cards. I'm not sure of the price of the rigs or whether they were even on similar levels in terms of performance, but what I noticed was that when the NVIDIA machine was faced with lots of high quality textures on screen, its performance dropped far more than the ATI machine's. However, when high quality fullscreen shaders and effects such as SSAO were used on the ATI machine, they would drag its FPS to a crawl. From my understanding, this comes down to two things:

1. That ATI essentially just builds cards with as much raw power as possible, but falls short on driver optimization, which causes them to stumble on more "complex" rendering.

2. That Nvidia, while it may not have as much raw power as an ATI card, has very well optimized drivers as well as dedicated "cores" just for shader rendering (not to say ATI doesn't have them too, just that NVIDIA does it much better).

I'm under the impression that this is an oversimplification, but is there any credibility to it? I'm a networking student who has played with computers all my life, but networks and general IT work are what I do best and am most up to date on. Anyway, I've observed the same thing on my laptop with an ATI Radeon HD 6550M: I can run very high definition textures, but as soon as you mix in SSAO, HDR, or (until the recent driver updates) anti-aliasing, the framerate would cut in half (especially on the Unreal Engine). I know that mentioning a laptop in a discussion about gaming hardware is bait for a flame war, so before that's addressed (if it's even relevant; I may just be pre-emptively being an ***), I'm sorry if it offends you that I play games on a laptop and join a discussion about hardware.

#1002 Magic Murder Bag

    Member

  • Bad Company
  • 149 posts
  • Location: Somewhere between Naraka and Shinkoku

Posted 23 October 2012 - 07:56 PM

I'm almost afraid to ask, but I'll ask anyway... I'm currently using an ASUS P8Z68-V Pro/Gen3 motherboard (currently running overclocked). It works well with my other games (especially Borderlands 2 at maxed settings), but I'm just curious whether my board is considered too weak to use (and if it is, I'm gonna be pissed, because I bought it just a few weeks ago on sale with the rest of my PC parts).

Edited by Magic Murder Bag, 23 October 2012 - 08:07 PM.


#1003 Weeble

    Member

  • 122 posts
  • Location: Kansas City, MO.

Posted 23 October 2012 - 08:31 PM

@PierceElliot - it's an oversimplification, and neither of your examples is really a good... well, example.


I've been running Nvidia cards for the last few years, and there have been times when their forums exploded with people complaining about their drivers, so don't count on always getting fully optimized (or even working) drivers.

If I were going to oversimplify, I would say that whose hardware is more powerful and whose drivers are better optimized changes so often that it depends on what day you ask the question. In general, I think Nvidia usually pulls off slightly higher frame rates slightly more often, but it really depends on your specific needs. The "best" solution depends on what you are trying to accomplish.

I'm switching from an Nvidia SLI setup to a single ATI 7950. Why? The AMD card has 3 GB of memory compared to the Nvidia's 2 GB (or 2 x 896 MB on my current setup) and a wider memory interface. I run a multi-monitor gaming rig at 5960x1080 resolution, so I need that. Most gamers are at 1920x1080, and a much cheaper board, very possibly an Nvidia, is probably a better choice for them. YMMV.
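
As a rough illustration of why that resolution eats video memory, here is a minimal back-of-the-envelope sketch in Python. The 32-bit colour and triple-buffering figures are simplifying assumptions, and real games need far more memory for textures and intermediate buffers, so treat the numbers only as a lower bound.

    # Back-of-the-envelope render-target memory at two resolutions.
    # Assumes 32-bit colour and triple buffering (assumptions, not
    # measurements); textures, G-buffers and shadow maps come on top.
    def colour_buffers_mib(width, height, bytes_per_pixel=4, buffers=3):
        return width * height * bytes_per_pixel * buffers / (1024 ** 2)

    for w, h in [(1920, 1080), (5960, 1080)]:
        print(f"{w}x{h}: ~{colour_buffers_mib(w, h):.0f} MiB of colour buffers")

    # 1920x1080 -> ~24 MiB; 5960x1080 -> ~74 MiB, roughly three times as much,
    # and everything else that scales with resolution grows the same way.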

#1004 ItsKrunchTime

    Member

  • FP Veteran - Beta 1
  • 17 posts

Posted 23 October 2012 - 08:54 PM

The link said no, but my POS laptop can still run this game (barely, and not very well, but it still runs!).
Though I'm no good with computers, so I fear I'm somehow damaging my processor by unknowingly overclocking it or something...

#1005 Catamount

    Member

  • LIEUTENANT, JUNIOR GRADE
  • 3,305 posts
  • Location: Boone, NC

Posted 23 October 2012 - 09:00 PM

ItsKrunchTime, on 23 October 2012 - 08:54 PM, said:

The link said no, but my POS laptop can still run this game (barely, and not very well, but it still runs!).
Though I'm no good with computers, so I fear I'm somehow damaging my processor by unknowingly overclocking it or something...


The only real danger is that you're running your processor or graphics card too hot. You can download software to monitor those temperatures, like Core Temp and GPU-Z (others might have other suggestions), and we can tell you if they're too hot. Generally, graphics chips should run no hotter than 90C (ideally considerably less, though laptop GPUs do tend to get hot), while the safe temperature for your processor depends on which processor it is, so you'd have to tell us more about what you have.
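
For anyone who would rather script the check than watch a GUI, here is a minimal sketch, assuming a Linux machine with the psutil package installed (psutil exposes no temperature sensors on Windows, where you would read the numbers off Core Temp or GPU-Z instead). The 85C warning threshold is just the rough figure from this discussion, not a vendor specification.

    # Minimal temperature dump using psutil (Linux sensors only).
    import psutil

    WARN_C = 85  # rough "keep an eye on it" figure from this thread, not a spec

    for chip, entries in psutil.sensors_temperatures().items():
        for entry in entries:
            if entry.current is None:
                continue
            label = entry.label or chip
            flag = "  <-- running hot" if entry.current >= WARN_C else ""
            print(f"{label}: {entry.current:.0f} C{flag}")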

#1006 ItsKrunchTime

    Member

  • FP Veteran - Beta 1
  • 17 posts

Posted 23 October 2012 - 09:10 PM

I run SpeedFan out of game to monitor my processor's heat. (I hardly run anything besides the game itself when it's running)
I only run the game for around half an hour at a time to allow my laptop to cool down.
As for the processor, I'm running an Intel i5 at 2.53 GHz. I don't know if that's what you're looking for, but there it is.

Edited by ItsKrunchTime, 23 October 2012 - 09:10 PM.


#1007 Catamount

    Member

  • LIEUTENANT, JUNIOR GRADE
  • 3,305 posts
  • Location: Boone, NC

Posted 23 October 2012 - 09:16 PM

Sounds like either an i5-540M or i5-460M. Those chips have a maximum temperature of 105C, and I have a mobile i7 of the same generation (same max temp) that's been getting into the high 80s and running like that for hours on end, day after day, for over two years.

I'd say if your CPU stays below 85 or so you're fine, and again, the GPU shouldn't go above 90, though below the mid 80s is ideal there too. If you're getting above that, just let us know how much above.

#1008 PierceElliot

    Member

  • Survivor
  • 12 posts

Posted 24 October 2012 - 05:28 AM

5960x1080? You're simulating depth perception at that point, right? But yeah, I guess the "best" solution depends on what your end goal is. I'm coming from onboard Intel GMA graphics, so my goal was to be able to actually play games. !!! Accomplished !!!

#1009 PierceElliot

    Member

  • Survivor
  • 12 posts

Posted 24 October 2012 - 05:35 AM

You could also just do what I do: prop the laptop up on old game CD cases, then place a Rubbermaid container with a large brick of ice under the fan. Works like a charm for several hours. Or you could, like, clean out the laptop from time to time. But why would you do that? I have a Lenovo Y460p, which is apparently known for going up to 179C (I've watched it happen) and then just shutting off. When it's perfectly maintained it only goes up to around 150-ish, and a block of ice knocks it down 20-30C depending on how hot it already was.

Edited by PierceElliot, 24 October 2012 - 05:37 AM.


#1010 Catamount

    Member

  • LIEUTENANT, JUNIOR GRADE
  • 3,305 posts
  • Location: Boone, NC

Posted 24 October 2012 - 06:43 AM

PierceElliot, on 24 October 2012 - 05:35 AM, said:

You could also just do what I do: prop the laptop up on old game CD cases, then place a Rubbermaid container with a large brick of ice under the fan. Works like a charm for several hours. Or you could, like, clean out the laptop from time to time. But why would you do that? I have a Lenovo Y460p, which is apparently known for going up to 179C (I've watched it happen) and then just shutting off. When it's perfectly maintained it only goes up to around 150-ish, and a block of ice knocks it down 20-30C depending on how hot it already was.


You needn't worry about your computer reaching such temps.

A little on temperatures, recording, and energy:
Temperature reporting software might be showing readings that high, but those sensors can be read very inaccurately by software (which is why most such software has an option to calibrate temperature readings, though we seldom have the data to do so). There's no way anything inside that laptop is actually getting that hot. Not only would any chip ever made destabilize and crash long before the 150C mark, most would take physical damage not far above 100C, and I've never seen a chip with a temperature cutoff above 125C (and that's extraordinarily high), which means even getting into the 120s would cause an instant shutdown, long before 150C was reached, let alone 180C. In fact, the CPU in that line of laptops shuts down at 85-100C (depending on the exact model). I suspect that if a chip ever hit 180C for any amount of time, it would near-instantaneously brick the computer.

Moreover, the relationship between power and temperature isn't one-to-one, because your cooler sheds heat faster the hotter the chip gets relative to the air around it (and the faster the fan spins), so each additional degree of temperature requires disproportionately more power. It's true that a cooler can eventually get overwhelmed and a computer can begin to trap heat, but reaching 180C would still require far more power than reaching 100C, and doing so would just about fry your power brick, because the CPU alone would consume more energy than the AC adapter could put out. This fact works in our favor, though, because it means coolers don't have to be scaled up one-for-one with chip power.
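
To put rough numbers on that, here is a sketch using a deliberately simple steady-state model, T_chip = T_ambient + P * R_theta. The thermal resistance below is an assumed, laptop-typical value rather than a measurement of any particular machine, and even this optimistic linear model shows that 180C would take more power than a typical laptop AC adapter can deliver.

    # Steady-state sketch: T_chip ~= T_ambient + P * R_theta.
    # R_theta (C per watt) is an assumed, laptop-typical figure.
    AMBIENT_C = 25.0
    R_THETA_C_PER_W = 1.2

    def watts_needed(temp_c):
        """Power the chip would have to dissipate to sit at temp_c."""
        return (temp_c - AMBIENT_C) / R_THETA_C_PER_W

    for target in (90, 150, 180):
        print(f"{target} C -> roughly {watts_needed(target):.0f} W")

    # 90 C  -> ~54 W  (plausible for a hard-working laptop chip)
    # 150 C -> ~104 W (already beyond many laptop power bricks)
    # 180 C -> ~129 W (and the chip would have thermally shut down long before)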

Edited by Catamount, 24 October 2012 - 01:55 PM.


#1011 PierceElliot

    Member

  • Survivor
  • 12 posts

Posted 24 October 2012 - 10:53 AM

I have actually bricked this laptop once before. I bought the 3-year accidental-damage warranty from Squaretrade, so I wasn't too worried about that. I read online about people complaining about the temperature going that high with this model, but I guess I also assumed everyone else knew what they were doing and that I didn't, so I just assumed, through confirmation bias (I think that's what it's called), that what I was seeing was correct. I remember when I had a tower back when I was a kid with an NVIDIA 8600GT, which we were always trying to keep under 80F, so I guess I had a thought in the back of my head that this was WAY TOO HOT.

#1012 L A guns

    Member

  • Bad Company
  • 54 posts

Posted 24 October 2012 - 11:09 AM

A 2.8 GHz six-core AMD, a GT 430, and 2 GB of RAM, and it's very playable. Also tried it on a dual-core Intel with a lesser card (a 1 GB Zotac, I forget which one) and 2 GB of RAM; almost playable, and it will probably work on the dual-core once the DX11 update arrives.

#1013 Sir Roland MXIII

    Member

  • The Spear
  • 1,152 posts
  • Location: Idaho

Posted 24 October 2012 - 09:32 PM

I guess it should be noted that the chip prices are out of date, since the OP was last updated back in June. For instance, the AMD Phenom II X4 965 Black Edition is now at the $100 price point.

Edited by Sir Roland MXIII, 24 October 2012 - 09:32 PM.


#1014 Vulpesveritas

    Member

  • 3,003 posts
  • Location: Wisconsin, USA

Posted 24 October 2012 - 11:51 PM

I will be updating this and many of my other threads in the next week or so, now that AMD's Vishera and Trinity lineups are out along with Intel's Ivy Bridge.

#1015 Sir Roland MXIII

    Member

  • The Spear
  • 1,152 posts
  • Location: Idaho

Posted 25 October 2012 - 12:08 AM

Vulpesveritas, on 24 October 2012 - 11:51 PM, said:

I will be updating this and many of my other threads in the next week or so, now that AMD's Vishera and Trinity lineups are out along with Intel's Ivy Bridge.


Fair enough, thanks Vulpes!

#1016 Youngblood

    Member

  • 604 posts
  • Location: GMT -6

Posted 25 October 2012 - 04:09 PM

PierceElliot, on 24 October 2012 - 05:28 AM, said:

5960x1080? You're simulating depth perception at that point, right?


I don't think that phrase means what you think it means... it's a triple-monitor setup for a wider field of view, so you can see Jenners off to your side biting at your ankles. That kind of setup needs a card with a lot of on-board memory.

Edited by Youngblood, 25 October 2012 - 04:10 PM.


#1017 Youngblood

    Member

  • 604 posts
  • Location: GMT -6

Posted 25 October 2012 - 04:11 PM

Vulpesveritas, on 24 October 2012 - 11:51 PM, said:

I will be updating this and many of my other threads in the next week or so, now that AMD's Vishera and Trinity lineups are out along with Intel's Ivy Bridge.


Excellent! I know we'd love having you back here helping us out! Honestly, I must say that AMD has done quite well for itself this generation.

#1018 Vulpesveritas

    Member

  • 3,003 posts
  • Location: Wisconsin, USA

Posted 25 October 2012 - 07:00 PM

Youngblood, on 25 October 2012 - 04:11 PM, said:


Excellent! I know we'd love having you back here helping us out! Honestly, I must say that AMD has done quite well for itself this generation.

I actually have been watching the forums more or less every day. It's really just a matter of going over the newer reviews and actually getting around to updating everything.

#1019 PierceElliot

    Member

  • Survivor
  • 12 posts

Posted 25 October 2012 - 07:48 PM

Youngblood, on 25 October 2012 - 04:09 PM, said:


I don't think that phrase means what you think it means... it's a triple-monitor setup for a wider field of view, so you can see Jenners off to your side biting at your ankles. That kind of setup needs a card with a lot of on-board memory.


I thought I knew what I meant. Looking it up, though, the school of thought around depth perception starts out simple:
"Depth perception is the visual ability to perceive the world in three dimensions (3D) and the distance of an object."

From there, it's about the same as the school of thought circling intelligence: the idea is so simple that nobody agrees on what it actually is (at least that's my unintelligent observation based on the rule of the taijitu, or at least that's the story I'm sticking to). As I'm sure you're aware, there are about 100 different theories and tests of intelligence, and although IQ is typically what's accepted, there are many ways to administer the test and interpret the results (including the wrong ways). Apparently it's the same with depth perception. Large resolutions on large screens supposedly create a more realistic sense of "depth" about them. Some say that depth perception was achieved as soon as displays went 3D and that it never gets any more... depth. I'm sure there are even people who claim there's no depth perception there at all, but then I'd probably have to dismiss their claims (that's my highly educated opinion; not really).

#1020 Vulpesveritas

    Member

  • 3,003 posts
  • Location: Wisconsin, USA

Posted 25 October 2012 - 08:01 PM

PierceElliot, on 25 October 2012 - 07:48 PM, said:


I thought I knew what I meant. Looking it up, though, the school of thought around depth perception starts out simple:
"Depth perception is the visual ability to perceive the world in three dimensions (3D) and the distance of an object."

From there, it's about the same as the school of thought circling intelligence: the idea is so simple that nobody agrees on what it actually is (at least that's my unintelligent observation based on the rule of the taijitu, or at least that's the story I'm sticking to). As I'm sure you're aware, there are about 100 different theories and tests of intelligence, and although IQ is typically what's accepted, there are many ways to administer the test and interpret the results (including the wrong ways). Apparently it's the same with depth perception. Large resolutions on large screens supposedly create a more realistic sense of "depth" about them. Some say that depth perception was achieved as soon as displays went 3D and that it never gets any more... depth. I'm sure there are even people who claim there's no depth perception there at all, but then I'd probably have to dismiss their claims (that's my highly educated opinion; not really).

Well, if depth perception is the intent, then you would probably be more interested in a 120 Hz monitor supporting stereoscopic 3D. Multi-monitor gives you a more realistic wide viewing angle, or more space to look at by turning your head rather than the body of whatever you're piloting.




