
Revisiting The GTX 970 Issues Some Have Been Having



#21 Smokeyjedi

    Member

  • Liquid Metal
  • 1,040 posts
  • Location: Canada

Posted 02 February 2015 - 10:31 PM

Jesus DIED for me, on 02 February 2015 - 04:46 PM, said:


To what end? Who says that "sys" (which stands for "system") relates to the video card? I guess it wouldn't hurt to try, but I suspect that system memory resources might be handled differently from video card memory resources as far as that code is concerned. What exactly is meant by "sys_budget_videomem"? Anybody know for sure? It might relate to APU system memory allocation for graphical resources, for all we know; it would be nice to find a specific definition for it.

It's a debugging line to make the system aware of RAM and limit it, same as the

sys_enable_budgetmonitoring = 1
sys_budget_sysmem = 8192

sys_budget_videomem = 2096
sys_budget_streamingthroughput = 20560000 (the total output of my 8 GB with 8 threads)
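
For anyone who wants to experiment with these, they can be dropped into a CryEngine-style config file (user.cfg or system.cfg in the game's root folder; where a given game reads them can vary, so treat this as a sketch). A hypothetical example for a 970, assuming the memory budgets are in MB, might be:

sys_enable_budgetmonitoring = 1
sys_budget_sysmem = 8192
sys_budget_videomem = 3584
sys_budget_streamingthroughput = 20560000

The 3584 here is the idea floated above: capping the video memory budget at the 970's fast 3.5 GB segment. Whether the engine treats sys_budget_videomem as a hard allocation limit, rather than just a threshold for the budget monitoring overlay, is exactly the open question in this thread.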

#22 Kuritaclan

    Member

  • Ace Of Spades
  • 1,838 posts
  • Location: Germany

Posted 03 February 2015 - 12:18 AM

POOTYTANGASAUR, on 02 February 2015 - 01:49 PM, said:

I would hope your 980 beats my 290, since it cost over $250 more. Reviewers don't hit these clocks because they don't overvolt much. Here is one of my early Valley runs.

Check the introductory price of the 290 ($400) and compare it to the 970, which is its direct rival ($329). (For the 290X, which was introduced at $549, it is no different versus the 980.) Then, I would say, your argument falls apart.

Jesus DIED for me, on 02 February 2015 - 04:41 PM, said:

Overvolting is mostly useless on nVidia anymore because of the way the company approached their builds. True overvolting is really a dying thing.

True overclocking isn't dying. For some years now, true overclocking has included BIOS modding to push every card to its limits. So nope, no difference with the new cards.

#23 Tom Sawyer

    Member

  • The Widow Maker
  • 1,384 posts
  • Location: On your 6

Posted 03 February 2015 - 09:28 AM

Well, I am in the queue for EVGA's excellent Step-Up program. I shall be moving up to the GTX 980. Yes, it is 200 bucks more, but frack it. I tend to keep my rigs and components for a LONG time now that I have a wife and 2 kids :)

#24 Catamount

    Member

  • LIEUTENANT, JUNIOR GRADE
  • 3,305 posts
  • Location: Boone, NC

Posted 05 February 2015 - 08:43 AM

Kuritaclan, on 03 February 2015 - 12:18 AM, said:

Check the introductory price of the 290 ($400) and compare it to the 970, which is its direct rival ($329). (For the 290X, which was introduced at $549, it is no different versus the 980.) Then, I would say, your argument falls apart.


You can certainly choose to make that comparison, but I don't see how it's actually useful for determining anything. The 970 will never compete with a $400 290 or a $550 290X. It competes with a $300 290X and a $270 290, and those are good non-reference 290s, mind you. That's the market it's being put into as a product for consumer consideration... at a price of $330 (no, I'm not counting the 970 and 290X prices with mail-in rebates, even if $320 vs. $290 makes AMD's case even stronger).

The 290X does a little worse at 1080p, sure, but that gap closes to dead parity at 2560x1600, and by 4K the 970's shortcomings have boat-anchored it slightly behind the 290X and even placed it within striking distance of the 290 (it sits about 4% behind the former and ahead of the latter).


At this point, aside from a small amount of power efficiency, the reasons to consider the 970 have basically evaporated. Yes, you can concoct reasons why the 970 is supposedly "better" by comparing release prices, but arbitrary measures of "better" don't tell us which card consumers should actually buy in the here and now. At 1080p there's little reason to get a 970, and at the resolutions a new rig is likely to pair either card with today, given how cheap 1440p/1600p monitors are (the kind of resolutions that even make a card like this worthwhile in the first place), there's now no reason to get a 970.

The 970's good stock-voltage overclocks might still be a consideration for users who are big into OCing every GPU to its limits and don't mind playing the lottery. The 980 may still have its uses for the spare-no-expense crowd as well. I wouldn't pay twice as much for something like 12% more performance, but at least that might be a noticeable jump for anyone willing to pay any cost for any tangible gain. For the most part, however, the 970's shortcomings have dragged it down.

We're also now left with the question of whether governments or markets should permit Nvidia to so blatantly lie about the specs of a card. As long as people focus on reviews, it's somewhat moot, but it does mean we have to keep more of an eye on Nvidia trying to pull fast ones with cards that do great at low resolutions and choke higher up because corners were cut and nobody was told.

Edited by Catamount, 05 February 2015 - 08:47 AM.


#25 Oderint dum Metuant

    Member

  • Ace Of Spades
  • 4,758 posts
  • Location: United Kingdom

Posted 05 February 2015 - 09:58 AM

Catamount, on 05 February 2015 - 08:43 AM, said:


We're also now left with the question of whether governments or markets should permit Nvidia to so blatantly lie about the specs of a card. As long as people focus on reviews, it's somewhat moot, but it does mean we have to keep more of an eye on Nvidia trying to pull fast ones with cards that do great at low resolutions and choke higher up because corners were cut and nobody was told.


Honest question: should the reviewers have been blindsided, or should they have found this long ago?

Personally, I think they should have found it.

#26 Catamount

    Member

  • LIEUTENANT, JUNIOR GRADE
  • 3,305 posts
  • Location: Boone, NC

Posted 05 February 2015 - 10:07 AM

That's a fair point.

Yes, Nvidia lied about almost half the specs on the card, but reviewers do bear some blame for not looking at that themselves. Maybe this will teach them to eye cards more critically; the high-resolution performance numbers should have raised some red flags. Even just adding a page to reviews looking at memory bandwidth might go a long way toward figuring out what cards might actually be good at, and toward giving consumers a better picture of likely performance across the games they play.

#27 xWiredx

    Member

  • Elite Founder
  • 1,805 posts

Posted 05 February 2015 - 10:20 AM

DV McKenna, on 05 February 2015 - 09:58 AM, said:


Honest question: should the reviewers have been blindsided, or should they have found this long ago?

Personally, I think they should have found it.

I disagree. Reviewers already have very limited time for testing before they are expected to start publishing when a new product launches. All manufacturers walk a tight line between sending out samples early enough that a proper initial review can be published and not sending them so early that lots of leaks happen. As it is, most reviewers do the initial review and then finish work on higher-resolution reviews, SLI/CrossFire reviews, etc. later.

Also, just logically, one would expect higher-resolution performance to fall off, so lower numbers aren't a red flag unless they're drastically lower (and even at higher resolutions, the 970's performance doesn't drop off so hard that it's easy to tell).

#28 Catamount

    Member

  • LIEUTENANT, JUNIOR GRADE
  • 3,305 posts
  • Location: Boone, NC

Posted 05 February 2015 - 10:58 AM

In a vacuum, a performance falloff at high resolutions is obviously nothing; it's expected. You'd have a physics-breaking card if it didn't put out lower numbers at higher resolutions. But when a card loses a lot of ground to its competitors, and older competitors at that, that's a curiosity at the very least. The 970 loses something like 12 or 13 percent to the 290X, and that's quite significant; it's basically the performance separation between the 970 and 980.

When something like that happens, it wouldn't take a reviewer more than ten extra minutes to run a memory benchmark, at the very least.
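
For what it's worth, a crude probe of that sort is only a few dozen lines of CUDA. Here is a sketch in the spirit of the community tools that later exposed the 970's slow segment (the 128 MB chunk size and launch dimensions are arbitrary choices, and the driver doesn't guarantee the chunks land in physical-address order, so treat the output as a rough picture): it grabs VRAM chunk by chunk, then times a read-modify-write pass over each one; on a 970 the last few chunks should show a sharp bandwidth drop.

#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

// Read-modify-write every float in a chunk, grid-stride style.
__global__ void touch(float* p, size_t n) {
    size_t i = (size_t)blockIdx.x * blockDim.x + threadIdx.x;
    size_t stride = (size_t)gridDim.x * blockDim.x;
    for (size_t k = i; k < n; k += stride)
        p[k] = p[k] * 1.000001f + 1.0f;
}

int main() {
    const size_t chunkBytes = 128ull << 20;   // 128 MB per chunk
    const size_t n = chunkBytes / sizeof(float);
    std::vector<float*> chunks;
    // Grab VRAM chunk by chunk until allocation fails.
    for (;;) {
        float* p = nullptr;
        if (cudaMalloc(&p, chunkBytes) != cudaSuccess) break;
        chunks.push_back(p);
    }
    cudaGetLastError();   // clear the expected failed-alloc error
    cudaEvent_t t0, t1;
    cudaEventCreate(&t0);
    cudaEventCreate(&t1);
    for (size_t c = 0; c < chunks.size(); ++c) {
        touch<<<256, 256>>>(chunks[c], n);    // warm-up pass
        cudaEventRecord(t0);
        for (int r = 0; r < 10; ++r)
            touch<<<256, 256>>>(chunks[c], n);
        cudaEventRecord(t1);
        cudaEventSynchronize(t1);
        float ms = 0.0f;
        cudaEventElapsedTime(&ms, t0, t1);
        // 10 passes, each reading and writing the whole chunk.
        double gbps = 10.0 * 2.0 * (double)chunkBytes / (ms / 1000.0) / 1e9;
        printf("chunk %2zu (starts at %4zu MB): %7.1f GB/s\n",
               c, c * (chunkBytes >> 20), gbps);
    }
    for (float* p : chunks) cudaFree(p);
    cudaEventDestroy(t0);
    cudaEventDestroy(t1);
    return 0;
}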

#29 xWiredx

    Member

  • Elite Founder
  • 1,805 posts

Posted 05 February 2015 - 11:09 AM

Looking at benchmarks from AnandTech, I see a difference of about 4-8% at 4K when DX11 is involved. I discount results from Mantle tests, since Nvidia cards do not use Mantle (so it introduces a second variable into an otherwise controlled test). That is comparing the 290X in "uber" mode vs. the 970. That isn't red-flag worthy, IMO.

#30 Kuritaclan

    Member

  • Ace Of Spades
  • 1,838 posts
  • Location: Germany

Posted 05 February 2015 - 12:03 PM

Catamount, on 05 February 2015 - 10:07 AM, said:

Maybe this will teach them to eye cards more critically; the high-resolution performance numbers should have raised some red flags.

Um, the weakness at high resolutions has been a flaw of the architecture since Kepler, so it is no news to the public. Since Maxwell is a better version of Kepler, it did not raise any flags; it was expected, and it was confirmed via tests.

Edited by Kuritaclan, 05 February 2015 - 12:08 PM.


#31 DarthPeanut

    Member

  • Liquid Metal
  • 861 posts

Posted 09 February 2015 - 02:00 PM

Jesus DIED for me, on 02 February 2015 - 04:41 PM, said:

Let's hope some modders come out with modified BIOSes for nVidia cards (and easy ways to install them) that let you set whatever limit(s) you might want. That's not to say the latest Maxwell offerings are bad.. it's just.. that.. they are no longer perfect, as far as an avid overclocker is concerned.


They already have. BIOS mods on Maxwell-based cards have been around since just after the release of the 750 Ti.

For instance, I was tinkering with the hobby of cryptocurrency mining when the 750 Ti was released. I was mining with a rig of AMD cards, but I picked up a Gigabyte 750 Ti OC model as soon as they came in stock on Newegg to play with, since its ratio of power consumption (wattage) to potential hashrate showed a lot of promise. Everyone doing similar things soon figured out that when overclocking, despite the advertised 60 W TDP, it would pretty rapidly hit 100% power consumption and start throttling back. Pulling the BIOS with GPU-Z and opening it up with Kepler BIOS Tweaker, people found that the cards really had a limit of around 38 W TDP: obviously much lower than advertised, and the reason they would hit 100% power consumption so quickly and throttle back.

Anyway, long story short: I edited the BIOS power tables to raise the max to 65 W, reflashed, and the problem was solved. It's a pretty simple process using GPU-Z and Kepler BIOS Tweaker. I overclocked it until it got unstable, backed it down a little, and then ran it nonstop for quite a long while without any more issues. Great little card, and it's still running strong in a machine I built for a family member after I got bored with that stuff.
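
For anyone curious about the reflash step itself: beyond the GUI work in GPU-Z and Kepler BIOS Tweaker, the command-line part is just a backup and a flash. A rough sketch using NVIDIA's nvflash utility follows; exact flag names vary between nvflash versions, so treat these as assumptions and check your version's help output first:

nvflash --save original.rom (back up the stock BIOS before touching anything)
nvflash --protectoff (some versions require disabling the EEPROM write protect first)
nvflash modded.rom (flash the edited BIOS, then reboot)

As always with BIOS flashing, a bad flash can brick the card, so keep the backup somewhere safe.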

Edited by DarthPeanut, 09 February 2015 - 02:07 PM.





