
Do You Think MechWarrior Should Support A Lower Graphical Setting To Give Players Higher FPS?


167 replies to this topic

Poll: MechWarrior should offer better performance. (374 members have cast votes)

Do you think MechWarrior should support a lower graphical setting to give players a higher FPS?

  1. Yes: 210 votes (50.97%)

  2. No: 78 votes (18.93%)

  3. For players who need it: 113 votes (27.43%)

  4. FPS means Frames Per Second?: 11 votes (2.67%)

What graphic settings do you play MechWarrior on?

  1. Low: 170 votes (40.96%)

  2. Medium: 59 votes (14.22%)

  3. High: 27 votes (6.51%)

  4. Very High: 31 votes (7.47%)

  5. Maxed: 102 votes (24.58%)

  6. I don't know: 8 votes (1.93%)

  7. Pie: 18 votes (4.34%)

(Guests cannot vote.)

#81 Catamount

    Member

  • LIEUTENANT, JUNIOR GRADE
  • 3,305 posts
  • Location: Boone, NC

Posted 04 May 2013 - 06:53 PM

Oh shooters on laptops are miserable, believe me. I do it occasionally because I'm stuck at school for hours on end sometimes with free time. Home's quite a commute for me, because I live far off campus and park on a campus lot only reasonably accessible by bus.

When I have the desktop, though, I definitely don't even look at the laptop.


If PGI thought they could really push the GPU requirements down, so far that it might start including people with even slower integrated graphics, that would expand their market, yes, but do you really think that's even possible? I mean, by the time they went to sufficient lengths, would the game even be playable? They'd have to maintain view distances, object placement (so as not to give an unfair advantage to one side), and many of the effects (again, they can't just make things like smoke "go away" at low settings), and things in game would not only have to stay present, but would have to look the same, again, so as not to confer visibility advantages.

As it is, I think the number of systems that can support the game, at least on paper (that is to say, absent silly performance glitches), is really pretty huge. I'm not sure how much further they could go without compromising the game. Low is already pretty darned ugly ;)

Edited by Catamount, 04 May 2013 - 06:54 PM.


#82 Lord of All

    Member

  • Knight Errant
  • 581 posts
  • Google+: Link
  • Location: Bottom Of a Bottle

Posted 04 May 2013 - 06:54 PM

Catamount, on 04 May 2013 - 06:53 PM, said:

If PGI thought they could really push the GPU requirements down, so far that it might start including people with even slower integrated graphics, that would expand their market, yes, but do you really think that's even possible? I mean, by the time they went to sufficient lengths, would the game even be playable? [...]

Playable is a relative term. ;)

Me and You, No

#83 Bad Karma 308

    Member

  • Legendary Founder
  • 411 posts

Posted 04 May 2013 - 08:33 PM

Lord of All, on 04 May 2013 - 06:32 PM, said:

Yes, a great argument for all res options.


I see no reason to limit the game to DX11 only. Where did you see that? DX9 should still be an option when DX11 is implemented, notwithstanding the fact that if you run in DX9 you will not receive the DX11 or DX10 (Shader Model 3.0, IIRC) benefits.


I'll go back to a scenario I've used before: DX11's destructible environments, set up as they would apply to MWO. You've got a light mech and you're running DX9.

Let's say you're using a small hill between you and the enemy team as cover. The enemy team is running DX11. Because their DX11-capable GPUs can handle the destructible environment, they can make use of it by blasting away at the hill and eventually removing it.

Now they have a direct line of sight to you. However, your system is on DX9 and can't handle the destructible environment: you wouldn't see that the hill between you and them is now gone and that you're in their direct line of fire. It would be massively unfair to the player base.


So you'd be dead without ever knowing how they managed to hit you through the terrain. That is just one instance, but the same could be said for destructible buildings and foliage.
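
The unfairness described here is exactly why engines keep gameplay-relevant state out of the renderer. A minimal sketch (hypothetical names, not PGI's or CryEngine's actual code) of a line-of-sight check answered from one authoritative terrain state shared by all clients, whatever DirectX level each one renders with:

```python
# Hypothetical illustration: gameplay queries (line of sight, hit tests)
# must read the shared world state, never what an individual client's
# renderer happens to be capable of drawing.

class TerrainState:
    """The single authoritative copy of the map that all clients share."""
    def __init__(self):
        self.destroyed_ids = set()  # terrain features removed by weapons fire

    def destroy(self, obj_id):
        self.destroyed_ids.add(obj_id)

def blocks_line_of_sight(terrain, obj_id):
    # The hill blocks fire only while it still exists in the shared state.
    return obj_id not in terrain.destroyed_ids

terrain = TerrainState()
print(blocks_line_of_sight(terrain, "hill_07"))  # True: the hill is cover

terrain.destroy("hill_07")  # the DX11 players blast the hill away
print(blocks_line_of_sight(terrain, "hill_07"))  # False: for EVERY client
```

If a DX9 client kept drawing the hill while the shared state said it was gone, you'd get precisely the "shot through terrain" deaths described in the scenario.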

The finer addressable triangles available through tessellation also mean that far more precise hit boxes and mech damage can be applied. So instead of a mech having a pretty big head hit box, as in DX9, DX11 could actually address individual armor plates and windscreens.

The gap in play experience between the two DirectX APIs makes it more akin to a console player going head-to-head with a PC gamer. Also, don't forget that DX11 can allow portions of the GPU's cores to be allocated toward physics processing; not that PGI is going that route, as far as I know, but they could if they wanted or needed to. It makes the experiences far enough apart as to almost make them two separate games.

There are ways to try to make it play "more" fairly, but then you'd be dumbing down the game engine that PGI paid a huge fee to license precisely to get those enhanced benefits.

#84 Quinn Allard

    Member

  • Veteran Founder
  • 278 posts
  • Location: USA

Posted 05 May 2013 - 06:55 PM

Juicebox12, on 03 May 2013 - 07:45 PM, said:

Worst analogy ever.....



Um, no. It was a good analogy. How many console players do you know who say BF3 looks amazing? Or Mustang owners who think their car is fast? Anyone who owns a decent gaming rig knows that any game on current consoles looks terrible, and that Mustangs (stock ones, at least) are slow. My wife thinks my PowerStroke diesel truck is fast, and that Tomb Raider on the Xbox looked "lifelike." Then I took her for a ride in my dad's LS9 Corvette and showed her Tomb Raider on my PC. According to her, the difference was "like seeing a picture of the Grand Canyon, then going there and seeing it for yourself."

The real-world difference is the personal experience. When I worked in the video game industry years ago, people would ask me, "So what's better, the Xbox or the PlayStation?" I would tell them it's all about personal experience. Playing Madden on a PS3 on a big screen in that moment is amazing; so is playing Forza on the Xbox on a big screen.

Saying that you can't tell the difference between 24 FPS and 100 FPS is a lie. Think back to the 8- and 16-bit gaming days. I remember playing games that I thought looked amazing, especially when the N64 came out. I still have those old systems; I hooked up my N64 and tried to play GoldenEye... I got a headache ten minutes in. It looked terrible. When I came back from Skip Barber, my father let me drive his Corvette; after driving 360s, Gallardos, and 911s, it was terrible.

It's the progression of our society. T1 used to be fast; now we complain if our connection is at T1 speeds. We would complain if our cell phone couldn't get 3G; now we complain if we are getting 3G and not 4G. All these complainers are like people who buy a cheap Dean acoustic and expect it to sound like a Taylor.

#85 Juicebox12

    Member

  • Ace Of Spades
  • 142 posts

Posted 05 May 2013 - 09:21 PM

Quinn Allard, on 05 May 2013 - 06:55 PM, said:

Saying that you can't tell the difference between 24 FPS and 100 FPS is a lie.


Comparing vehicle acceleration and top speed to frame rates is stupid. Unless your car goes from 30 mph to 20 and back to 30 in milliseconds, simulating frame rate drops or inconsistency in individual frame draw times, the comparison doesn't hold. Everyone is different, and I can assure you I can tell the difference if anything is under 100 FPS. Ask a lot of people who play on higher-end rigs and are used to playing at proper rates; this is the main reason I only use my PS3 for Netflix... I cannot stand how slowly that POS renders. And FYI, Corvettes are horribly built cars. Who wants a car whose body is attached to the frame by glue and four 8 mm bolts?
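
The frame-time arithmetic behind this point is worth spelling out; a small illustrative snippet (example numbers, not measurements):

```python
# A frame rate is just the reciprocal of the time each frame takes to
# draw, so "rate drops" and "inconsistent frame draw times" are the
# same thing seen from two angles.

def frame_time_ms(fps):
    return 1000.0 / fps

print(frame_time_ms(24))   # ~41.7 ms per frame
print(frame_time_ms(100))  # 10.0 ms per frame

# One 100 ms hitch at a nominal 100 fps swallows ten frames' worth of
# time, which is why uneven frame times feel worse than a steady but
# lower average.
print(100.0 / frame_time_ms(100))  # 10.0 frames lost to a single hitch
```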

Edited by Juicebox12, 05 May 2013 - 09:26 PM.


#86 Lord of All

    Member

  • Knight Errant
  • 581 posts
  • Google+: Link
  • Location: Bottom Of a Bottle

Posted 06 May 2013 - 08:20 AM

Quinn Allard, on 05 May 2013 - 06:55 PM, said:



Um, no. It was a good analogy. [...] All these complainers are like those people who buy a cheap Dean acoustic and expect it to sound like a Taylor.

Juicebox12, on 05 May 2013 - 09:21 PM, said:


Comparing vehicle acceleration and top speed to frame rates is stupid. [...] And FYI, Corvettes are horribly built cars.

Worst Sidetrack argument ever. :ph34r:

#87 Volume

    Member

  • The Privateer
  • 1,097 posts

Posted 06 May 2013 - 08:52 AM

Yes, yes, please, yes.

CryEngine 3 can scale down further than the game's current "lowest," and the Scaleform HUD is still the biggest resource hog in the game so far.

Also, regarding the "you can't see more than 25 FPS" post on page 1: please try playing something like Quake Live or UT2K4 at 25 FPS, then at 125 FPS on a 120 Hz display, and compare the difference.

#88 Catamount

    Member

  • LIEUTENANT, JUNIOR GRADE
  • 3,305 posts
  • Location: Boone, NC

Posted 06 May 2013 - 09:58 AM

Lord of All, on 06 May 2013 - 08:20 AM, said:


Worst Sidetrack argument ever. :lol:


I was just thinking the same thing :(

#89 Dan Nashe

    Member

  • Philanthropist
  • 606 posts

Posted 06 May 2013 - 12:35 PM

Lord of All, on 06 May 2013 - 08:20 AM, said:


Worst Sidetrack argument ever. :(


There's only one way to settle this argument. Someone needs to buy me a Corvette. And a Porsche.
Let me know.

On topic: I think wide audiences are a goal, and people can certainly tell the difference between 25 and 45 FPS. I do think developers care too much about pretty graphics when I'd be happy with a high FPS. Heck, I switch animated tanks to flat cardboard counters in games just for better information presentation.

That said, obviously the devs can't allow competitive advantages for lower or higher settings.
And dev time is finite.
Nonetheless, PC games like MWO need big player bases to survive because of the PvP and social elements, so everyone benefits if the player base can be expanded by allowing settings that simply look worse but don't affect gameplay.

That said, I suspect the critical performance decisions were made long ago and PGI can't dramatically change things at this point. So I won't get too worked up about it.

I'm still convinced modern RTS games tend to suck because they are zoomed in too far to show off pretty graphics, which I hardly care about; I want a bigger darn visual field. So I hold old grudges :-).

And I'm pretty sure the number of players running PC games on non-optimized, two-year-old equipment is actually quite high. But I lack real data.

#90 warp103

    Member

  • Bad Company
  • 342 posts
  • Facebook: Link
  • Location: Daytona Beach, FL

Posted 06 May 2013 - 02:24 PM

Catamount, on 04 May 2013 - 02:32 PM, said:


Your post is based on a premise that is not, itself, true: that PGI could notably expand the systems that can play the game by introducing a lower setting.

When the game isn't suffering technical issues, it'll play on almost any GPU under the sun. Midrange laptop GPUs from four years ago are sufficient. Where this isn't the case, due to the game's bugginess (it will sometimes just decide to perform poorly), the solution is to fix the bugs, not try to dodge around them with a lower graphics setting.

When the game is running properly, then it comes back to my point, which I'll reiterate yet again: This game is picky about CPUs, not GPUs. If you don't have the incredibly modest GPU capable of playing this game, you will not likely have the CPU to play this game and introducing a new graphical setting will not alleviate the CPU requirement for this game. Only the introduction of DX11 will do that.


Exactly what kind of system setup would benefit here? Really, I want to know what system configuration you have in mind with this suggestion.

OK, I have to call bullbucky on this CPU issue. I have posted this before:

"You are missing the point. I have three computer towers. The first is a 4-core AMD 9950 at 2.5 GHz (OC) with 4 GB of memory, an AMD HD 7650 GPU, a 2 TB drive, and a 250 GB SSD game drive. The next is a Phenom II X4 965 BE at 3.4 GHz (OC) with 8 GB of memory, an AMD 8760 GPU, a 2 TB drive, and a 250 GB SSD game drive. The last is an i7 995 @ 3.60 GHz with 32 GB of memory, an AMD Radeon HD 8970 (I forgot, I removed the Nvidia card and transposed the "89"; edit), four 2 TB drives, and two 250 GB SSDs (only one for games).
The first two do not get over 15 FPS in this game; the third gets 50 FPS."


Do not give me this low-spec garbage. All of them should get a minimum of 30 FPS, with the second getting about 45 FPS and the last 60+. So again, you get a [Posted Image], because this is not an issue anyone who has the minimum spec should be having. Stop blaming other machines and turn to the devs for this issue. Oh, and I upgraded my first computer from a 7450 to the 7650 and, wait for it, "the same damn FPS." And not one of them maxes the CPU, so a video card change should have upped the FPS.

None of the above systems hits 50% of the CPU. If it were a CPU issue (a bottleneck), it would hit over 90%. Please stop calling it a CPU issue, or for that matter a GPU issue; it is straight up and down a coding issue. Note: I did a test using 30 computers of all types (the minimum was a Phenom II X4 805 with a 7450) and various video cards. The lowest is above the "min spec" and should give you 30 FPS on low settings.

Stop blaming the computer and start blaming MWO. It is not CryEngine 3; I have played other games (Nexuiz, Crysis 2) on the above systems (with the 9560 CPU I get 24 FPS) and never had this bad an FPS.

Edited by warp103, 06 May 2013 - 02:37 PM.


#91 ArmageddonKnight

    Member

  • FP Veteran - Beta 2
  • 710 posts

Posted 06 May 2013 - 02:30 PM

You need to understand that a CPU bottleneck does not require 90%+ usage.

I've had to explain this so many times on the GW2 tech forums, thanks to the CPU hog that GW2 is and the fact that so many people say, "but my CPU is only at X%." Honestly, I can't be bothered to explain it anymore, so I suggest you go online and read up on CPUs, how threads work, and how they can get bottlenecked without their overall usage maxing out.
Even my 3930K bottlenecks in GW2. Yes, the GW2 engine is poor; it doesn't fully utilize an i7 (hence why even my 3930K can't keep GW2 at 60 FPS in WvW). But the threads it does utilize get munched up quickly in heavy situations, so you need fast single-thread performance.

Also, a 1st-gen Intel i5 or i7 is a far cry from the performance of a 2nd- or 3rd-gen i5 or i7.
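
The point about bottlenecks without maxed usage can be shown with trivial arithmetic. A minimal sketch (hypothetical function, not a profiler) of why one saturated game thread reads as low aggregate CPU usage in Task Manager:

```python
# If a game is limited by one thread, only one core is busy. On a
# quad-core, Task Manager then reports ~25% aggregate usage even though
# the game cannot run any faster: a genuine CPU bottleneck.

def aggregate_usage(busy_threads, cores):
    """Aggregate CPU % when `busy_threads` cores are fully saturated."""
    return 100.0 * min(busy_threads, cores) / cores

print(aggregate_usage(1, 4))  # 25.0 -> one maxed thread on a quad-core
print(aggregate_usage(1, 8))  # 12.5 -> looks even "idler" with more cores
print(aggregate_usage(4, 4))  # 100.0 -> only here does the naive check fire
```

So the "it never hits 90%" test only detects bottlenecks in software that scales across every core, which game code of this era generally does not.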

Edited by ArmageddonKnight, 06 May 2013 - 02:31 PM.


#92 Juicebox12

    Member

  • Ace Of Spades
  • 142 posts

Posted 06 May 2013 - 02:36 PM

warp103, on 06 May 2013 - 02:24 PM, said:

You are missing the point. I have three computer towers. The first is a 4-core AMD 9950 at 2.5 GHz (OC) with 4 GB of memory, an AMD HD 7650 GPU, a 2 TB drive, and a 250 GB SSD game drive. The next is a Phenom II X4 965 BE at 3.4 GHz (OC) with 8 GB of memory, an AMD 8760 GPU, a 2 TB drive, and a 250 GB SSD game drive. The last is an i7 995 @ 3.60 GHz with 32 GB of memory, an AMD Radeon HD 8970 (I forgot, I removed the Nvidia card and transposed the "89"; edit), four 2 TB drives, and two 250 GB SSDs (only one for games).
The first two do not get over 15 FPS in this game; the third gets 50 FPS.


You have two unreleased GPUs? Interesting... Your whole argument is invalid now. And your i7 is a triple-channel-generation CPU, so you aren't running in triple channel with 32 GB of RAM. Come to think of it, anyone who calls computers "towers" has no idea what he's talking about when it comes to computers.

Oh well. So much for making sense.

Edited by Juicebox12, 06 May 2013 - 02:38 PM.


#93 warp103

    Member

  • Bad Company
  • 342 posts
  • Facebook: Link
  • Location: Daytona Beach, FL

Posted 06 May 2013 - 02:46 PM

Juicebox12, on 06 May 2013 - 02:36 PM, said:


You have two unreleased GPUs? Interesting... Your whole argument is invalid now. And your i7 is a triple-channel-generation CPU, so you aren't running in triple channel with 32 GB of RAM. Come to think of it, anyone who calls computers "towers" has no idea what he's talking about when it comes to computers.

Oh well. So much for making sense.

lol, all these GPUs are released; you need to look them up better. And I have 8 GB of memory; it was a gamer rig that was custom built as my server. So go look it up. Just because it is not a normal setup, or you cannot get it, does not make it my issue.

Edited by warp103, 06 May 2013 - 02:50 PM.


#94 Catamount

    Member

  • LIEUTENANT, JUNIOR GRADE
  • 3,305 posts
  • Location: Boone, NC

Posted 06 May 2013 - 02:47 PM

warp103, on 06 May 2013 - 02:24 PM, said:

Oh and I upgraded my first comp from a 7450 to the 7650 and Wait for it "The same damn FPS".


Amidst all the incoherence I managed to pull this out, and go figure, it's the most clearly written sentence in the entire tirade.

If you upgraded your GPU and didn't get a framerate increase, then it means your GPU wasn't holding you back in the first place, which means you have a CPU bottleneck. Thank you for making my point for me, although you could have done it a tad bit more concisely.

Also, I was already showing this in posts with various tests months ago, so it's no great revelation.

As far as the rest of your post goes:

Posted Image

Edited by Catamount, 06 May 2013 - 02:56 PM.


#95 warp103

    Member

  • Bad Company
  • 342 posts
  • Facebook: Link
  • Location: Daytona Beach, FL

Posted 06 May 2013 - 03:06 PM

Catamount, on 06 May 2013 - 02:47 PM, said:


Amidst all the incoherence I managed to pull this out, and go figure, it's the most clearly written sentence in the entire tirade.

If you upgraded your GPU and didn't get a framerate increase, then it means your GPU wasn't holding you back in the first place, which means you have a CPU bottleneck. Thank you for making my point for me, although you could have done it a tad bit more concisely.

Also, I was already showing this in posts with various tests months ago, so it's no great revelation.

As far as the rest of your post goes:

Posted Image

Wow, did you miss that none of the CPUs got to 50% power? It is not a CPU issue. For it to be a CPU issue, the game would have to hit 90% power, and it never does. A change in GPU from a 7450 to a 7650 is not a big jump, and the FPS went up 2 frames; that is a margin I call zero. Second, I said 30 computers total were tested, all above minimum spec, so why are these not getting 30 FPS?

#96 Juicebox12

    Member

  • Ace Of Spades
  • 142 posts

Posted 06 May 2013 - 03:40 PM

warp103, on 06 May 2013 - 02:46 PM, said:

lol, all these GPUs are released; you need to look them up better. And I have 8 GB of memory; it was a gamer rig that was custom built as my server. So go look it up. Just because it is not a normal setup, or you cannot get it, does not make it my issue.


The 8-series GPUs do not exist yet... You are losing your mind. Get off the drugs and start making sense.

#97 Catamount

    Member

  • LIEUTENANT, JUNIOR GRADE
  • 3,305 posts
  • Location: Boone, NC

Posted 06 May 2013 - 04:02 PM

warp103, on 06 May 2013 - 03:06 PM, said:

Wow, did you miss that none of the CPUs got to 50% power? It is not a CPU issue. For it to be a CPU issue, the game would have to hit 90% power, and it never does.


Right, maybe that's how it works in the world you concoct in your head from the comfort of your armchair, but here in the corporeal world a game does not need to "max" a CPU to be bottlenecked on it, and often it isn't: a CPU simply cannot dedicate its free resources to executing the code faster, due to issues such as poor code parallelization.

The way to spot a CPU bottleneck in gaming is not your juvenile method of looking at the task manager and seeing if software is "maxing" your CPU. It's done by removing the GPU as a bottleneck, usually by comparing different in-game settings, and seeing what happens. I did that, showing that regardless of settings, even a stock i5-3570k was able to drag the game into the mid 40s.
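
That settings-comparison test can be sketched as a rule of thumb (hypothetical framerates and threshold, not a benchmark harness):

```python
# Drop the graphics settings and watch the framerate. If lightening the
# GPU's load barely moves the number, the GPU was never the limiting
# factor; the CPU (or the code running on it) was.

def likely_bottleneck(fps_at_low, fps_at_high, tolerance=0.10):
    """Classify the limiter from framerates at low vs. high settings."""
    relative_gain = (fps_at_low - fps_at_high) / fps_at_high
    return "cpu" if relative_gain < tolerance else "gpu"

print(likely_bottleneck(46, 44))  # cpu: lowering settings gained ~5%
print(likely_bottleneck(90, 45))  # gpu: halving the load doubled the rate
```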

Quote

A change in gpu from7450 to 7650 it not a big jump


ORLY?

Well first off, there is no such thing as a Radeon HD 7650, so you're probably referring to the 7670, and secondly, a 7450 to a 7670 is a huge jump, though I'd love to know how you got hold of an OEM GPU as an upgrade.

The 7670 has about twice as many transistors (716 million vs 370 million), three times as many stream processors (480 vs 160), three times as many TMUs (24 vs 8), twice as many ROPs (8 vs 4), and five times as much memory bandwidth (64GB/s vs 12.8).

Please, tell me more about how the 7600 series isn't a "big jump" over the 7400 series. I mean, it only has two to three times the processing power, after all (3.2 theoretically, 768 GFLOPS vs 240 GFLOPS).
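
Those theoretical-throughput figures come from the usual back-of-the-envelope formula: GFLOPS = stream processors x clock x 2 ops per clock (multiply-add). A quick check, assuming the reference clocks of 800 MHz for the 7670 and 750 MHz for the 7450:

```python
# Peak single-precision throughput for a Radeon of this era:
# shaders x clock (MHz) x 2 FLOPs per clock (multiply-add), in GFLOPS.

def gflops(stream_processors, clock_mhz, flops_per_clock=2):
    return stream_processors * clock_mhz * flops_per_clock / 1000.0

print(gflops(480, 800))  # 768.0 -> the 7670 figure quoted above
print(gflops(160, 750))  # 240.0 -> the 7450 figure quoted above
print(gflops(480, 800) / gflops(160, 750))  # 3.2x theoretical advantage
```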

#98 warp103

    Member

  • Bad Company
  • 342 posts
  • Facebook: Link
  • Location: Daytona Beach, FL

Posted 06 May 2013 - 10:41 PM

Catamount, on 06 May 2013 - 04:02 PM, said:


Right, maybe that's how it works in the world you concoct in your head from the comfort of your armchair, but here in the corporeal world a game does not need to "max" a CPU to be bottlenecked on it, and often it isn't: a CPU simply cannot dedicate its free resources to executing the code faster, due to issues such as poor code parallelization.

The way to spot a CPU bottleneck in gaming is not your juvenile method of looking at the task manager and seeing if software is "maxing" your CPU. It's done by removing the GPU as a bottleneck, usually by comparing different in-game settings, and seeing what happens. I did that, showing that regardless of settings, even a stock i5-3570k was able to drag the game into the mid 40s.



ORLY?

Well first off, there is no such thing as a Radeon HD 7650, so you're probably referring to the 7670, and secondly, a 7450 to a 7670 is a huge jump, though I'd love to know how you got hold of an OEM GPU as an upgrade.

The 7670 has about twice as many transistors (716 million vs 370 million), three times as many stream processors (480 vs 160), three times as many TMUs (24 vs 8), twice as many ROPs (8 vs 4), and five times as much memory bandwidth (64GB/s vs 12.8).

Please, tell me more about how the 7600 series isn't a "big jump" over the 7400 series. I mean, it only has two to three times the processing power, after all (3.2 theoretically, 768 GFLOPS vs 240 GFLOPS).

lol, OMG, it is an OEM card, just like the 7450, from HP. Damn, you do not know jack. If you need a link to it, I can send it to you. From what I am told it is a rebranded card, and so is the 7450. But I guess I have two phantom cards because you cannot find them, ROFL. As for finding them: as a computer tech and a computer company owner, I get leftover stock from Dell and HP sent to me to test. But if you look on eBay you will find both too (look for completed listings in the last 6 months). Hell, even Newegg has one of them for 135 dollars.

Edited by warp103, 06 May 2013 - 10:48 PM.


#99 Catamount

    Member

  • LIEUTENANT, JUNIOR GRADE
  • 3,305 posts
  • Location: Boone, NC

Posted 07 May 2013 - 05:27 AM

warp103, on 06 May 2013 - 10:41 PM, said:

lol, OMG, it is an OEM card, just like the 7450, from HP. Damn, you do not know jack. If you need a link to it, I can send it to you. From what I am told it is a rebranded card, and so is the 7450. But I guess I have two phantom cards because you cannot find them, ROFL. As for finding them: as a computer tech and a computer company owner, I get leftover stock from Dell and HP sent to me to test. But if you look on eBay you will find both too (look for completed listings in the last 6 months). Hell, even Newegg has one of them for 135 dollars.


http://en.wikipedia...._HD_7000_series

https://www.google.c...lient=firefox-a

http://search.amd.co...llection=all-us

Apparently, neither Wikipedia, nor Google, nor AMD is aware of this fictional GPU of yours.

Ebay doesn't seem to be aware of it either, despite your claim
http://www.ebay.com/...D+7650&_sacat=0



There is a 7650 mobile part, but you didn't say 7650M or 7450M, you said 7650, and you explicitly identified the machine as a full-fledged desktop, meaning it wouldn't accept a mobile part anyways.


Either way, your assertion that a 7600 series is not a "big jump" from a 7400 series is wrong, as the two series are far apart in performance, with the former being more than twice as powerful as the latter. The 7670 (the only real 7600-series desktop part) is three times as fast as the 7450. The mobile 7650 is similarly advantaged, being around twice as fast as the mobile 7450, though, again, we're clearly not talking about mobile parts here.



So let's see. You haven't managed to make a point, you refer to fictional GPUs to make your non-point, and you don't bother to actually look up your GPUs before you claim that one isn't much faster than the other (even when the difference is enormous). Did I miss anything?


I think you've had your fun. Now why don't you let the adults continue the discussion from here, okay?

Edited by Catamount, 07 May 2013 - 05:37 AM.


#100 Juicebox12

    Member

  • Ace Of Spades
  • 142 posts

Posted 07 May 2013 - 05:28 AM

This warp guy is a chronic liar. You still haven't addressed the fact that you "own" two computers with video cards that haven't passed the production phase yet. None of your arguments shows any prior PC experience. Then we come down to the issue that people with lesser PCs than your quoted rigs are getting higher framerates at maximum settings on a regular basis. Yes, the game is demanding, but it is not downright impossible to run at decent framerates. I have friends running old Q6600s with 6750s getting 50 FPS on medium settings. That's a three-to-four-generation-old, mid-grade quad core, just to clarify, since you have no grasp of hardware.




