
User.cfg For Low-End Computers Doesn't Help Fps?


21 replies to this topic

#1 Felio

    Member

  • PipPipPipPipPipPipPipPip
  • Philanthropist
  • 1,721 posts

Posted 17 January 2014 - 06:34 PM

It worked great on my laptop, but there is no change at all on my girlfriend's.

I'm using the "Lowest Pro" config I found here: http://mwomercs.com/...53#entry1446553

It might be worth mentioning my laptop has a dedicated Nvidia card, and hers has only Intel HD Graphics. I don't think it should make a difference, but maybe it does.
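For anyone who hasn't seen one of these configs: user.cfg is just a plain text file dropped into the MWO install folder, with one engine cvar per line forcing things down below the in-game minimums. I won't paste the whole file here (follow the link for the real one), but it's roughly this sort of thing; the exact names and values below are from memory, so treat them as approximate and go by the linked post:

sys_spec_ObjectDetail = 1
sys_spec_Particles = 1
sys_spec_PostProcessing = 1
sys_spec_Shading = 1
sys_spec_Shadows = 1
sys_spec_Texture = 1
e_Shadows = 0
e_ViewDistRatio = 30
r_MotionBlur = 0
r_VSync = 0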

#2 Koniving

    Welcoming Committee

  • PipPipPipPipPipPipPipPipPipPipPipPipPip
  • The Guide
  • 23,384 posts

Posted 17 January 2014 - 06:47 PM

Intel HD graphics... is NOT a graphics card. It's simply a video port that consumes processor power to 'try' and simulate a graphics card.

Given that MWO is boatloads more processor intensive than most high end modern games, a video card emulation that eats up processor power only makes it even worse.

She needs a real video card. (A graphics card that used to be high end in 1998 would have better performance. Yes, it really would.)

Edited by Koniving, 17 January 2014 - 06:49 PM.


#3 Felio

    Member

  • PipPipPipPipPipPipPipPip
  • Philanthropist
  • 1,721 posts

Posted 17 January 2014 - 07:01 PM

Hey, Koniving, I saw you in a game last night. I was like, "Hey, it's the guy from the forums!" I don't remember who won. :unsure:

I know it's not a graphics card, it just seems like telling the game to do less work should reduce the load on whatever you are using. Unless the Intel HD Graphics is just not doing the biggest things that the user.cfg file reduces in the first place.

Edited by Felio, 17 January 2014 - 07:02 PM.


#4 Koniving

    Welcoming Committee

  • PipPipPipPipPipPipPipPipPipPipPipPipPip
  • The Guide
  • 23,384 posts

Posted 17 January 2014 - 07:14 PM

Even at minimum settings, that only tweaks graphical aspects. Trouble is, maxed out or at bare minimum, you haven't got any graphical ability. The entire thing is just additional processor load, with a game that -- due to how it was made -- is doing as much work as an online Arma II multiplayer session, where, in the words of Rocket, "the Arma II engine would constantly feed all the information of every player, every tree, every building, every vehicle, every zombie, animal, and piece of loot to each player whether nearby or thousands of kilometers away."

MWO... in terms of processor load, rivals it. Celerons and other non-high-end processors can barely run the game even with great video cards and copious amounts of RAM.

But with something that requires some of your processor's power just to do the graphics, it's sometimes a miracle it works at all.

MIN:  OS: WinXP SP3 | CPU: Core 2 Duo 2.66GHz / X2 245e | RAM: 4GB | Video: 8800GT / HD 5600 | HD: 8GB | DX: 9
BEST: OS: Windows 7 | CPU: i3-2500 / X4 650 | RAM: 8GB | Video: GTX 285 / HD 5830 | HD: 8GB | DX: 9

Mine is:
OS: Windows 7 | CPU: AMD Phenom II X4 3.60GHz | RAM: 16GB | Video: HD 6850 | HD: lots | DX: 11

(The forum doesn't seem to like keeping the spaces/formatting.)

Edited by Koniving, 17 January 2014 - 07:24 PM.


#5 Lucky Noob

    Member

  • PipPipPipPipPipPipPipPip
  • The Sovereign
  • 1,149 posts

Posted 17 January 2014 - 08:22 PM

I just wanna remind everyone about the incoming DX11.

#6 Felio

    Member

  • PipPipPipPipPipPipPipPip
  • Philanthropist
  • 1,721 posts

Posted 17 January 2014 - 10:12 PM

Now this is getting weird. When I tested it on my laptop, I went from 17 FPS to more than 60.

Playing again without changing anything, I'm back at my usual 17.

I've seen mention of setting OptionCfg.sys_spec to 0 in the attributes.xml file to enable a custom user.cfg, but I don't see that variable in the file. Maybe they took it out and it is no longer necessary? (Edit: actually I found it, trying that...)
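For reference, the entry I eventually found looks roughly like this; I'm reconstructing the surrounding XML from memory, so search the file for OptionCfg.sys_spec rather than pasting this in blindly. Setting the value to 0 is what is supposed to mark the profile as "custom" so the user.cfg overrides actually stick:

<Attr name="OptionCfg.sys_spec" value="0"/>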

Edited by Felio, 17 January 2014 - 10:39 PM.


#7 ApolloKaras

    Member

  • PipPipPipPipPipPipPipPip
  • Ace Of Spades
  • 1,974 posts
  • Location: Seattle, Washington

Posted 17 January 2014 - 10:42 PM

Koniving, on 17 January 2014 - 06:47 PM, said:

Intel HD graphics... is NOT a graphics card. It's simply a video port that consumes processor power to 'try' and simulate a graphics card.

Given that MWO is boatloads more processor intensive than most high end modern games, a video card emulation that eats up processor power only makes it even worse.

She needs a real video card. (A graphics card that used to be high end in 1998 would have better performance. Yes, it really would.)



That's not quite accurate any longer. Even Intel has come a long way with the HD series.




#8 Koniving

    Welcoming Committee

  • PipPipPipPipPipPipPipPipPipPipPipPipPip
  • The Guide
  • 23,384 posts

Posted 18 January 2014 - 07:47 AM

Saxie, on 17 January 2014 - 10:42 PM, said:



That's not quite accurate any longer. Even Intel has come a long way with the HD series.





That's a graphics card. Was he talking about Integrated Graphics HD (not a card, part of the motherboard), or the HD graphics card? There's a huge difference between the two, and almost always the laptops have the integrated graphics.

"Intel® HD Graphics, Iris™, and Iris™ Pro graphics enhance visual computing, do not require an add-in card, and are ideal for gaming and video, offering faster frame times, editing, and sharing, low energy use, and superior image quality."

This confuses me.

Anyway.

MWO is a processor hog. Seriously, the only other games I've ever seen with anywhere near that processor use are Arma 2 and the DayZ mod that runs on it (and the performance difference between DayZ Mod and DayZ Standalone is incredible, despite the far better, updated graphics). BF4 on ultra settings, Metro: Last Light (which slows my computer to a 16 FPS crawl on minimum settings), and Skyrim with 255 mods, most of them heavily scripted, game-changing quest mods, don't eat up the processor nearly as badly as MWO does. Of course, one main issue is that MWO still only uses a single core, and it's the same one that everything else randomly running tends to go to first.

MWO could really benefit from taking advantage of multi-core processors. I've got 4 of them. MWO runs, and I'm at 89% of the first core and almost nothing of the others. (DayZ Mod / Arma II eats up a bit over 85% in multiplayer, but it spreads from 20 to 45% across 3 cores, leaving one free. That's with a 100-player server, on a map both more detailed and significantly larger than dozens of Alpines thrown together: "225,000 km^2", a 3-hour sprint from one edge of the land to the other, not counting any of the water beyond the land or islands.)

Edited by Koniving, 18 January 2014 - 08:07 AM.


#9 mikelovskij

    Member

  • PipPipPip
  • Philanthropist
  • 60 posts

Posted 18 January 2014 - 07:55 AM

Koniving, on 18 January 2014 - 07:47 AM, said:


That's a graphics card. Was he talking about Integrated Graphics HD (not a card, part of the motherboard), or the HD graphics card? There's a huge difference between the two, and almost always the laptops have the integrated graphics.

No, the Intel HD 4600 is the new integrated graphics inside the fourth-generation Intel Core processors. It is not a graphics card.

#10 Koniving

    Welcoming Committee

  • PipPipPipPipPipPipPipPipPipPipPipPipPip
  • The Guide
  • 23,384 posts

Posted 18 January 2014 - 08:09 AM

mikelovskij, on 18 January 2014 - 07:55 AM, said:

No, the Intel HD 4600 is the new integrated graphics inside the fourth-generation Intel Core processors. It is not a graphics card.


That's what I thought. Given that the person's girlfriend's laptop suffers even on "below minimum settings", which is what the user.cfg is for, it definitely isn't a "fourth generation integrated graphics" thingy. Though I will say I'm mildly impressed with it -- I'll be happier if I see them made with the ability to work together with an add-in graphics card.

Edited by Koniving, 18 January 2014 - 08:10 AM.


#11 Napoleon_Blownapart

    Member

  • PipPipPipPipPipPipPipPip
  • Shredder
  • 1,173 posts

Posted 18 January 2014 - 08:46 AM

Just wanted to mention: don't rely on results in Testing Grounds. I can run very high settings there that I crash with in a real match...

#12 Felio

    Member

  • PipPipPipPipPipPipPipPip
  • Philanthropist
  • 1,721 posts

Posted 18 January 2014 - 12:05 PM

Koniving, on 18 January 2014 - 07:47 AM, said:

MWO could really benefit from taking advantage of multi-core processors. I've got 4 of them. MWO runs, and I'm at 89% of the first core and almost nothing of the others.


There is seriously no excuse for this. We have multicore processors in our phones now. Get with the program.

Gorantir, on 18 January 2014 - 08:46 AM, said:

Just wanted to mention: don't rely on results in Testing Grounds. I can run very high settings there that I crash with in a real match...


Yeah, performance is definitely smoother on the Testing Grounds. By about 30 FPS. Have to wonder why.

Braddack, on 17 January 2014 - 08:22 PM, said:

I just wanna remind everyone about the incoming DX11.


I'm not correcting you here, just commenting on the subject. (everyone expects everyone to be so hostile)

They said they have a functional DX11 renderer, but its performance is worse than the present DX9, so their goal is to get it to be more efficient than the DX9 before releasing it.

They also clarified it won't create any visual improvements; they are only doing it to future-proof the game for compatibility.

#13 Nick Makiaveli

    Member

  • PipPipPipPipPipPipPipPipPip
  • Bridesmaid
  • 2,188 posts
  • Location: Knee deep in mechdrek

Posted 18 January 2014 - 12:36 PM

Felio, on 18 January 2014 - 12:05 PM, said:


There is seriously no excuse for this. We have multicore processors in our phones now. Get with the program.

Yeah, performance is definitely smoother on the Testing Grounds. By about 30 FPS. Have to wonder why.


They are using an older engine. Not a matter of flipping a switch to change that.

No one else is running around on the screen? You are the only one shooting? Lots of reasons why it has an easier time in the TG.

#14 Darth Futuza

    Member

  • PipPipPipPipPipPipPipPip
  • 1,239 posts

Posted 18 January 2014 - 12:47 PM

Nick Makiaveli, on 18 January 2014 - 12:36 PM, said:


They are using an older engine. Not a matter of flipping a switch to change that.

No one else is running around on the screen? You are the only one shooting? Lots of reasons why it has an easier time in the TG.

Yeah, they'd have to switch from CryEngine to something else to really take advantage of multiple cores, which means you wouldn't get UI 2.0 in time. Also, most phone apps don't use multi-core/multi-threading programming methods; the phone just has a multi-core processor.

Edited by Darth Futuza, 18 January 2014 - 12:47 PM.


#15 Felio

    Member

  • PipPipPipPipPipPipPipPip
  • Philanthropist
  • 1,721 posts

Posted 18 January 2014 - 07:02 PM

Nick Makiaveli, on 18 January 2014 - 12:36 PM, said:


They are using an older engine. Not a matter of flipping a switch to change that.

No one else is running around on the screen? You are the only one shooting? Lots of reasons why it has an easier time in the TG.


The frame rate difference is there even before the cockpit startup. Or staring at the ground.
As for the engine... choosing that engine in the first place is inexcusable, then.

#16 Nick Makiaveli

    Member

  • PipPipPipPipPipPipPipPipPip
  • Bridesmaid
  • 2,188 posts
  • Location: Knee deep in mechdrek

Posted 19 January 2014 - 06:31 AM

Felio, on 18 January 2014 - 07:02 PM, said:


The frame rate difference is there even before the cockpit startup. Or staring at the ground.
As for the engine... choosing that engine in the first place is inexcusable, then.


In the Testing Grounds no other mechs are moving around or shooting, so there is less information coming to your computer to be processed. In a real match your client has to know where the other mechs are at all times, in case you suddenly decide to look their way. That's not present in the TG, as the mechs are not moving.

As to choice of engine, you do know they have to pay for the use of the engine right? Or pay to develop one from scratch? Also, there has to be a cut-off point somewhere, or do you want a game that can be played on smartphones? Sucks that you have a laptop that is, apparently, below the cut-off, but that's life.

The argument could also be made that the fault is yours (or the gf's) for not choosing a laptop that had an actual video card, or one that could be upgraded, etc. I'm sure that running MWO wasn't a primary factor in the decision to buy it, so that argument wouldn't be fair to make. So by the same token, why not recognize that the devs had to think about things other than whether old or weak PCs can run this game on this engine?

#17 Modo44

    Member

  • PipPipPipPipPipPipPipPipPip
  • Bad Company
  • 3,559 posts

Posted 19 January 2014 - 07:16 AM

Koniving, on 17 January 2014 - 06:47 PM, said:

Intel HD graphics... is NOT a graphics card. It's simply a video port that consumes processor power to 'try' and simulate a graphics card.

It was that. It is an actual dedicated circuit now. Still very bad except for decoding video, but physical.

#18 ApolloKaras

    Member

  • PipPipPipPipPipPipPipPip
  • Ace Of Spades
  • 1,974 posts
  • Location: Seattle, Washington

Posted 19 January 2014 - 10:02 AM

Koniving, on 18 January 2014 - 07:47 AM, said:


That's a graphics card. Was he talking about Integrated Graphics HD (not a card, part of the motherboard), or the HD graphics card? There's a huge difference between the two, and almost always the laptops have the integrated graphics.



Kon... that's an IGP test, not a vid card test. Anything on the chip or on the motherboard is considered integrated graphics.

Edited by Saxie, 19 January 2014 - 10:03 AM.


#19 Felio

    Member

  • PipPipPipPipPipPipPipPip
  • Philanthropist
  • 1,721 posts

Posted 19 January 2014 - 04:30 PM

Nick Makiaveli, on 19 January 2014 - 06:31 AM, said:


As to choice of engine, you do know they have to pay for the use of the engine right? Or pay to develop one from scratch? Also, there has to be a cut-off point somewhere, or do you want a game that can be played on smartphones? Sucks that you have a laptop that is, apparently, below the cut-off, but that's life.


I don't fault them for making a game that doesn't run well on her laptop. That would be silly. I'm talking about them using an engine that doesn't utilize multi-core processors very well.

#20 Mechteric

    Member

  • PipPipPipPipPipPipPipPipPipPip
  • Overlord
  • 7,308 posts
  • Location: RTP, NC

Posted 20 January 2014 - 07:51 AM

Felio, on 19 January 2014 - 04:30 PM, said:


I don't fault them for making a game that doesn't run well on her laptop. That would be silly. I'm talking about them using an engine that doesn't utilize multi-core processors very well.


The GPU is the single most important thing in any recent 3D game; only after you're able to deal with it from the graphics perspective does the CPU come into play. They are inherently tied together: if one can't handle the job, the other can't pick up the slack for it.




