

User.cfg For Low-End Computers Doesn't Help Fps?
#1
Posted 17 January 2014 - 06:34 PM
I'm using the "Lowest Pro" config I found here: http://mwomercs.com/...53#entry1446553
It might be worth mentioning that my laptop has a dedicated Nvidia card, while hers has only Intel HD Graphics. I don't think it should make a difference, but maybe it does.
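For anyone unfamiliar with how these configs work: a user.cfg is just a plain-text list of CryEngine console variables, one per line, dropped into the game folder. A minimal sketch of the kind of settings a "lowest" config turns down; the cvar names below are common CryEngine ones, and the actual "Lowest Pro" file linked above will differ:

```
-- user.cfg: plain-text CryEngine console variables, one per line.
-- Illustrative values only; the linked "Lowest Pro" config will differ.
r_VSync = 0             -- don't wait on the monitor's refresh
e_Shadows = 0           -- skip shadow rendering entirely
e_ViewDistRatio = 20    -- cull distant objects much sooner
r_TexturesStreaming = 1 -- stream textures in rather than preloading them
```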
#2
Posted 17 January 2014 - 06:47 PM
Given that MWO is boatloads more processor intensive than most high end modern games, a video card emulation that eats up processor power only makes it even worse.
She needs a real video card. (A graphics card that used to be high end in 1998 would have better performance. Yes, it really would.)
Edited by Koniving, 17 January 2014 - 06:49 PM.
#3
Posted 17 January 2014 - 07:01 PM

I know it's not a graphics card; it just seems like telling the game to do less work should reduce the load on whatever you are using. Unless Intel HD Graphics just isn't doing the biggest things the user.cfg file reduces in the first place.
Edited by Felio, 17 January 2014 - 07:02 PM.
#4
Posted 17 January 2014 - 07:14 PM
MWO, in terms of processor load, rivals it. Celerons and other non-high-end processors can barely run the game even with great video cards and copious amounts of RAM.
But with something that requires some of your processor's power just to do the graphics... sometimes it's a miracle it works at all.
       OS          CPU                       RAM   VIDEO               HD    DX
MIN:   WinXP SP3   2 Duo 2.66GHz / X2 245e   4GB   8800GT / HD 5600    8GB   9
BEST:  Windows 7   i3-2500 / X4 650          8GB   GTX 285 / HD 5830   8GB   9
Mine:  Windows 7   Phenom II X4 3.60GHz      16GB  HD 6850             Lots  11
(The forum doesn't seem to like keeping the spaces/formatting.)
Edited by Koniving, 17 January 2014 - 07:24 PM.
#5
Posted 17 January 2014 - 08:22 PM
#6
Posted 17 January 2014 - 10:12 PM
Playing again without changing anything, I'm back at my usual 17.
I've seen mention of setting OptionCfg.sys_spec to 0 in the attributes.xml file to enable a custom user.cfg, but I don't see that variable in the file. Maybe they took it out and it is no longer necessary? (Edit: actually I found it, trying that...)
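For anyone else hunting for that variable: attributes.xml is an XML file, and the sys_spec entry, when present, is a single node inside it. The element shape below is only a guess from the variable name; match whatever layout your copy of the file actually uses:

```xml
<!-- Hypothetical shape; check your own attributes.xml for the real layout. -->
<Attr name="OptionCfg.sys_spec" value="0"/>
```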
Edited by Felio, 17 January 2014 - 10:39 PM.
#7
Posted 17 January 2014 - 10:42 PM
Koniving, on 17 January 2014 - 06:47 PM, said:
Given that MWO is boatloads more processor intensive than most high end modern games, a video card emulation that eats up processor power only makes it even worse.
She needs a real video card. (A graphics card that used to be high end in 1998 would have better performance. Yes, it really would.)
That's not quite accurate any longer. Even Intel has come a long way with the HD series.
#8
Posted 18 January 2014 - 07:47 AM
Saxie, on 17 January 2014 - 10:42 PM, said:
That's not quite accurate any longer. Even Intel has come a long way with the HD series.
That's a graphics card. Was he talking about Integrated Graphics HD (not a card, part of the motherboard), or the HD graphics card? There's a huge difference between the two, and almost always the laptops have the integrated graphics.
"Intel® HD Graphics, Iris™, and Iris™ Pro graphics enhance visual computing, do not require an add-in card, and are ideal for gaming and video, offering faster frame times, editing, and sharing, low energy use, and superior image quality."
This confuses me.
Anyway.
MWO is a processor hog. Seriously, the only other games I've ever seen with anywhere near that processor use are Arma 2 and the DayZ Mod that runs on it. (And the performance difference between DayZ Mod and DayZ Standalone is incredible, despite the far better, updated graphics.) BF4 on ultra settings, Metro: Last Light (which slows my computer to a 16 FPS crawl on minimum settings), and Skyrim with 255 mods, most of them heavily scripted, game-changing quest mods, don't eat up the processor nearly as badly as MWO does. Of course, one main issue is that MWO still only uses a single core, and it's the same one that everything else running in the background tends to go to first.
MWO could really benefit from taking advantage of multi-core processors. I've got four of them; when MWO runs, I'm at 89% of the first core and almost nothing on the others. (DayZ Mod / Arma II eats up a bit over 85% in multiplayer, but it spreads 20 to 45% across three cores, leaving one free. That's with a 100-player server, on a map both more detailed and significantly larger than dozens of Alpines thrown together: "225,000 km^2", a 3-hour sprint from one edge of the land to the other, not counting any of the water beyond the land or islands.)
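The single-core bottleneck described above is easy to demonstrate in miniature. This is not MWO code, just a hedged Python sketch of why spreading the same work across cores helps: the same four chunks of busy-work finish noticeably faster through a process pool on a four-core machine.

```python
import multiprocessing as mp
import time

def burn(n):
    # Busy-work standing in for one frame's worth of game logic (AI, physics).
    total = 0
    for i in range(n):
        total += i * i
    return total

def run_single(chunks):
    # Everything on one core, like a single-threaded engine loop.
    return [burn(c) for c in chunks]

def run_parallel(chunks):
    # The same work spread across a pool of worker processes.
    with mp.Pool() as pool:
        return pool.map(burn, chunks)

if __name__ == "__main__":
    chunks = [2_000_000] * 4
    t0 = time.perf_counter()
    a = run_single(chunks)
    t1 = time.perf_counter()
    b = run_parallel(chunks)
    t2 = time.perf_counter()
    assert a == b  # same results either way; only the wall time differs
    print(f"single core: {t1 - t0:.2f}s  pooled: {t2 - t1:.2f}s")
```

The catch, as discussed later in the thread, is that retrofitting this kind of parallelism into an existing engine is far harder than writing it fresh.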
Edited by Koniving, 18 January 2014 - 08:07 AM.
#9
Posted 18 January 2014 - 07:55 AM
Koniving, on 18 January 2014 - 07:47 AM, said:
That's a graphics card. Was he talking about Integrated Graphics HD (not a card, part of the motherboard), or the HD graphics card? There's a huge difference between the two, and almost always the laptops have the integrated graphics.
No, the Intel HD 4600 is the new integrated graphics inside the fourth-generation Intel Core processors. It is not a graphics card.
#10
Posted 18 January 2014 - 08:09 AM
mikelovskij, on 18 January 2014 - 07:55 AM, said:
That's what I thought. Given that the person's girlfriend's laptop suffers even on the "below minimum settings" the user.cfg is for, it definitely isn't a "fourth generation integrated graphics" thingy. Though I will say I'm mildly impressed with it; I'll be happier if I see them made with the ability to work together with an add-in graphics card.
Edited by Koniving, 18 January 2014 - 08:10 AM.
#11
Posted 18 January 2014 - 08:46 AM
#12
Posted 18 January 2014 - 12:05 PM
Koniving, on 18 January 2014 - 07:47 AM, said:
There is seriously no excuse for this. We have multicore processors in our phones now. Get with the program.
Gorantir, on 18 January 2014 - 08:46 AM, said:
Yeah, performance is definitely smoother on the Testing Grounds. By about 30 FPS. Have to wonder why.
Braddack, on 17 January 2014 - 08:22 PM, said:
I'm not correcting you here, just commenting on the subject. (everyone expects everyone to be so hostile)
They said they have a functional DX11 renderer, but its performance is worse than the present DX9, so their goal is to get it to be more efficient than the DX9 before releasing it.
They also clarified it won't bring any visual improvements; they are only doing it to future-proof the game for compatibility.
#13
Posted 18 January 2014 - 12:36 PM
Felio, on 18 January 2014 - 12:05 PM, said:
There is seriously no excuse for this. We have multicore processors in our phones now. Get with the program.
Yeah, performance is definitely smoother on the Testing Grounds. By about 30 FPS. Have to wonder why.
They are using an older engine. Not a matter of flipping a switch to change that.
No one else is running around on the screen? You are the only one shooting? Lots of reasons why it has an easier time in the TG.
#14
Posted 18 January 2014 - 12:47 PM
Nick Makiaveli, on 18 January 2014 - 12:36 PM, said:
They are using an older engine. Not a matter of flipping a switch to change that.
No one else is running around on the screen? You are the only one shooting? Lots of reasons why it has an easier time in the TG.
Yeah, they'd have to switch from CryEngine to something else to really take advantage of multiple cores, which means you wouldn't get UI 2.0 in time. Also, most phone apps don't use multi-core/multi-threaded programming methods; the phone just has multi-core processors.
Edited by Darth Futuza, 18 January 2014 - 12:47 PM.
#15
Posted 18 January 2014 - 07:02 PM
Nick Makiaveli, on 18 January 2014 - 12:36 PM, said:
They are using an older engine. Not a matter of flipping a switch to change that.
No one else is running around on the screen? You are the only one shooting? Lots of reasons why it has an easier time in the TG.
The frame rate difference is there even before the cockpit startup. Or staring at the ground.
As for the engine... choosing that engine in the first place is inexcusable, then.
#16
Posted 19 January 2014 - 06:31 AM
Felio, on 18 January 2014 - 07:02 PM, said:
The frame rate difference is there even before the cockpit startup. Or staring at the ground.
As for the engine... choosing that engine in the first place is inexcusable, then.
In the Testing Grounds no other mechs are moving around or shooting, thus less information coming to your computer to be processed. In a match, your client has to know where they are at all times in case you suddenly decide to look their way. That's not present in the TG, since the mechs are not moving.
As to choice of engine, you do know they have to pay for the use of the engine right? Or pay to develop one from scratch? Also, there has to be a cut-off point somewhere, or do you want a game that can be played on smartphones? Sucks that you have a laptop that is, apparently, below the cut-off, but that's life.
The argument could also be made that the fault is yours (or the gf's) for not choosing a laptop that had an actual video card or one that could be upgraded etc. I'm sure that running MWO wasn't a primary factor in the decision to buy it, so that argument wouldn't be fair to make. So by the same token, why not recognize that the devs had to think about things other than can old or weak PCs run this game on this engine?
#17
Posted 19 January 2014 - 07:16 AM
Koniving, on 17 January 2014 - 06:47 PM, said:
It was that. It is an actual dedicated circuit now. Still very bad except for decoding video, but physical.
#18
Posted 19 January 2014 - 10:02 AM
Koniving, on 18 January 2014 - 07:47 AM, said:
That's a graphics card. Was he talking about Integrated Graphics HD (not a card, part of the motherboard), or the HD graphics card? There's a huge difference between the two, and almost always the laptops have the integrated graphics.
Kon... that's an IGP test, not a video card test. Anything on the chip or on the motherboard is considered integrated graphics.
Edited by Saxie, 19 January 2014 - 10:03 AM.
#19
Posted 19 January 2014 - 04:30 PM
Nick Makiaveli, on 19 January 2014 - 06:31 AM, said:
As to choice of engine, you do know they have to pay for the use of the engine right? Or pay to develop one from scratch? Also, there has to be a cut-off point somewhere, or do you want a game that can be played on smartphones? Sucks that you have a laptop that is, apparently, below the cut-off, but that's life.
I don't fault them for making a game that doesn't run well on her laptop. That would be silly. I'm talking about them using an engine that doesn't utilize multi-core processors very well.
#20
Posted 20 January 2014 - 07:51 AM
Felio, on 19 January 2014 - 04:30 PM, said:
I don't fault them for making a game that doesn't run well on her laptop. That would be silly. I'm talking about them using an engine that doesn't utilize multi-core processors very well.
The GPU is the single most important thing in any recent 3D game; only after it can handle the graphics does the CPU come into play. They are inherently tied together: if one can't handle the job, the other can't pick up the slack for it.