
do not set a lowest limit to graphics


34 replies to this topic

#1 Axxon

    Member

  • PipPip
  • 35 posts

Posted 29 January 2012 - 02:55 AM

This is a graphics engine suggestion. Might not be implementable with YOURS (CryENGINE 3). Might just be. Who knows..?

Do you know the game MINECRAFT? There is hardly a computer it would not run on at a decent frame rate.

So my suggestion is:
At the "graphics level" settings, the usual options should not dominate; instead, an algorithm should constantly adjust the visuals during play WITH THE GOAL OF HITTING A CERTAIN FPS.
Which, say, would be 30. Or 60. Or whichever the user would like. Because those frame-rate drops make ANY game unplayable. The fancy visuals mean nothing if, in the heat of battle, the user suddenly gets 6 FPS. Or 1.

Or do you suggest I always buy the latest hardware? What is it nowadays.. a GTX 590 from NVIDIA, or a 6990 from ATI..? Buy it for me. I will quit talking. I promise! :) Until then, you should set NO LOWEST LIMITS in your graphics engine.

Oh, almost forgot, I would need a new power supply as well. And please, the power bill... would you cover that as well...?

Point is:
Only the frames per second matter. Everything else is irrelevant. Adjust the visuals to each PC system. IN REAL TIME.
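
To make the idea concrete, here is a minimal sketch of the kind of feedback loop I mean: measure how long each frame took, step the quality down when the frame budget is missed, step back up when there is headroom. None of this is CryENGINE code; the quality levels, the simulated render costs, and the thresholds are invented just to show the principle.

```cpp
// Minimal sketch of an FPS-targeting quality loop (hypothetical names only,
// not CryENGINE code): measure each frame, step quality down when the frame
// budget is missed, step back up when there is plenty of headroom.
#include <chrono>
#include <iostream>
#include <thread>

enum class Quality { Lowest, Low, Medium, High, Ultra };

// Stand-in for rendering: higher quality costs more milliseconds per frame.
void simulate_render(Quality q) {
    static const int cost_ms[] = {5, 10, 20, 35, 60};
    std::this_thread::sleep_for(
        std::chrono::milliseconds(cost_ms[static_cast<int>(q)]));
}

int main() {
    const double target_fps = 30.0;
    const double budget_ms  = 1000.0 / target_fps;
    Quality quality = Quality::Ultra;

    for (int frame = 0; frame < 120; ++frame) {
        auto start = std::chrono::steady_clock::now();
        simulate_render(quality);
        double frame_ms = std::chrono::duration<double, std::milli>(
                              std::chrono::steady_clock::now() - start)
                              .count();

        // Simple hysteresis so the quality does not oscillate every frame.
        if (frame_ms > budget_ms * 1.1 && quality != Quality::Lowest) {
            quality = static_cast<Quality>(static_cast<int>(quality) - 1);
        } else if (frame_ms < budget_ms * 0.6 && quality != Quality::Ultra) {
            quality = static_cast<Quality>(static_cast<int>(quality) + 1);
        }

        if (frame % 30 == 0) {
            std::cout << "frame " << frame << ": " << frame_ms
                      << " ms, quality level " << static_cast<int>(quality) << "\n";
        }
    }
}
```

A real engine would tweak individual settings (shadow resolution, draw distance, particle counts) rather than one coarse level, and would average the measurement over many frames, but the control loop is the same idea.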

#2 Maximilian Thorn

    Member

  • PipPipPipPipPip
  • 109 posts
  • Location: In the middle of a Mech battle

Posted 29 January 2012 - 04:03 AM

Your proposal would certainly address the lag/latency issues resulting from the graphic effects other threads have been asking for.

#3 John Clavell

    Member

  • PipPipPipPipPipPipPipPip
  • 1,609 posts

Posted 29 January 2012 - 04:06 AM

The graphics of Minecraft are a result of its style and the fact that it's based on voxels. It isn't built around the considerations that games like MWO need to factor in. I'd also like to point out that Minecraft did not run so smoothly during alpha. You'd get massive frame rate drops, which are still common online depending on the server. Minecraft is also a RAM hog.

MWO will cover DX9 to DX11 hardware. That is a lot of freedom and legacy support right there. DirectX 9 came out in 2002, almost a decade ago. It's impossible to build a computer game with no lowest limits. Modern computer game engines are built with a lowest limit in mind, and that limit is determined by research. CryENGINE 3 is pretty darn scalable. But what do you want to play this on, an Atari 2600?

Edited by John Clavell, 29 January 2012 - 04:07 AM.


#4 Agent CraZy DiP

    Member

  • PipPipPipPipPipPipPip
  • Legendary Founder
  • 609 posts
  • Location: AZ - USA

Posted 29 January 2012 - 04:08 AM

It's called a stress test, to benchmark your machine. I strongly recommend you run one. Most (if not all) modern-day video software has a stress test feature somewhere in its settings.

#5 Axxon

    Member

  • PipPip
  • 35 posts

Posted 29 January 2012 - 04:29 AM

@3
Atari? No, I don't have that kind of advanced machine. A C64 only. With a Datasette. >)

I have a GeForce 6200. Hey! It is silent :) and low-power. I have a bigger graphics cannon too, but that's noisy. I don't like that.
But would you believe my CPU is a 2 GHz mobile Core 2 Duo..?


In strategy games, when you ZOOM OUT, the VERTEX count of every model visible on screen is REDUCED so the computer does not go down in glorious flames trying to handle the software's graphical needs, i.e. the models are suddenly replaced by lower-poly-count ones. Hey, from that far you can't see the difference anyway.. :P

So, you could do that up close too, to keep up visual smoothness: ~30 FPS! But I knew this was an engine-specific suggestion... and the team does not develop its own engine.
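
Roughly, the LOD swap plus the FPS-watching idea from my first post could combine like this. The distances, LOD steps, and mech names below are made up for illustration and are not from any real engine:

```cpp
// Rough sketch of distance-based LOD selection, with an extra bias the
// FPS monitor could raise when frames get slow. Distances, LOD steps and
// names are invented for illustration; this is not from any real engine.
#include <algorithm>
#include <iostream>
#include <vector>

struct Model {
    const char* name;
    float distance_m;  // distance to camera in metres
};

// 0 = full detail, 3 = cheapest mesh.
int select_lod(float distance_m, int fps_bias) {
    int lod = 0;
    if (distance_m > 50.0f)  lod = 1;
    if (distance_m > 150.0f) lod = 2;
    if (distance_m > 400.0f) lod = 3;
    return std::min(lod + fps_bias, 3);  // bias pushes everything cheaper
}

int main() {
    std::vector<Model> scene = {{"Atlas", 30.0f},
                                {"Jenner", 120.0f},
                                {"Catapult", 500.0f}};

    int fps_bias = 1;  // pretend the frame-rate monitor says we're below target

    for (const auto& m : scene) {
        std::cout << m.name << " at " << m.distance_m << " m -> LOD "
                  << select_lod(m.distance_m, fps_bias) << "\n";
    }
}
```

The fps_bias is where the real-time monitor would plug in: when frames get slow, everything is pushed one LOD step cheaper, even up close.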

#6 Fiachdubh

    Member

  • PipPipPipPipPipPipPip
  • 971 posts
  • Location: Skulking out along the Periphery somewhere.

Posted 29 January 2012 - 05:03 AM

Not holding out much hope, but being able to play with a GeForce Go 7600 and 256 MB would be nice.

Edited by Fiachdubh, 29 January 2012 - 05:03 AM.


#7 Axxon

    Member

  • PipPip
  • 35 posts

Posted 29 January 2012 - 06:24 AM

You know, MW2, the non-3dfx DOS version, had wonderful graphics:
all the colors.. the fog/visibility, the crystals on the ground, objects, buildings in cities, the skies..

All in SVGA. Not even DirectX 1. Ha-ha. Because in simplicity there is greatness!

THAT IS what today's DirectX-crazy software houses forget.

But the number one feature in MW2 was the robot announcer. The female voice that was so calm and peaceful even in the hottest moments of a battle, when the mech WAS REALLY about to fall apart under our A$$es.. ahh, the good old days... :)
MW3's male announcer was nervy.

I would recommend that you bring back the old MW announcers. And add an option for how fast the SYSTEM MESSAGES are played (a voice playback speed option, increased by a player-set %). At a significant rate increase a helium-inhale effect will take place, but the voice reports will be FAST and up-to-the-second. That is needed in battle!

#8 Sug

    Member

  • PipPipPipPipPipPipPipPipPip
  • The People's Hero
  • 4,630 posts
  • Location: Chicago

Posted 29 January 2012 - 08:24 AM

Maximilian Thorn, on 29 January 2012 - 04:03 AM, said:

Your proposal would certainly address the lag/latency issues resulting from the graphic effects other threads have been asking for.


How do your graphics settings affect your lag/latency?

#9 Axxon

    Member

  • PipPip
  • 35 posts

Posted 29 January 2012 - 09:58 AM

Sug, on 29 January 2012 - 08:24 AM, said:


How do your graphics settings affect your lag/latency?


The end-user's computer failing to draw. The *must buy a new VGA/CPU* problem. GRAPHICAL lag.
It's still "lag".

#10 Sug

    Member

  • PipPipPipPipPipPipPipPipPip
  • The People's Hero
  • 4,630 posts
  • Location: Chicago

Posted 29 January 2012 - 10:14 AM

Ah. I've always heard that referred to as choppy or chugging graphics. Yeah, "lag".

#11 Durant Carlyle

    Member

  • PipPipPipPipPipPipPipPipPip
  • Survivor
  • 3,877 posts
  • Location: Close enough to poke you with a stick.

Posted 29 January 2012 - 11:09 AM

I certainly hope that they don't do as the OP suggested. I want my Super-Ultra graphics settings, darn it. I play at 1920x1080 resolution, and I like to have all graphics settings at their highest option. If my system cannot handle it then I reduce the least important settings one step at a time until it can, and then I reserve some of my earnings for a new graphics card that can handle the highest settings.

I don't want the game deciding what the graphics look like on my machine. That's my choice.

I do the same thing for EVE Online. Most people turn off a lot of the graphics settings and the sound and such to get the maximum out of it during fleet fights. I don't. Every setting is on its highest option. I prefer to see the turret effects and explosions and stuff, and I like the sound effects too. I do turn off brackets for big fights, though.

#12 Harrow

    Member

  • PipPipPipPipPip
  • 190 posts

Posted 29 January 2012 - 11:25 AM

Sug, on 29 January 2012 - 08:24 AM, said:


How do your graphics settings affect your lag/latency?


The biggest culprit is 'skipping' or 'rubber banding', as it's called in MMOs. It becomes especially noticeable in player-versus-player games, where your machine can't draw the graphics as fast as the game server is updating your client. So imagine an enemy mech comes into your field of fire on your right and you begin to prepare to fire, maybe you even do so, but suddenly the enemy mech is leaving your field of fire on your left and you never saw the movement in between, as if he disappeared on one side and reappeared on the other. Now your first instinct will be to instantly call hacks, flame the guy on the forums, and refuse to play against him or his group ever again. The reality is your machine is just slow, and it's really your fault :-P

#13 Axxon

    Member

  • PipPip
  • 35 posts

Posted 29 January 2012 - 11:51 AM

@12
I suggested an -OPTIONAL- AI that monitors graphical load. Turn it off and you have the legacy settings options.

And I suggested intervening only at the LOWEST end of the settings. I don't care about the highest.

And to you I suggest: read before you reply!

#14 Sug

    Member

  • PipPipPipPipPipPipPipPipPip
  • The People's Hero
  • 4,630 posts
  • Location: Chicago

Posted 29 January 2012 - 11:55 AM

Harrow, on 29 January 2012 - 11:25 AM, said:

The biggest culprit is 'skipping' or 'rubber banding', as it's called in MMOs. It becomes especially noticeable in player-versus-player games, where your machine can't draw the graphics as fast as the game server is updating your client. So imagine an enemy mech comes into your field of fire on your right and you begin to prepare to fire, maybe you even do so, but suddenly the enemy mech is leaving your field of fire on your left and you never saw the movement in between, as if he disappeared on one side and reappeared on the other. Now your first instinct will be to instantly call hacks, flame the guy on the forums, and refuse to play against him or his group ever again. The reality is your machine is just slow, and it's really your fault :-P



I disagree, Harrow. If you're seeing people zip around the screen or jump-cut from one spot to another (or run in place), that's lag/latency. It's caused by a slow connection somewhere between them, the server, and you.

If the whole screen starts chugging or running choppy, that's your graphics card/PC not being able to draw the increased detail fast enough.

Your graphics settings don't affect your lag/latency; the computer is not sending or receiving anti-aliasing or anisotropic filtering information to or from your/their PC.
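
A quick way to see the difference: the client can measure the two things separately. Frame time is how long your own machine takes to draw a frame; ping is how long a round trip to the server takes. A tiny sketch, with made-up thresholds and a hypothetical diagnose() helper, not taken from any real game:

```cpp
// Tiny sketch of the distinction: frame time (how long YOUR machine takes to
// draw) and round-trip time to the server are separate measurements, and only
// the second one is "lag" in the network sense. Thresholds and the diagnose()
// helper are made up for illustration.
#include <iostream>
#include <string>

std::string diagnose(double frame_ms, double ping_ms) {
    const bool choppy = frame_ms > 33.3;   // under ~30 FPS: local hardware issue
    const bool laggy  = ping_ms  > 150.0;  // high RTT: connection issue
    if (choppy && laggy) return "slow machine AND slow connection";
    if (choppy)          return "choppy/chugging graphics (local hardware)";
    if (laggy)           return "lag/latency (network)";
    return "fine";
}

int main() {
    std::cout << diagnose(80.0, 40.0)  << "\n";  // slow frames, good ping
    std::cout << diagnose(12.0, 300.0) << "\n";  // fast frames, bad ping
}
```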

Edited by Sug, 29 January 2012 - 11:59 AM.


#15 Treffies

    Member

  • PipPip
  • 49 posts

Posted 29 January 2012 - 12:15 PM

Most modern games try to accommodate hardware from the past three years. If people are using 7-year-old hardware, it's time for an upgrade. You can build a pretty good computer for around $600.

#16 Harrow

    Member

  • PipPipPipPipPip
  • 190 posts

Posted 29 January 2012 - 12:19 PM

Sug, on 29 January 2012 - 11:55 AM, said:

I disagree, Harrow. If you're seeing people zip around the screen or jump-cut from one spot to another (or run in place), that's lag/latency. It's caused by a slow connection somewhere between them, the server, and you.

If the whole screen starts chugging or running choppy, that's your graphics card/PC not being able to draw the increased detail fast enough.

Your graphics settings don't affect your lag/latency; the computer is not sending or receiving anti-aliasing or anisotropic filtering information to or from your/their PC.


Yes, but the increased CPU load caused by these games when your graphics card isn't up to snuff can still affect how fast you process the network information that has to come to the CPU in a traditional Ethernet stack. So unless you are running one of those fancy gamer NICs, your CPU/GPU load can have a cascading lag effect. You can have a great connection and still have this issue, because your CPU/GPU is bottlenecking your computer's ability to even process the information being sent from server to client over that great connection. You normally don't notice this if you have a great rig, but not everyone does. And to address your point about anti-aliasing or anisotropic data not being sent over the connection: the data being sent tells your client 'where' to draw, not 'how' to draw. So if your CPU/GPU is still busy rendering the information it received 30 ms ago, it will lead to the appearance of 'server' or 'net' lag when in fact it's PC computational lag.
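
For what it's worth, here is a toy sketch of the cascading effect I'm describing: if server updates are drained once per frame on the main thread, a single slow frame means several updates get applied at once and the enemy appears to teleport, even though the connection delivered every packet on time. Everything in it (the Packet struct, the queue contents, the per-frame drain counts) is invented for illustration, not taken from any real engine.

```cpp
// Toy sketch of the cascading effect: server updates are drained once per
// frame on the main thread, so one slow frame lets several updates pile up
// and they get applied all at once -- the enemy appears to teleport even
// though the connection delivered every packet on time. The Packet struct,
// queue contents and drain counts are all invented for illustration.
#include <deque>
#include <iostream>

struct Packet { double enemy_x; };  // one position update from the server

int main() {
    // Pretend the server sent ten updates, 5 metres apart.
    std::deque<Packet> incoming;
    for (int i = 0; i < 10; ++i) incoming.push_back({i * 5.0});

    double shown_x = 0.0;
    // Frame 0 runs on time and drains one packet; frame 1 stalled on rendering,
    // so six packets had queued up by the time it finally ran.
    const int drained_per_frame[] = {1, 6};

    for (int frame = 0; frame < 2; ++frame) {
        for (int i = 0; i < drained_per_frame[frame] && !incoming.empty(); ++i) {
            shown_x = incoming.front().enemy_x;  // only the last one is ever drawn
            incoming.pop_front();
        }
        std::cout << "frame " << frame << ": enemy drawn at x = " << shown_x << "\n";
    }
    // Prints x = 0 then x = 30: a 30 m jump in one visible frame, with zero
    // network latency involved.
}
```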

#17 Harrow

    Member

  • PipPipPipPipPip
  • 190 posts

Posted 29 January 2012 - 12:26 PM

Treffies, on 29 January 2012 - 12:15 PM, said:

Most modern games try to accommodate hardware from the past three years. If people are using 7-year-old hardware, it's time for an upgrade. You can build a pretty good computer for around $600.


After playing SWTOR, I feel that it's not about how new the equipment is (within reason) but the quality of the components. And not just in that game. I've seen people upgrading from mid-range components to enthusiast/gamer-level components just to have a smooth and enjoyable gaming experience. You used to be able to survive on just mid-range gear, but that's rapidly ceasing to be the case. Having said that, you can still 'upgrade' to a strong computer for around $600. I recently upgraded mine, and since I already had a 1200-watt power supply, full tower case, great graphics card, etc., all I had to buy was the CPU, motherboard, and memory. I got an AMD 8-core, an ASUS Sabertooth mobo, and 16 GB of RAM for less than $600.

#18 Sug

    Member

  • PipPipPipPipPipPipPipPipPip
  • The People's Hero
  • 4,630 posts
  • Location: Chicago

Posted 29 January 2012 - 12:51 PM

Harrow, on 29 January 2012 - 12:19 PM, said:

Yes, but the increased CPU load caused by these games when your graphics card isn't up to snuff can still affect how fast you process the network information that has to come to the CPU in a traditional Ethernet stack. So unless you are running one of those fancy gamer NICs, your CPU/GPU load can have a cascading lag effect. You can have a great connection and still have this issue, because your CPU/GPU is bottlenecking your computer's ability to even process the information being sent from server to client over that great connection. You normally don't notice this if you have a great rig, but not everyone does. And to address your point about anti-aliasing or anisotropic data not being sent over the connection: the data being sent tells your client 'where' to draw, not 'how' to draw. So if your CPU/GPU is still busy rendering the information it received 30 ms ago, it will lead to the appearance of 'server' or 'net' lag when in fact it's PC computational lag.


Again, I disagree, Harrow. For network information to "overload" your computer's ability to process data and graphical information, your computer would have to be so old it couldn't even install games made in the last decade.

Every gamer network card review I've read has shown an improvement of 5% at most. That's why not too many people have them. $100 for a couple more kbps isn't worth it. Unless you're running Netflix and a BitTorrent client while you play online games, you probably won't notice any difference.

As far as I know, your graphics card being overwhelmed does not affect your CPU. They process entirely different information. When you want to benchmark your CPU, you set your graphics and resolution as low as possible. When testing just the GPU, you set everything high. Your CPU doesn't take over or pick up the slack of your GPU, and vice versa.

And I'm sorry, but I don't agree that a slow computer causes the appearance of internet lag/latency. I mean, at least I can tell the difference.

I know you probably just threw this number out there, but 30 ms is probably unnoticeable by human beings. 1 ms is a thousandth of a second. Human reaction time is around 200 ms.

Again, my information is probably out of date, as I haven't been "hardcore" about PC hardware in about two years, but I still try to keep up with new hardware and benchmarks.

#19 Axxon

    Member

  • PipPip
  • 35 posts

Posted 29 January 2012 - 12:52 PM

Treffies, on 29 January 2012 - 12:15 PM, said:

Most modern games try to accommodate hardware from the past three years. If people are using 7-year-old hardware, it's time for an upgrade. You can build a pretty good computer for around $600.


Ever heard of "this is the fastest...", "the world's first...", and so on, stupid promises..? I have purchased so many *new* computer part *this* and *that* things... that I now think: f. all that. I don't tell you what to spend your money on, but me, I am really getting tired..
Hey! Maybe I won't play at all. I'll just watch you guys playing on YouTube :)

The final droplet in the goblet was when they introduced PCI-E. None of my previous VGA cards even fit into it!
I hate this rush. Into nothingness!

#20 Harrow

    Member

  • PipPipPipPipPip
  • 190 posts

Posted 29 January 2012 - 01:24 PM

Sug, on 29 January 2012 - 12:51 PM, said:

Again, I disagree, Harrow. For network information to "overload" your computer's ability to process data and graphical information, your computer would have to be so old it couldn't even install games made in the last decade.

Every gamer network card review I've read has shown an improvement of 5% at most. That's why not too many people have them. $100 for a couple more kbps isn't worth it. Unless you're running Netflix and a BitTorrent client while you play online games, you probably won't notice any difference.

As far as I know, your graphics card being overwhelmed does not affect your CPU. They process entirely different information. When you want to benchmark your CPU, you set your graphics and resolution as low as possible. When testing just the GPU, you set everything high. Your CPU doesn't take over or pick up the slack of your GPU, and vice versa.

And I'm sorry, but I don't agree that a slow computer causes the appearance of internet lag/latency. I mean, at least I can tell the difference.

I know you probably just threw this number out there, but 30 ms is probably unnoticeable by human beings. 1 ms is a thousandth of a second. Human reaction time is around 200 ms.

Again, my information is probably out of date, as I haven't been "hardcore" about PC hardware in about two years, but I still try to keep up with new hardware and benchmarks.


I think you misread my post, or I didn't articulate my thoughts properly. What I am saying is that you can have a situation where your computer is unable to graphically represent the real-time data it's being given, and you thus get the appearance of network lag. It can happen with an underpowered CPU or GPU, or both. Online games like MMORPGs and online FPS games use more of a computer's components at once than any other type of computer game. The point is that what someone may look at and think is network lag might not be network lag.

I know the numbers on discrete cards as well, and I agree. That wasn't my point either. The point is that discrete network cards bypass the traditional Ethernet stack (if you don't know what I mean by this, then ask), so on some computers (older than, say, 3 or 4 years) they do have a significant impact on FPS/lag issues, because they offload more work from the CPU. The reason the impact isn't felt as much on newer computers is the effectiveness of new northbridge/southbridge chipsets, which offload more work from the CPU.

Which brings us to the GPU. Yes, most of the graphics information is processed on the GPU to offload even more work from the CPU, but the CPU is still involved. And benchmarks are targeted: they only look at certain types of data processing on certain types of components. They are a good indication of how a particular component is going to perform, yes. But games don't behave that way. They utilize multiple components of a computer, with the CPU orchestrating all the activity in an orderly fashion so that we get an awesome game to play.

You are welcome to continue in your disagreement. I have experienced this issue first hand.

Yes, I just threw 30 ms out there, and I am aware of the average human reaction time. Actually, isn't the average goalie in hockey capable of 300-400 ms reaction times against 60 to 90 mph puck shots?

It's not really about being 'hardcore'. The point is, if you have a weak link among your components, you can get issues that appear to be one thing when they aren't. Thankfully, decent components are getting cheaper every day.




