
MWO in 3D?


13 replies to this topic

#1 nimrodusmaximus

    Member

  • PipPipPipPipPip
  • 155 posts

Posted 27 July 2012 - 09:22 AM

Does anyone know if MWO supports 3D? I'm using an NVidia 680.

If so, any thoughts on the best 27" 3D monitor (I have the impression that the Acer is considered the "best" for Nvidia)? Or should I wait until the next generation of monitors?

Thanks!

#2 BFett

    Member

  • PipPipPipPipPipPipPip
  • 751 posts
  • LocationA galaxy far far away...

Posted 27 July 2012 - 09:31 AM

Yes, it will be supported. http://mwomercs.com/...__fromsearch__1

#3 nimrodusmaximus

    Member

  • PipPipPipPipPip
  • 155 posts

Posted 27 July 2012 - 09:40 AM

Awesome. Time to go to Fry's ;)

#4 MechRaccoon

    Member

  • PipPipPipPipPipPip
  • Knight Errant
  • Knight Errant
  • 312 posts
  • LocationIn a dumpster. A walking, nuclear powered, space dumpster with lasers on it.

Posted 27 July 2012 - 10:03 AM

3D? Really?
http://www.youtube.c...1&v=umDr0mPuyQc

#5 Dimestore

    Member

  • PipPipPipPipPipPip
  • Overlord
  • Overlord
  • 302 posts
  • LocationVancouver (Pacific Standard Time Zone)

Posted 27 July 2012 - 10:11 AM

I checked with Paul on this and posted the results in a different thread. At the time (two weeks ago or so) there were problems with the HUD, and fixing 3D wasn't a top priority, but they do have four test rigs set up to support 3D, so it seemed to be mostly a matter of time.

I haven't tried turning on my 3D to see how bad it is, but apparently some of the HUD elements become unreadable and misplaced.

#6 nimrodusmaximus

    Member

  • PipPipPipPipPip
  • 155 posts

Posted 27 July 2012 - 01:41 PM

Just got back from Fry's and have been playing with the 3D on my Acer 27".

Anything special I need to do to get 3D working for MWO? Or do you just boot it up and it starts running?

Thanks!

#7 silentD11

    Member

  • PipPipPipPipPipPipPip
  • Legendary Founder
  • Legendary Founder
  • 816 posts
  • LocationWashington DC

Posted 27 July 2012 - 03:19 PM

CryEngine 3 supports 3D, but two more things need to happen: the devs need to make sure it's working properly, and Nvidia needs to create a profile for it and fix up their drivers. That won't happen until the game is released.

Also, just an FYI: running things in 3D mode cuts your frame rate in half. You should be fine with a 680, but keep it in mind. Every now and then you see some moron try to run 3D over three monitors.

#8 nimrodusmaximus

    Member

  • PipPipPipPipPip
  • 155 posts

Posted 28 July 2012 - 05:02 PM

Eh. Let's just say I'm toying with getting a 690 to SLI with my 680...

Edit: you cannot, apparently, SLI a 690 with a 680. So what do you do with the 680 if you want a 690?

Edited by coolname, 28 July 2012 - 05:39 PM.


#9 LordDread

    Member

  • PipPipPipPipPipPip
  • Legendary Founder
  • Legendary Founder
  • 212 posts
  • LocationMelbourne Australia

Posted 28 July 2012 - 06:20 PM

Install the 690 and put the 680 in a box :rolleyes: .. or eBay it :rolleyes:

#10 Ouster

    Member

  • PipPip
  • 47 posts
  • LocationSilicon Valley

Posted 28 July 2012 - 08:07 PM

Wait, a 690 and a 680 should just be 3-way SLI with 680s. Why would that not be supported? I think both Nvidia and AMD support up to four cards in general; there may be some exceptions, but that was my understanding.

I'm still hoping the game supports AMD HD3D. If that were implemented, it would mean no driver-side hack to enable/disable the game running in 3D. Once those come into play you always lose, because there will always be some 3D imperfection no matter how much the driver tries to correct for it. The way the game renders has to be constructed right or you always have some artifacts.

Edited by Ouster, 28 July 2012 - 10:54 PM.


#11 silentD11

    Member

  • PipPipPipPipPipPipPip
  • Legendary Founder
  • Legendary Founder
  • 816 posts
  • LocationWashington DC

Posted 29 July 2012 - 10:17 PM

View PostOuster, on 28 July 2012 - 08:07 PM, said:

Wait, a 690 and a 680 should just be 3-way SLI with 680s. Why would that not be supported? I think both Nvidia and AMD support up to four cards in general; there may be some exceptions, but that was my understanding.

I'm still hoping the game supports AMD HD3D. If that were implemented, it would mean no driver-side hack to enable/disable the game running in 3D. Once those come into play you always lose, because there will always be some 3D imperfection no matter how much the driver tries to correct for it. The way the game renders has to be constructed right or you always have some artifacts.



CrossFire/SLI = two of the same card, or one dual-GPU card that does it internally
TriFire/tri-SLI = three of the same card
QuadFire/quad-SLI = four of the same card, or two dual-GPU cards

For the most part those rules cannot be broken unless you're running a Lucid chip on your board, or some sort of hybrid CrossFire with an AMD IGP platform.

Also keep in mind a crap ton of chipsets don't perform fully in SLI or CrossFire when using two PCIe slots, because they have to cut them down. That's one of the reasons 1366 and 2011 sockets cost so much.
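Those pairing rules boil down to a simple check: same card model across the board, two to four GPUs total, with a dual-GPU card like the 690 counting as two. A hypothetical Python sketch (the function name and card tuples are made up for illustration):

```python
def valid_sli(cards):
    """cards: list of (card_model, gpus_on_card) tuples.

    SLI/CrossFire as described above: every card must be the same
    model (the driver locks out mixed models), and the total GPU
    count must land between 2 (plain SLI) and 4 (quad-SLI).
    """
    models = {model for model, _ in cards}
    total_gpus = sum(gpus for _, gpus in cards)
    return len(models) == 1 and 2 <= total_gpus <= 4

print(valid_sli([("GTX 680", 1), ("GTX 680", 1)]))  # two 680s: True
print(valid_sli([("GTX 690", 2)]))                  # one dual-GPU 690: True
print(valid_sli([("GTX 690", 2), ("GTX 680", 1)]))  # mixed models: False
```

Under these rules a 690 plus a 680 fails the same-model check even though the GPUs inside are related, which matches the driver lockout described above.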

#12 Pershaw

    Member

  • PipPipPipPipPip
  • 139 posts

Posted 30 July 2012 - 05:39 AM

Think a 3 GB GTX 590 will be enough for 3D?

#13 silentD11

    Member

  • PipPipPipPipPipPipPip
  • Legendary Founder
  • Legendary Founder
  • 816 posts
  • LocationWashington DC

Posted 30 July 2012 - 09:26 AM

View PostPershaw, on 30 July 2012 - 05:39 AM, said:

Think a 3 GB GTX 590 will be enough for 3D?


That doesn't have 3 GB; it has 1.5, I'm guessing. To be more exact, it has 1.5 GB per GPU. When doing multi-GPU you do not double the memory, since the frame buffer is loaded into both cards' VRAM. So even if you have four GPUs with 1.5 GB of memory each, you only have 1.5 GB.

I haven't seen a bench on a 590, but it should be OK. For 3D, check what the frame rates are and then cut them in half; that's what they will be. When you run in 3D it renders everything twice, which is why your frame rate gets cut in half.
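The two rules of thumb here (VRAM is mirrored, not summed, and stereo 3D halves frame rate) are simple arithmetic. A hedged Python sketch, with made-up function names:

```python
def effective_vram_gb(per_gpu_vram_gb, num_gpus):
    """SLI/CrossFire mirrors the frame buffer into every GPU's VRAM,
    so usable memory stays at one GPU's worth regardless of GPU count."""
    return per_gpu_vram_gb

def stereo_3d_fps(mono_fps):
    """Stereo 3D renders every frame twice (once per eye),
    which roughly halves the frame rate."""
    return mono_fps / 2

# GTX 590: two GPUs with 1.5 GB each -> still only 1.5 GB usable
print(effective_vram_gb(1.5, 2))  # 1.5
# A 60 fps benchmark result suggests roughly 30 fps in 3D
print(stereo_3d_fps(60))          # 30.0
```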

#14 Ouster

    Member

  • PipPip
  • 47 posts
  • LocationSilicon Valley

Posted 30 July 2012 - 11:04 PM

View PostsilentD11, on 29 July 2012 - 10:17 PM, said:



CrossFire/SLI = two of the same card, or one dual-GPU card that does it internally
TriFire/tri-SLI = three of the same card
QuadFire/quad-SLI = four of the same card, or two dual-GPU cards

For the most part those rules cannot be broken unless you're running a Lucid chip on your board, or some sort of hybrid CrossFire with an AMD IGP platform.

Also keep in mind a crap ton of chipsets don't perform fully in SLI or CrossFire when using two PCIe slots, because they have to cut them down. That's one of the reasons 1366 and 2011 sockets cost so much.



Well, I did a little more research from the AMD side, and to a lesser extent on the Nvidia side. From what I can tell, all recent AMD cards will CrossFire within the same major series, except maybe a few ultra-low-end models and some non-reference designs.
So for example, 79xx with 79xx can CrossFire. A 7990, which is a dual-GPU card, does still CrossFire with 7970 (single-GPU) cards and 7950 cards, so AMD does support such configurations (matching clock speeds are recommended to reduce performance loss). As for what's labeled on the cards, they are either CrossFire or CrossFireX. CrossFireX is a quality certification that guarantees quad CrossFire will work. My guess is that as long as a card has two link ports it will probably quad CrossFire whether it says CrossFireX or not, but at your own risk. Apparently even some of the lower-end cards can CrossFire in a software mode, even without a bridge connector, but this causes performance degradation. You do of course have to have proper PCI Express lanes on the motherboard and an appropriate chipset. The only other exception I know of is APU hybrid CrossFire; those only support one extra card in CrossFire mode.
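That same-series rule of thumb can be sketched as a quick check (a hypothetical illustration; the function name and the model-string parsing are made up for this example):

```python
def can_crossfire(model_a, model_b):
    """AMD's rule of thumb as described above: recent cards CrossFire
    within the same major series, identified here by the first two
    digits of the model number (e.g. '79' for the 79xx series)."""
    series = lambda model: model[:2]
    return series(model_a) == series(model_b)

print(can_crossfire("7970", "7950"))  # same 79xx series: True
print(can_crossfire("7970", "6970"))  # different series: False
```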

I totally agree that if you are going to run two or more GPUs, in general you should just go ahead and buy a board that supports four x8 lanes or better. But there's no reason you can't use something like a 7990 and a 7970 on a two-slot x8-lane board. The whole point of the external link connector is to guarantee sufficient bandwidth between the cards, but as always, the more bandwidth the better.

But from what I can gather, Nvidia just does not test and qualify all their cards for certain SLI configurations, so even though they may be perfectly capable of it, the driver locks you out. Beyond that I haven't had much luck finding information about Nvidia's SLI product line and its restrictions; I guess for Nvidia you just have to trust what it says on the product box and assume identical cards only.




