Shumabot, on 14 March 2013 - 12:22 PM, said:
640X800 per eye isn't actually that low. That's effective 1280x1600, which is roughly 50% better than a standard high def television.
It actually is that low, because your field of view is not simply split into two sections.
It's not like you can display half of the screen for one eye and half for the other and end up with an effective 1280x1600.
Your two eyes are both perceiving a view of the world that hugely overlaps. So the effective resolution is not simply the pixels for one eye added to those for the other. Look up screenshots of the Oculus, and you'll see that the view presented to each eye is virtually the same image. (Although the screenshots are generally shown at far higher resolutions than you see on the Oculus screens themselves.)
So, effectively, what you have is a 3D view of a scene being rendered at 640x800. Which is a horrifically low resolution to play a game at, especially given how close the screen is to the viewer. (Although they use some optics to effectively increase the density of the pixels near the center of your view.) And even the Oculus folks themselves acknowledge this... no one is foolhardy enough to think this is even remotely good enough for a consumer version. They are going to need to at least double it.
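To put rough numbers on the point above, here's a back-of-envelope comparison (a sketch, using the dev-kit figures from this thread) showing why stereo pixel counts don't simply add up against a 1080p TV:

```python
# Back-of-envelope pixel comparison. Because both eyes see nearly the
# same overlapping image, the effective scene resolution is roughly one
# eye's panel, not the sum of both panels.

def pixels(w, h):
    return w * h

per_eye = pixels(640, 800)     # dev-kit Rift: 640x800 per eye
naive_sum = 2 * per_eye        # the misleading "1280x1600" figure
hd_1080p = pixels(1920, 1080)  # standard high-def television

print(per_eye)    # 512000 pixels actually rendered per eye
print(naive_sum)  # 1024000, but the two views overlap almost entirely
print(hd_1080p)   # 2073600, roughly 4x the per-eye pixel count
```

So even by the naive doubling, the Rift dev kit falls short of 1080p, and the per-eye view that you actually perceive is about a quarter of it.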
Quote
As for the lag, every interview I've seen takes pains to explain, EXPLICITLY, that the lag is due to the fact that you're bouncing the image between three screens (rift to tv to camera) and that the rift itself has a latency that is in nanoseconds and is comparable with any other wireless control device. It has the same latency as your mouse assuming you don't live in wired ball mouse hell.
Folks I know who have seen it first hand have said that the claim that the latency is merely an artifact of the monitors is not actually true, despite what the Rift folks say. They've said that it does in fact have latency. While it's definitely better than many predecessors, it's non-zero, and in a fast-moving game you may see some serious problems. Now, that was before their most recent showing in Vegas, so perhaps it's better since then. But a few months back, the guy at Ars Technica, despite being hugely supportive of the technology and thinking it was awesome, also pointed out that it did make him feel kind of nauseous.
Even Luckey himself only claims that the Oculus can achieve a latency of 30-40 ms under PERFECTLY OPTIMIZED conditions, and that's not accounting for the actual display hardware lag. As you can see from Ciller's link to Carmack's article, as well as tons of other literature, you can actually perceive latencies down to around 20 ms.
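A quick latency-budget sketch makes the problem concrete. The display-lag number below is an assumption of mine purely for illustration; only the 30 ms best-case figure and the ~20 ms perception threshold come from the discussion above:

```python
# Illustrative motion-to-photon latency budget. Stage values are
# assumptions for the sketch, not measurements: even the claimed
# best-case pipeline plus some panel lag overshoots the ~20 ms
# threshold at which latency becomes perceptible.

PERCEPTION_THRESHOLD_MS = 20  # latencies above this are noticeable

budget_ms = {
    "tracking + render, claimed best case": 30,  # low end of the 30-40 ms claim
    "display hardware lag (hypothetical)": 10,   # assumed panel lag, not measured
}

total = sum(budget_ms.values())
print(total)                            # 40
print(total > PERCEPTION_THRESHOLD_MS)  # True: over budget even in the best case
```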
Again, I will be thrilled to be proven wrong. If they produce a consumer version of this, and it works as they claim it will, that will be awesome. But I will believe it when I see it.
As for Carmack, the guy is without question a genius... but he's also financially involved in this effort, so I think his opinions on it have to be taken with a grain of salt. That's not to say his thoughts are without merit... that's certainly not the case. But he isn't completely without bias in this regard.
Ultimately though, it comes down to this:
We are consumers. This does not exist as a product. Thus, "supporting" it doesn't actually mean anything yet. MWO could maximally support the Rift, and you still would not be able to use it, because it doesn't exist as something you can buy.