Stormwolf, on 22 September 2014 - 12:06 AM, said:
I work in IT, 10 years is too long a time to make accurate predictions. Hell, I remember when the first CD-ROM players came on the market, burning ROMs at home seemed impossible, yet 4 years later you could even do that. There are far too many unexpected developments to make any predictions.
Though you are right that CPUs have hit a brick wall and that we had to move to multicore systems because of it. Even now, various ways around this are being explored, but there isn't a definite solution.
The problem is that multicore systems quickly run into Amdahl's Law: the speedup from adding more and more threads hits diminishing returns fast, because the serial portion of a workload caps how much you can ever gain.
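To put rough numbers on that, here's a quick Python sketch (the 90% parallel fraction is just something I'm assuming for illustration, not a measurement of any real workload):

# Amdahl's Law: speedup = 1 / ((1 - p) + p / n)
# p = fraction of the work that parallelizes, n = number of cores
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

p = 0.90  # assume 90% of the workload parallelizes
for n in (2, 4, 8, 16, 64):
    print(n, "cores:", round(amdahl_speedup(p, n), 2), "x")
# 2 -> 1.82x, 4 -> 3.08x, 8 -> 4.71x, 16 -> 6.4x, 64 -> 8.77x
# Even with infinite cores, the ceiling is 1 / (1 - p) = 10x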
The decades-long explosion of computing technology revolved entirely around component miniaturization. Now that's no longer really getting us anything. Even though it turns out we can shrink a little further before quantum tunneling becomes a problem than scientists were predicting a handful of years ago, doing so isn't serving much of a purpose. That's largely why CPU performance hasn't budged much since Sandy Bridge.
Yes, sure, new steps could be taken. We could move to graphene chips, or light-based computing as HP is working on, or quantum computing for those tasks where it's actually faster (which isn't everything), or numerous other directions, but miniaturization easily netted us doubling performance every couple of years or so, sometimes faster, for decades. That's exponential growth, continuously. Now we've picked all that low-hanging fruit, and even a revolutionary step might net us, like, one doubling of performance, at enormous cost, instead of the half dozen or more doublings that came in a typical decade just as a matter of normal, relatively cheap fabrication process development.
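Just to make that math concrete (back-of-the-envelope only; the 18-24 month doubling cadence is the usual rule-of-thumb assumption, not a measured figure):

# Rough count of performance doublings per decade at the old cadence
for months_per_doubling in (18, 24):
    doublings = 10 * 12 / months_per_doubling
    print(months_per_doubling, "months per doubling:", round(doublings, 1), "doublings, ~", round(2 ** doublings), "x per decade")
# 18 months -> ~6.7 doublings -> ~100x per decade
# 24 months -> 5 doublings -> 32x per decade
# versus a single revolutionary step netting one doubling: 2x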
I think it's safe to say things are going to last a lot longer than we've been used to. My standard for hardware replacement is usually the existence of something twice as powerful, without massive price hikes over what I own (so no $1000 CPUs). By that standard, I expect my 3570k to remain viable until close to, if not all the way to, 2020 before something comes out in the $200-$350 range that's twice as fast. This is especially true because new Intel chips on smaller processes increasingly lack overclockability. Doubling my 3570k at 3.6 GHz (its max stock 4-core turbo) is going to be hard enough, but doubling it at 4.2 GHz? I won't hold my breath. For GPUs, my 7970 might take until 2016/2017 to become replaceable, and that's assuming I'm willing to spend more. If I purchased a GTX 970 right now, I'd expect it to last longer still; I wholly expect one purchased today could stay quite viable until decade's end.
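Here's the doubling-time math behind my pessimism (the per-year gain figures are purely illustrative assumptions on my part, not benchmarks of any particular chip):

import math

# Years until something is twice as fast, given small year-over-year gains
for yearly_gain in (0.05, 0.10, 0.15):
    years_to_double = math.log(2) / math.log(1 + yearly_gain)
    print(f"{yearly_gain:.0%} per year -> ~{years_to_double:.1f} years to double")
# 5% per year  -> ~14.2 years
# 10% per year -> ~7.3 years
# 15% per year -> ~5.0 years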
Like the OP says, RAM is even more glacial; only modest speed increases, plus price drops to bring faster kits into the mainstream, have been necessary. Sure, we've hit a point where a few programs can saturate DDR3-1600, but DDR3 goes much higher at reasonable prices: 2133 is cheap, and 2400 and up is getting there. It's going to be a long time before we need more RAM performance than DDR3-2400 can provide, so DDR4 is mostly useless to the consumer.
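For a sense of how much headroom that is, a rough peak-bandwidth calculation (assuming an ordinary dual-channel desktop setup, 64 bits per channel):

# Peak theoretical bandwidth: transfer rate (MT/s) * 8 bytes/transfer * 2 channels
for mts in (1600, 2133, 2400):
    gb_per_s = mts * 8 * 2 / 1000
    print(f"DDR3-{mts}: ~{gb_per_s:.1f} GB/s peak, dual channel")
# DDR3-1600: ~25.6 GB/s, DDR3-2133: ~34.1 GB/s, DDR3-2400: ~38.4 GB/s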