Oh, no doubt. It's just that Nehalem came a bit too soon (and I was more broke then), and afterwards nothing seemed that exciting. I thought I'd do it with Haswell, what with TSX, AVX2 and whatnot, but then the -K series got crippled for product-differentiation reasons, plus the heat issues and the overall focus on power efficiency over performance, and just... meh.
And while I'm sure the processor is limiting the graphics card somewhat, as far as I could tell, it's not by much.
Let's say my overclocked Q6600 is roughly as fast as the AMD A10-5800K (good job, AMD!):
The A10, while being the slowest of the bunch, delivers just 10 fewer fps in Crysis than the 4770K:
Yeah, it's worse in the worst case, but again: meh.
But I am going to do it next year for sure, so I hope Nvidia is working hard on Maxwell as well!
Yeah, our department is now fully on laptops, and like 80% of the people never take them off their desks or even use the built-in input devices. Good for me, though, as I'm going to get myself some awesomely pristine depreciated assets once the refresh cycle comes around.
>Where's the "Like" button?
That's the "Insightful" or "Interesting" option, which you don't have but I do. Oops!
Haswell significantly reduced power consumption and improved graphics performance, so that Joe Blow can read My Spacebook for longer and get a higher framerate in Tankville or whatever.
But yeah, in general progress has been quite disappointing recently. My desktop PC is still a Q6600-based machine, which is what, 2-3 times slower at raw computing than a top-of-the-line i7 even in SIMD-heavy apps, and that's at stock speeds. I overclocked it a bit, and with a mid-range GTX 650 Ti Boost it runs Crysis 3 at high settings at >40 fps in all but a few locations at 1080p (which is better than what the Xbone can manage, apparently). All of which doesn't sound too bad, but my CPU is almost 7 years old now, for fuck's sake!
With each of the past several CPU generations I've been meaning to upgrade, only to conclude that another 5-10% improvement just wasn't worth it. C'mon Intel, don't mess up Broadwell now, I'm counting on you! I've been rational/cheap for too long and want shiny new things!
>Where exactly do they plan on releasing these chimps at? NYC?
Yeah, and what's the problem with that plan? They'd fit right in...
Yes, but sadly their votes would only be counted as 2/3 of a human citizen's.
But how would a traditional relational database scale to 1 billion, or 1,000 billion, users, huh? Did you think about the need to future-proof the application?
Too bad Microsoft just burned all that money instead of paying some poor $100k+ engineers to develop the controller!
The difference between your hack, the many existing head-mounted displays, and the Rift is how seriously they're taking a) the FOV and b) latency. These two factors are critical to making VR feel like something more than a display strapped to your head. This is also why I don't see this working very well with the PS4 either, unfortunately. For the Rift, and really any VR, to work well you need high resolution, high framerate, and low latency, which is not something the consoles are very good at.
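To put some very rough numbers on the latency point (every figure here is a ballpark assumption, not a measurement of any actual headset):

```cpp
// Ballpark motion-to-photon latency budget; all numbers are illustrative
// assumptions, not measured values.
#include <cstdio>

static double pipelineMs(double fps) {
    const double sensorMs  = 2.0;           // assumed tracker sampling + transfer
    const double renderMs  = 1000.0 / fps;  // one full frame of rendering delay
    const double displayMs = 10.0;          // assumed scanout + panel response
    return sensorMs + renderMs + displayMs;
}

int main() {
    std::printf("30 fps (console-ish): ~%.0f ms motion-to-photon\n", pipelineMs(30.0));
    std::printf("60 fps:               ~%.0f ms\n", pipelineMs(60.0));
    std::printf("120 fps:              ~%.0f ms\n", pipelineMs(120.0));
    // The figure usually thrown around for VR that "feels solid" is under
    // ~20 ms, which a 30 fps pipeline can't even get close to.
}
```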
As for content, I think this type of VR proposition is way better off than the 3D TV/video stuff, as most 3D games can easily be adjusted to render the scene twice from slightly different camera positions. You don't really even need official dev support for this; it's something that can be done at the driver level for any DX/OpenGL game.
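Roughly like this; renderScene() is a hypothetical stand-in for whatever draw path the game (or an injected driver shim) already has, and the IPD is just a typical value:

```cpp
// Minimal sketch of "render the scene twice": shift the camera half the
// interpupillary distance (IPD) per eye and draw each eye into its own half
// of the framebuffer. renderScene() is a hypothetical stand-in for the
// game's existing draw path.
#include <cstdio>

struct Vec3 { float x, y, z; };

// Stand-in for the game's normal render call (hypothetical).
void renderScene(Vec3 eye, int viewportX, int viewportWidth) {
    std::printf("draw from eye (%.3f, %.3f, %.3f) into viewport x=%d w=%d\n",
                eye.x, eye.y, eye.z, viewportX, viewportWidth);
}

void renderStereoFrame(Vec3 camPos, Vec3 camRight, int screenW) {
    const float halfIPD = 0.032f;  // ~64 mm total eye separation, assumed

    // Shift along the camera's own right vector, not world X.
    Vec3 leftEye  = { camPos.x - camRight.x * halfIPD,
                      camPos.y - camRight.y * halfIPD,
                      camPos.z - camRight.z * halfIPD };
    Vec3 rightEye = { camPos.x + camRight.x * halfIPD,
                      camPos.y + camRight.y * halfIPD,
                      camPos.z + camRight.z * halfIPD };

    renderScene(leftEye,  0,           screenW / 2);  // left half of the panel
    renderScene(rightEye, screenW / 2, screenW / 2);  // right half
}

int main() {
    renderStereoFrame({0.0f, 1.7f, 0.0f}, {1.0f, 0.0f, 0.0f}, 1920);
}
```

The Rift's lens distortion would still need an extra warp pass on top of this, but the core stereo part really is about that simple.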
I'm not really sure about the Oculus API policy, but Valve hasn't been particularly "open" about anything so far either. In any case, although the Rift might start off with a single model, that doesn't always have to be the case; it just makes sense to start with one unified offering rather than immediately confusing the market with a bunch of versions.
Anyway, what makes this hard for me to predict is a fundamental part of the experience: you have to strap something to your head and isolate yourself completely from the real environment, which is a rather dorky proposition that not everyone will accept. Still, even if none of the mainstream games support the Rift/Valve VR, a high-resolution, low-latency HMD is something I'd definitely want anyway.
Have you tried the Rift? While you're correct that just having a perfect picture won't get you the full experience of jacking into the Matrix, it actually does go a long, long way toward making the experience extremely immersive.
A lot of our senses actually depend on our vision to work properly. For instance, you'd think an airplane pilot would be able to tell which way is up based on what their body feels, but losing sight of the outside world is actually a good way to completely mess up their perception of position and direction. With the inner ear unable to settle the question one way or the other, realistic, lag-free video works pretty well at making you feel like you're experiencing whatever is being shown.
Well, as long as they make taking photos with a tablet an offense punishable by death, carried out on the spot, I don't have a problem with the rule.
Could be a gearing issue, or rather the lack of gearing, to be more specific. It seems to get up to top speed reasonably quickly, but once the motor hits its speed limiter, that's it: you can't upshift, even if the motor is still making enough power to overcome wind and rolling resistance.
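Back-of-the-envelope, with completely made-up numbers since I don't know the actual specs:

```cpp
// Why a single fixed ratio caps top speed at the motor's RPM limiter no
// matter how much power is left over. All numbers are made up to illustrate.
#include <cstdio>

int main() {
    const double kPi           = 3.14159265358979;
    const double motorMaxRpm   = 6000.0;  // assumed limiter
    const double gearRatio     = 9.0;     // assumed single fixed reduction
    const double wheelDiameter = 0.6;     // meters, assumed

    // Once the limiter kicks in, top speed is purely kinematic:
    // wheel revs * circumference, converted from m/min to km/h.
    const double wheelRpm = motorMaxRpm / gearRatio;
    const double kmh = wheelRpm * kPi * wheelDiameter * 60.0 / 1000.0;

    std::printf("capped at ~%.0f km/h by the limiter\n", kmh);
    // A taller second gear would raise that cap, as long as the motor still
    // makes enough torque at the taller ratio to beat drag.
}
```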
Sure, but for now AMD and Nvidia seem happy rebadging previous-gen chips with new names and calling it a day. 2014 is almost here and still nobody knows anything about Maxwell, which was supposed to be shipping by this point. With huge per-generation improvements and a significant process advantage, Intel could really put the hurt on them in the lower end of the market, which is the majority of it.
What about all the potential electricity users who won't get anything, because everything except perhaps natural gas power plants is more expensive, and the poor countries won't be able to afford as many of them?