Don't conflate gameplay quality with engine / graphics quality - they are orthogonal.
You can help the discussion by telling us the name of the engine you created, and the list of commercially released games that used it. We need some perspective.
It wasn't Moore's law that kicked us in the butt, it was the growing gap between CPU / GPU speeds and memory access speeds. CPUs kept getting faster at a decent pace while memory lagged massively behind, which forced new strategies for writing fast code.
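To put that in concrete terms, here's a minimal Python sketch (purely illustrative, not from any particular engine) of the kind of access-pattern thinking that gap forces on you: the same work done in memory order vs. jumping around in memory.

```python
N = 1000
grid = [[1] * N for _ in range(N)]  # a 1000x1000 grid of ones

def sum_row_major(g):
    # Visit elements in the order the inner lists store them,
    # so each row is scanned contiguously - cache-friendly.
    total = 0
    for row in g:
        for v in row:
            total += v
    return total

def sum_col_major(g):
    # Jump to a different row on every single access,
    # defeating the prefetching of adjacent data.
    total = 0
    for c in range(len(g[0])):
        for r in range(len(g)):
            total += g[r][c]
    return total
```

Both functions compute the same answer; on large data the second one pays for every cache miss, which is exactly the cost that stopped tracking Moore's law.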
This is pure rhetoric and chest thumping until you tell us exactly which engines you wrote and the products that used them. Making 11 shitty low grade projects that no-one cares about is a little different to making even a single AAA engine.
CRYENGINE has no royalty cost attached to it. Unlike the other engines, you can make and release a game without paying a single cent of royalties. The very minor cost of $10 / month is basically chicken feed to anyone able to afford a PC and is just to keep the lights on.
By contrast, with Unreal you will be hit for massive amounts of cash the second your revenue goes over a set threshold. Unity hits you straight up and again in the rear.
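For a rough comparison - and note the 5% rate and $3,000-per-quarter threshold here are my assumption based on Unreal 4's published terms at the time, so check the current EULA - the arithmetic looks like this:

```python
def unreal_royalty_per_quarter(gross_revenue):
    # Assumed terms: 5% of gross revenue above $3,000 per quarter.
    return max(0.0, gross_revenue - 3000) * 0.05

def cryengine_cost_per_quarter():
    # Flat subscription, no royalty: $10 / month.
    return 10 * 3

# A modest hit grossing $100,000 in a quarter:
# Unreal royalty: (100000 - 3000) * 0.05 = $4,850
# CRYENGINE: still $30
```

Below the threshold the engines cost about the same; the difference only bites once a game actually makes money.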
How big were the meters they were using to measure the depth? Given that information, we can work out how deep 100 meters is. Alternatively, give us the depth in metres.
Which means it's now on the long slow slide into obsolescence. I figure it's good for software updates for maybe another 12 months; after that it's just a matter of time until some software I need / want requires an OS version the tablet can't support.
The tablet had an earlier version in 2012 and a refresh in 2013 that was a major hardware update. It's basically being retired after only 2-3 years, which is a shame, since it's quite a decent little tablet - the 2013 refresh, at least.
I expect it will still give me a couple more years of happy service in any case, and since that meets my "five year lifetime" criteria for buying, I'm still relatively happy.
It's working just fine on mine, though YMMV as they say.
In other words...motherfucking Google! Must you kill everything I love!
I'm not insisting any such thing. All I stated was that the i7 line of processors is now 7 years old; people read into that whatever they wanted to hear. I know full well the laptop will be using a modern chip from that line.
The only thing this laptop has that's reasonably new and interesting is the 4K display, but that's completely wasted on a 15-inch panel. It would only be useful when hooked up to a large external 4K monitor, where you can actually see the extra resolution it provides.
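The pixel-density arithmetic backs this up. A quick sketch (the 15.6" and 32" diagonals are illustrative sizes, not taken from this laptop's spec sheet):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    # Pixels per inch along the panel's diagonal.
    return math.hypot(width_px, height_px) / diagonal_in

# 4K UHD (3840x2160) on a 15.6" laptop panel vs. a 32" external monitor:
laptop_ppi  = ppi(3840, 2160, 15.6)   # ~282 PPI - far denser than the eye resolves at laptop distance
monitor_ppi = ppi(3840, 2160, 32.0)   # ~138 PPI - where the extra pixels become visible working space
```

At ~282 PPI on a small panel you're mostly paying in GPU load and battery; the same pixel count spread over a big desk monitor is where 4K earns its keep.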
While 16GB of RAM and room for 2 hard drives are ok features, they're hardly anything worth talking about. Not in this decade, maybe in the last.
Haswell is well suited for use in laptops, so it's not really surprising that manufacturers are shipping devices using Haswell.
The model he mentioned does at least seem to have a nod in the right direction for video cards, instead of the usual garbage most laptops have installed - but then, for a starting price of $2355 I guess you'd expect something decent.
You're still completely wrong.
That would be about the only reason I could imagine for two drives on a laptop.
I only stated that the i7 was available 7 years ago, in Nov 2008, according to Intel's own information.
Oh great, a technology that gimps your processor to stop it overheating. Nice!