Cheap laptops (barely more than the price of netbooks, and eventually cheaper and better spec'ed than netbooks) killed it.
Both you and TFA are wrong. Manufacturers killed the netbook because once enough of them joined the fray and started competing, they eroded their margins so much that they forced the market into "chromebooks" or otherwise gimped netbooks that were cheaper to license.
Start with some practical code/app samples that demonstrate a clear problem. Ask them to participate -- see if they can solve it. Then show how the original dev got themselves into that trouble, and the "professional" practice you would have applied to avoid it.
Bonus points for going comedic with examples that are ridiculous to the extreme.
It's 2017 and Visual Studio is still 32-bit.
Unless you have specific use cases, 64-bit doesn't automatically mean better. Most apps don't need the extra address space, and jumping to 64-bit means doubling your pointer sizes, which increases memory usage, reduces locality, and puts more pressure on the cache.
In VS's case they did the math and 32-bit was better. They've said this for years now. It's not a bad thing.
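The pointer-doubling effect is easy to see with a quick sketch (Python's `ctypes` here, with a made-up `Node` layout standing in for the kind of pointer-heavy tree/list structures an IDE keeps millions of in memory):

```python
import ctypes

# Hypothetical pointer-heavy record. In a 64-bit process each pointer
# is 8 bytes; the same layout in a 32-bit process uses 4-byte pointers
# and roughly half the memory per node.
class Node(ctypes.Structure):
    _fields_ = [
        ("left",   ctypes.c_void_p),
        ("right",  ctypes.c_void_p),
        ("parent", ctypes.c_void_p),
        ("value",  ctypes.c_int),
    ]

print("pointer size:", ctypes.sizeof(ctypes.c_void_p))  # 8 on 64-bit
print("node size:   ", ctypes.sizeof(Node))             # 32 on 64-bit, ~16 on 32-bit
```

Halving the size of every node also means twice as many of them fit in each cache line, which is the locality argument in a nutshell.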
Most people spend more on their phone. Or on food. Or on vacations. This is just another form of entertainment to budget for; are you really too myopic to see that?
For people who want to use VR, or who have a 4K screen, or have a 144Hz monitor, you literally can't get by on anything but high-end. Display tech is outpacing graphics cards right now.
It might add a small number of coders to the workforce who would otherwise never have tried it out. But that is probably insignificant. What I'm more interested in is priming people to "get" tech.
Working at small non-tech companies for the past ~5 years really opened my eyes to how completely unready for tech most people are. Old, young, it doesn't matter -- so many people have no comprehension of what developers do beyond maybe "magic", and practically shut down when asked to participate.
We don't need America to have more developers, but we do need America to get ready for a world where every company is a tech company.
Carmack posted something pretty long saying he was not only extremely disappointed in Zenimax's expert witness, but was essentially barred from seeing the evidence he used. How can you remove stolen code if you don't know what to remove?
While it appears that Zenimax is going for the jugular here, it is almost certainly a negotiating tactic to get a large stake in Oculus. They're not interested in VR, but it would be a safe way for them to keep a foot planted in the market should it become big enough.
Honestly, very few politicians, regardless of which side they sit on, are on the side of tech. Democrats may be better on some issues, but by and large they're morons when it comes to tech too.
What we really need is for everyone to write their representatives to inform them about the issues that are important to them. We need a Neil deGrasse Tyson equivalent for tech, someone who can straddle the line between entertainment and education to keep the public informed and fight for what's sane.
The iPhone isn't the in thing it once was. I'm surprised to see them making it dramatically more expensive.
Let's see if it pays off.
Wireless charging is the next step toward eliminating the charging port. I've been using wireless charging for about 4 years now and I've only ever plugged in to upload music. I'd guess most people don't even do that.
A while ago, someone made the nnedi upsampler that uses neural networks to upsample. It's still one of the best image upsamplers available.
Google's approach is quite a bit different. Where nnedi worked to better extract detail that was already in the image, Google seems to literally fill in detail that was plausibly in the source but may not have been. Much, I guess, like how our own memories work. It's an interesting approach and the results look quite fantastic. My only question is how well it will work on a random sampling of images.
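For contrast, here's a minimal sketch of classical (non-learned) upsampling -- nearest-neighbor, with `upsample_nearest` as a made-up helper name. Every output pixel is a copy of an existing input pixel, so no new detail is ever created; that's exactly what a learned model like Google's goes beyond:

```python
def upsample_nearest(img, factor):
    """Nearest-neighbor upsampling of a 2D list of pixel values.
    Output pixels are copies of input pixels -- nothing is invented."""
    return [
        [img[y // factor][x // factor]
         for x in range(len(img[0]) * factor)]
        for y in range(len(img) * factor)
    ]

tiny = [[0, 255],
        [255, 0]]
print(upsample_nearest(tiny, 2))
# [[0, 0, 255, 255], [0, 0, 255, 255], [255, 255, 0, 0], [255, 255, 0, 0]]
```

nnedi and its successors squeeze more out of the same pixels with smarter interpolation; the learned approach instead predicts what a plausible high-resolution source would have looked like.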
To be fair though, sometimes they do come out with complete BS. The claim that the sum of all natural numbers is -1/12 is a good example. It's not.
They don't claim -1/12 is the only answer, only that it is a valid answer. IIRC they present three or four possible answers and explain why each of them is valid, however unintuitive the theory behind them may be. They did not claim -1/12 is "more correct" than the others.
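For reference, the -1/12 value comes from analytic continuation of the Riemann zeta function, not from the series actually converging -- a sketch:

```latex
% The series itself diverges:
\sum_{n=1}^{\infty} n = 1 + 2 + 3 + \cdots \to \infty
% The Riemann zeta function
\zeta(s) = \sum_{n=1}^{\infty} n^{-s}, \quad \operatorname{Re}(s) > 1
% extends analytically to the whole complex plane except s = 1,
% and the extended function takes the value
\zeta(-1) = -\tfrac{1}{12}
% Assigning that value to the divergent series is a regularization,
% not an ordinary sum -- which is the whole disagreement in a nutshell.
```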
To enjoy 4K, you need a monitor that supports it, one large enough relative to the viewing distance, and enough bandwidth and processing power. You also need a 4K source. Few people produce 4K video: it is more expensive, more difficult, and the result is only marginally better.
I think you'll find these boxes are checked more and more.
On the consuming side, 4K monitors are coming down in price very quickly and are at the point where it's reasonable for the layperson to get one. 4K makes a notable difference on a 24" monitor at the common 2-3' distance -- anyone who says otherwise has bad eyesight or hasn't used one yet. Bandwidth-wise, 4K uses about 20-30 Mbit/s, which is within reach for a lot of users these days. With H.265 they should be able to drop that number considerably for most videos.
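Rough arithmetic on those numbers (taking the middle of the 20-30 Mbit/s range, and assuming a ~50% H.265 saving -- a commonly cited ballpark, not a guarantee):

```python
# Back-of-the-envelope: what streaming an hour of 4K costs in data.
bitrate_mbps = 25        # middle of the 20-30 Mbit/s figure above
hour_seconds = 3600

# Mbit -> MB (divide by 8) -> GB (divide by 1000)
gb_per_hour = bitrate_mbps * hour_seconds / 8 / 1000
print(f"1 hour of 4K at {bitrate_mbps} Mbit/s: {gb_per_hour} GB")

# Assumed ~50% reduction with H.265/HEVC -- content-dependent in practice.
hevc_gb = gb_per_hour * 0.5
print(f"Same hour with H.265 (assumed 50% saving): {hevc_gb} GB")
```

So an hour of 4K is on the order of 11 GB today, and H.265 could plausibly bring that under 6 GB -- well within a typical broadband cap.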
On the production side, 4K video is becoming increasingly common on YouTube as the latest inexpensive professional and amateur cameras -- even phones and GoPros -- all support 4K. Editing really isn't much different from 1080p -- it's not like they're using render farms to create special effects.
I find your lack of faith in the forth dithturbing. - Darse ("Darth") Vader