And if they go on like that, they can keep "their" operating system too. I'd have to make do with fewer games, but otherwise all the applications I need are available on Linux.
Well, Microsoft had a Windows XP and a Windows 7 in between those bad releases. Otherwise, Windows market share might have suffered a lot more.
Now it will be interesting to see if Windows 10 can be one of the good versions. But even if not, Microsoft can survive thanks to Windows 7 until Windows 11 comes out. Displacing a successful Windows version is very hard, to the point where a lot of people still run XP (even if I doubt the wisdom of that...).
On the one hand, I acknowledge that a website owner may run the site as he sees fit, which includes showing ads to earn some money.
On the other hand, I don't feel obliged to pay attention to the ads or even let them onto my computer. So I don't have any qualms about using software like NoScript.
In everyday use, I tend to allow unobtrusive ads that don't bog down my computer too much. The bogging down is noticeable at times, BTW: I'm sometimes on a measly 2 Mbit/s connection with an older PC, and then the bandwidth and CPU demands of ads can be significant.
At other times, loading websites goes slooowly because some ad server cannot keep up with the load (often ad.doubleclick.net; I've since blocked that site specifically).
Nope, Firefox all the way.
Adobe appear to be incompetent, but I don't trust Google to be not evil.
HTML5 does not have the abysmal history of exploits that Flash has. I would not mind Flash if Adobe were making decent software.
I just uninstalled Flash and tried Spiegel.de again. It still works. One more problem solved.
That's why the initiative to kill it should come from corporate users, not Adobe: they should simply stop using it.
Google using HTML5 for Youtube videos, so you don't need Flash for that anymore, was a good start. Next, I'd like to see Spiegel.de, the online branch of the German news magazine, follow. Last time I checked, they still used Flash for their video clips. A few more of those jumping ship and I won't really notice Flash missing from my computer...
I usually go by the performance index of www.3dcenter.org, which gives an average performance value relative to the Radeon HD 5750/6750 GDDR5, which is defined as 100%.
The index is not based on theoretical GFLOPS, but on tests by various review sites (mostly gaming) and calculated for benchmark results at 1920x1080 with 4x multisampling anti-aliasing.
This explains why Nvidia looks better in the 3dcenter.org ranking, as they usually get more gaming performance out of cards with the same GFLOPS.
3dcenter.org also calculates a performance/watt rating where they divide the performance index by the typical power consumption in games. The result is in percent of performance per watt, and as explained above it favors Nvidia. Of course, if you do something other than gaming, your results may differ.
The best result at the moment is for the GTX 980 4GB at 3.45, closely followed by the 750Ti at 3.44. I used the 750Ti as an example of a midrange card that still performs quite nicely compared to high-end cards of a few years ago. Its current market price is 130-145 Euro. The Fury X is listed with a performance per watt of 2.32.
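The rating is simple arithmetic: the performance index (in percent of the reference card) divided by the typical gaming power draw. A minimal sketch, with made-up placeholder numbers rather than 3dcenter.org's actual index values:

```python
# Sketch of a 3dcenter.org-style performance-per-watt rating.
# The index is in percent relative to the Radeon HD 5750/6750
# GDDR5 (= 100%); the numbers below are hypothetical placeholders.
def perf_per_watt(index_percent: float, typical_watts: float) -> float:
    """Performance index (%) divided by typical gaming power draw (W)."""
    return index_percent / typical_watts

# Hypothetical midrange card: index 210%, drawing 61 W in games.
print(round(perf_per_watt(210, 61), 2))  # 3.44
```

A card can land near the top of this ranking either by being very fast or by being very frugal, which is why a midrange card like the 750Ti sits right next to the GTX 980.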
BTW, Wikipedia says that
Full-height cards may increase their power after configuration. They can use up to 75 W (3.3 V × 3 A + 12 V × 5.5 A)
Graphics card manufacturers routinely use that to save a few cents on the extra connector.
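As a quick sanity check on the quoted figure, summing the two rails from the Wikipedia quote above:

```python
# PCIe slot power budget per the quoted rail limits:
# 3.3 V rail at up to 3 A, 12 V rail at up to 5.5 A.
rail_3v3 = 3.3 * 3.0    # 9.9 W
rail_12v = 12.0 * 5.5   # 66.0 W
total = rail_3v3 + rail_12v
print(round(total, 1))  # 75.9 -- commonly rounded down to the 75 W slot limit
```

So a card like the 750Ti, at 60-70 W, fits comfortably within what the slot alone can deliver.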
In recent years, Nvidia have made big strides in reducing their power consumption for a given performance. You can buy the "latest and greatest" in performance, which will outperform older cards, OR you can get similar performance in a smaller, cooler, and cheaper package. The 750Ti comes to mind:
It is "only" a midrange card, but with a power consumption of 60-70W it does not even need an additional PCIe power connector.
Recently, AMD are also getting closer with HBM on the Fury (although they are still falling a bit short of Nvidia).
If you think back a few years, the roles were reversed BTW:
Nvidia was still on the Fermi (also derided as "Thermi") architecture and significantly less efficient than AMD's HD5xxx series.
Looking forward, the
Really off topic but I'll bite:
1600x900 here, on a one-year-old Fujitsu Lifebook E 782, which is overall a nice but not really high-end system. Typical prices used to be in the 900-950 Euro range a year ago. Today, the E554 seems to be its equivalent, available with a 1920x1080 display for a bit over 800 Euro.
It seems that your management was just being cheap when they stuck you with laptops with a "720p" display.
Well, GP explicitly mentioned gamers, who are quite the opposite of business users when it comes to upgrading.
Where businesses hesitate to upgrade, often to the point of running unsupported OS versions because "it works" (until it doesn't...), gamers eagerly chase the latest and greatest.
That includes stuff like running alpha versions and beta drivers, and since Windows has by far the largest selection of available games, it has to be Windows for most of them. I predict that DirectX 12 really will drive adoption of Windows 10 in the home user market.
If that was true, I might pirate Windows and run Libre Office as my office suite (at home). Gaming is the only thing I still need Windows for...
In any event, MS would be ill-advised to open source anything. As soon as they do, they are no longer the only source for updates, and once they are no longer the only source for updates, they will no longer be the *best* source for updates, since it is likely that a young upstart company with some intelligence behind it would be able to run rings around MS.
It would still be the only official source for updates. Windows would almost certainly not accept third-party sources by default.
It would be the only official source for updates for the original version. But if Windows was entirely open source, not just a few components, the third party version could accept whatever the third party wants.
I think it could be similar to CentOS cloning RedHat:
Switch out the trademarked logos, perhaps change a few URLs in the update mechanism to point to the third party's servers, and run the whole thing as an "alternative" system...
I guess Intel might eventually get there. For midrange stuff at least. Their Iris Pro GPUs are already getting close to AMD's APUs.
But it is quite possible that they keep it as a high-priced "laptop exclusive", especially if AMD goes tits up.
Yes, but at least the overall efficiency is similar to Nvidia now.
Personally, I find the Fury too expensive and power-hungry, so I might get a new machine next year or in 2017, when Zen and midrange HBM graphics cards are available.