A pleasant event... don't they mean a pheasant event?
This is a real problem; I have experienced it myself on my own computers and at work, where I support some 4,000+ PCs in IT. The help desk probably re-images a dozen machines a week partly due to this problem. For me it is often because I download various interesting software packages and then am too lazy to uninstall them. And a large percentage of software does not uninstall cleanly - not really a Microsoft problem there, not completely.
Personally I kind of like Windows 8.1, and I really only think it was a marketing flub to try and force a touch interface on people. Bad Microsoft, no biscuit. But I digress...
* On some machines we use a product called Citrix Provisioning Server (formerly Ardence) to boot the machines off a read-only network drive, plus other software that saves select important user settings and files. We refer to this as "stateless," and it is the closest you can get to being immune to this problem - unless you really have skillz and screw up the master image it is based on. This has been the "always runs like new" experience for us.
Other ways to achieve a similar effect:
* Windows 8.1 includes a FULL version of Hyper-V, a Type 1 hypervisor that is fast (you could use others as well, of course). Basically, install Windows twice: one copy is the host, with nothing on it but the guest. Then immediately take a snapshot of the guest. Use that VM for web surfing or any activity that will introduce cruft, and you can always revert to the snapshot and be pristine once again (of course you will need to redo updates, re-install software, etc.). This also lets you use Win7 as the guest, if you like that OS better. XenClient Enterprise is another nice one, but it costs money (no, I don't work for Citrix, but I am a Citrix admin). Oh, and although this is similar in effect to backing up with an image, it is much faster and you don't have to buy something like Acronis (although it is nice). I can't recommend things like VMware Workstation, VMware Fusion, Parallels, etc., because Type 2 hypervisors like these cause a big performance hit for everyday use, especially if you have invested in a nice machine and want to take advantage of it. They have good utility for other things, though.
Make your own thin client
* If you have access to Microsoft Enterprise licensing, you can use the ThinPC version of Windows, which is made to turn a PC into a thin client and includes a write filter. Such a machine will not retain anything on reboot, so you would need a way to save data and settings elsewhere. But you can turn the write filter off to install things permanently and then turn it on again, effectively making an "appliance" with the apps you need that doesn't really get slow over time (at least nowhere near as much, or as fast). Great way to test things.
Microsoft did build a "Refresh your PC" feature right into Windows 8.1, but I will admit I have not tried it myself yet.
At least this way we know where the Neanderthals ended up...
or just 2160p as it should be called
Movies come in different aspect ratios. At 1.78:1 you get 1080p or 2160p. At the also-popular 2.35:1 you get ~817p, and 720p likewise becomes ~544p. Those vertical numbers aren't really helpful for comparison, since "817p" isn't actually lower resolution than 1080p - both are 1920 pixels wide. Only the horizontal resolution is constant, so it actually makes sense to use it. The use of vertical resolution comes from the days of analog TV, when only horizontal resolution was continuous rather than discrete.
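The line counts above follow from simple arithmetic: keep the pixel width fixed and divide by the aspect ratio. A quick sketch (the widths and ratios are the ones from the comment; the helper name is made up):

```python
def lines_at_ratio(width: int, ratio: float) -> int:
    """Active scan lines when a frame `width` pixels wide has aspect ratio `ratio`:1."""
    return round(width / ratio)

print(lines_at_ratio(1920, 16 / 9))  # 1080 -- "Full HD" at 1.78:1
print(lines_at_ratio(3840, 16 / 9))  # 2160 -- UHD at 1.78:1
print(lines_at_ratio(1920, 2.35))    # 817  -- a 2.35:1 movie on a 1920-wide frame
print(lines_at_ratio(1280, 2.35))    # 545  -- likewise on a 1280-wide (720p) frame
```

(The "~544" in the text comes from truncating 544.68 instead of rounding; either way the width stays 1280, which is the point.)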
(I'm sure the marketing folks were salivating over it anyway.)
Also, while I haven't watched your hour-long video (summary?), I'm not sure why anyone would target 4096 pixels wide, which would make upscaling existing HD very painful. Doubling the resolution is much simpler, and I very much doubt that 4K was ever a spec as opposed to a marketing term.
Indeed, upscaling existing 1920x1080 to 4096 pixels wide (at whatever aspect ratio) would be painful - just as downscaling the 2K and 4K that movies are shot in to 1920x1080 and 3840x2160 is today, and it wouldn't have to be if the formats matched. That is one of the points brought up in the video.
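The scaling pain is easy to see numerically: only an exact 2:1 (or 1:2) ratio maps source pixels cleanly onto target pixels, and of the widths discussed above only 1920 -> 3840 has that property. A small sketch (resolution labels are mine):

```python
# Width-to-width scale factors between home and cinema formats.
# A factor of exactly 2.0, 1.0, or 0.5 maps pixels cleanly; anything
# else forces resampling, which blurs detail across neighbouring pixels.
pairs = [
    ("1080p (1920)", 1920, "UHD (3840)", 3840),
    ("1080p (1920)", 1920, "DCI 4K (4096)", 4096),
    ("DCI 2K (2048)", 2048, "1080p (1920)", 1920),
    ("DCI 4K (4096)", 4096, "UHD (3840)", 3840),
]

for src_name, src_w, dst_name, dst_w in pairs:
    factor = dst_w / src_w
    clean = "clean" if factor in (0.5, 1.0, 2.0) else "messy"
    print(f"{src_name:>14} -> {dst_name:<14} factor {factor:.5f} ({clean})")
```

Only the first pair comes out "clean" (factor 2.0); 2K -> 1080p and 4K -> UHD are both an awkward 0.9375, which is exactly the conversion loss the comment is complaining about.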
Other points in the video cover how resolution isn't the only factor that makes the newer formats better; it is not even the most important one. The new formats also come with a wider color gamut, better compression algorithms, and so on. But one of the main points is the problem of cleanly scaling movie formats down to home formats. They had an opportunity here to stop doing that, and they blew it.
Yep, it should be called 2160p.
4K is already reserved for 4096x2160, which is what movie camera sensors and theatre projectors use.
Absolutely. I especially love that every "4K*" TV already comes tagged with an asterisk, with a sticker at the bottom of the TV saying "*3840x2160"... Even the lawyers knew this was a bad idea.
The TV resolution specifications (720p, 1080i, 1080p, etc) were set in the 90s. It was after this that digital movie recording started with a slightly different "2K" resolution. They are different display mechanisms after all, the home TV and the cinema - even if the home TV is approaching cinema size (factoring in viewing distance).
2048x1080 is a stupid resolution. 2048x1152 would be saner, as it's 16:9. Maybe that is what Full HD should have been originally, instead of 1080 lines. Too late now.
"8K" in the home will be 7680x4320.
Resolutions such as 720p and 1080i were created due to transport / transmission limitations, and I would say they were more "arrived at" than set. My DLP projector is extremely similar to the ones at the cinema. Actually, 2048x1080 isn't the format; as you point out, the aspect ratio varies. The movie studios use 2K, 4K, etc. to refer to the fixed horizontal pixel count of the format, while the aspect ratio is variable, as you can see from all the different ratios used by different movies. You can have 2K at 2048x2048 if you want. If they scanned old film or shot digital movies at 3840x2160, that would also fix it.
"So, the whole reason for going with faux 4K (3840 x 2160, or just 2160p as it should be called) in the first place, was because existing HDMI couldn't quite hit 4096 to do the real thing."
No, that's not the "whole reason" or even part of the reason. The remaining question is uninteresting.
You may not know who Joe Kane is, but this should help: http://www.youtube.com/watch?v=0ZqhA3iIHm4 Perhaps not the whole reason, but likely the main reason and definitely part of the reason. I think the answer to the remaining question might be that they don't have a good reason...
1920 multiplied by 2 is 3840.
3840x2160 is merely Quad-1080p - which at least is sane.
4096x2160 is 17:9 (ish) - I don't see the point in this resolution.
I await the pointless 5040x2160 monitors (21:9, the "new shiny standard" for widescreen monitors).
1920 (being slightly short of 2048) is the old, or maybe existing, faux format, but at least they call it 1080p and not 2K. The point of 4096x2160 - or, if you will, 2048x1080 - is that those are resolutions movie studios actually shoot in, and they refer to them as 2K, 4K, 8K, etc. There is no perfect way to convert from the movie format to the home format. Yes, you could say it is convenient to be able to double/quad our current 1920x1080, but 1920 itself is already slightly less than the real thing. If we used the same resolutions at home, no conversion would be needed and thus no picture or quality loss would occur. I will wait for 8K and hope it is 8192x4320 and not 7680x4320, or I will just call it 4320p.