MicroUSB was designed for 10,000 cycles, the same as Type-C. You might be thinking of MiniUSB, but even that was for 5000 cycles, not 500. Standard A/B was designed for 1500 cycles.
That's ignoring vernier acuity, a very important effect on displays where the pixels form parallel lines, i.e. pretty much every modern electronic display. It can resolve offsets down to 0.13 arc minutes, which is why several replies point out that your theory doesn't match reality, even for people with worse-than-average vision.
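For anyone who wants to check the arithmetic, here's a minimal sketch of the acuity-to-dpi conversion. The 12-inch viewing distance is my assumption, not something stated in the thread:

```python
import math

def acuity_to_dpi(arc_minutes, viewing_distance_in):
    """Dot density (dpi) at which one dot subtends the given
    angular size at the given viewing distance (in inches)."""
    angle_rad = math.radians(arc_minutes / 60.0)
    dot_size_in = viewing_distance_in * math.tan(angle_rad)
    return 1.0 / dot_size_in

# Vernier acuity of 0.13 arc minutes at an assumed 12-inch viewing
# distance works out to roughly 2200 dpi.
print(round(acuity_to_dpi(0.13, 12.0)))
```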
And this figure of 2190 dpi? That's three significant figures, computed from a value given to only one significant figure (0.4 arc minutes). You can't do that, and it should be a huge red flag that the source article should not be taken seriously.
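To see why the precision matters: any true value from 0.35 to 0.45 arc minutes rounds to "0.4" at one significant figure, and the dpi figure scales inversely with the angle, so that input uncertainty alone swings the result by hundreds of dpi. A quick sketch (the rounding interval is my assumption about what one significant figure implies):

```python
# dpi is inversely proportional to the angular size of one dot, so
# rescale the article's 2190 dpi figure across the interval of values
# that all round to "0.4 arc minutes" at one significant figure.
nominal_dpi = 2190.0
nominal_arcmin = 0.4

low_dpi = nominal_dpi * nominal_arcmin / 0.45   # if the true value were 0.45
high_dpi = nominal_dpi * nominal_arcmin / 0.35  # if the true value were 0.35

print(f"{low_dpi:.0f} to {high_dpi:.0f} dpi")  # roughly 1950 to 2500 dpi
```

So the trailing "190" in 2190 carries no real information; the honest statement would be "about 2000 dpi".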
Firefox supports 60fps if the video is encoded in WebM (VP9), which only happens on YouTube if it has enough views
Google could have added support in the Flash player
I get 60fps on test videos with single-digit views, using RHEL 6.4, Firefox 17.0.7 ESR and Flash 11,2,202,327.
Although the video options only present e.g. 720p rather than 720p60, selecting 720p gives 60fps. Selecting 480p gives 30fps. The same video encoded at 30fps before upload and viewed in 720p shows the difference very clearly. I suspect it's something to do with the old version of Flash.
For reference, I also tested with Windows 7, Firefox 33.0.2, Flash 22.214.171.124, and 720p and only get 30fps.
Some of it is learned through practice, but all of it isn't.
The meaning you intended to convey was probably "not all of it is". Otherwise, the literal meaning contradicts the first part of the sentence. Which came up with that phrasing, your conscious or unconscious mind?
I've noticed that the faster I write, the more likely my writing is to contain homophones. I presume that the faster I write, the more my unconscious mind gets used for the task, and it places more emphasis on sound. Or perhaps there are a sound buffer and a letters/words buffer working in parallel, with the latter usually taking precedence, but at speed it fills too quickly, so writing falls back to the sound buffer.
Hmm, I think you have a few things wrong and/or misleadingly stated.
In the early 1980s Acorn evaluated CPUs for their next-generation product. The 80286 was released in 1982 February and was readily available on the market, so there was no need to get Intel's cooperation to evaluate it. But Acorn did want to license the 80286 core and make changes to it, which Intel rejected. All the evaluated CPUs were deemed inadequate, so in 1983 October Acorn started development of the Acorn RISC Machine.
The goal of the ARM architecture was high performance. (On production release it out-performed the still-current 80286.) The design was simple because of the limited design resources, and therefore low-power, but quite how low-power it turned out to be was an entirely unexpected accident.
Apple officially became part of the ARM project when Acorn spun off ARM Ltd in 1990 November, by which time the 80486 was on the market. Apple's interest was to continue development of low-power CPUs for their Newton handheld, for which the 80x86 line was unsuitable.
How often do you need to drive from Dundee, Scotland to Poole, England?
646 km seems to be about as far as one can drive in the UK --- that's just 400 miles
Dundee to Poole is an 800km drive. Dundee is a less likely endpoint than Aberdeen, another 100km up the road. Thurso to Penzance is a 1300km drive. Yes, the US is a lot bigger than the UK, but don't just make stuff up. Then there's the rest of the EU to consider...
It's also worth noting that ARM has never been about performance
Performance was exactly the reason the ARM architecture was created in the first place. Acorn's engineers determined that the performance of existing and announced architectures (from Intel, Motorola, etc.) was insufficient, so they needed to create a new one. See e.g. http://www.ot1.com/arm/armchap1.html
no protected memory
I keep hearing this, but it's not true: RISC OS did have protected memory. Try writing to another app's memory from user mode, or writing to the VIDC registers from user mode. Some important areas weren't protected, though, e.g. the ARM vector table.
In principle, Computer Science courses are meant to turn out scientists, not engineers. Maybe you'd be better off getting a Software Engineering degree. Have you worked with programmers with Software Engineering degrees? Are they more engineer-like?
MSDOS is not dead, it just smells that way. -- Henry Spencer