Hey! Which side are you on?
and DOM, don't forget the DOM.
I've been trying to for years, and then you mentioned it, you insensitive clod.
All motherboards have 'em.
I scrolled down to see if there were any more relevant posts to reply to, but most of them also boasted about 80+ wpm.
I am by no means a touch typist, but I don't watch my keyboard either. So I correct a lot, and am about half your speed at best (say 50 wpm).
Still probably around 12 cps, but hitting Delete 3 times lowers the average, hehehe.
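By the way, at the usual 5 characters per word, 12 cps only squares with ~50 wpm if the 12 is burst speed. A quick conversion sketch (mine, just for the units):

    # Rough wpm <-> cps conversion, using the usual 5-characters-per-word convention.
    def wpm_to_cps(wpm, chars_per_word=5):
        return wpm * chars_per_word / 60

    print(wpm_to_cps(50))    # ~4.2 cps sustained
    print(wpm_to_cps(144))   # 12.0 cps - burst speed, not the corrected average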
I still type faster than I can think, whether I am programming, translating, or writing for fun and pleasure. As the GP post said, any more is overkill for anything but data entry or transcribing.
As it happens, I didn't make many mistakes in the previous para, but I can regularly type stuff like: To be oare nto teo be, thatr ais the quzesition.
Thing is, when I'm typing text (using 9 fingers, not the right pinky for some reason, although I do sometimes use my left hand for control, thumb to C for copy, for example), I am aware of my mistakes and often want to change things for other reasons anyway. And when programming, I want to type two or three letters and then code-complete.
At the time of posting, 14% say they have written more than a million lines of code. Seriously, I don't think so.
I guess I'll total a million in just another 90 years of coding.
You are forgetting the masses of seriously shitty coders out there who pump out 2000 "lines" per day, 250 days per year - they can get there in just 2 years.
I say 2000 lines because I can "create" that many in 8 hours no problem - that is only about 4 lines per minute, and I type at roughly 40 wpm when not thinking too hard but still being aware of what I'm doing. But agreed, no way can a competent programmer do that every day!
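The arithmetic behind that, for the record:

    # Plain arithmetic for the 2000-lines-per-day claim above.
    lines_per_day = 2000
    days_per_year = 250
    print(lines_per_day * days_per_year)   # 500000 lines/year, so a million in 2 years
    print(lines_per_day / (8 * 60))        # ~4.2 lines per minute over an 8-hour day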
My ThinkPad is dead, you insensitive clod.
British mathematicians (from primary school to professor) place the decimal point not on the baseline but halfway up, at the same level as a minus sign, the gap between the two strokes of an equals sign, or the intersection of the small "x" used as a multiplication sign.
This seriously confused an Italian boy who joined my school at age 14 and eventually became the other person from my year to go to Cambridge.
Unfortunately, despite his intelligence, when he was first tested to see which group he should be in, he mistook the dot for a multiplication symbol. He'd mostly been to American schools, since his father was a diplomat. So he answered such simple questions as "what is 1.2 + 3.4" (which should be 4.6) with 14 (1x2 + 3x4).
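Spelled out, the two readings (with the dots written ASCII-style, since that's all we have here):

    # "1.2 + 3.4" read two ways: decimal point vs. multiplication dot.
    print(1.2 + 3.4)    # decimal-point reading: 4.6
    print(1*2 + 3*4)    # multiplication-dot reading: 2 + 12 = 14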
At least the GP understands that confusion is the issue. He is not 100% correct, but it isn't nonsense either.
Don't forget that it isn't only the written and spoken forms that matter: the limitations of the ASCII/ANSI character set(s) mean that we use "full stops" rather than raised "points", and similarly collapse multiplication into the asterisk.
Interestingly, the x87 FPU has instructions (FBLD/FBSTP) for loading and storing packed BCD values, but internally it computes everything in binary. That lets you combine the accuracy of binary floats with the storage efficiency of BCD. To my knowledge, no one has ever wanted to do that.
That isn't very interesting, since it is the x86 itself (not the maths coprocessor) which has BCD opcodes such as AAA and DAA (see http://en.wikipedia.org/wiki/Intel_BCD_opcodes). The accuracy is the same and the arithmetic is slower, but the important advantage of BCD on old architectures - including even the 4.77 MHz 8088 - is that they had at best 16-bit registers, so the most you could represent unsigned was $655.35 (more than enough for anybody). Allowing a packed byte for the cents, and another for ones and tens of dollars, and so on, may not have been the most computationally efficient approach, but without an '87, it was better than strings!
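To make the packed-byte scheme concrete, here is a rough Python sketch of the idea: two decimal digits per byte, with the post-addition fix-up that DAA performs in hardware (the function and the $12.95 example are mine, purely illustrative):

    # Add two multi-byte packed-BCD numbers, least significant byte first.
    def bcd_add_byte(a, b, carry=0):
        # Add two packed-BCD bytes (two digits each); return (result, carry_out).
        s = a + b + carry
        # Fix the low nibble if it overflowed past 9 (what DAA does via the AF flag).
        if (s & 0x0F) > 9 or ((a & 0x0F) + (b & 0x0F) + carry) > 0x0F:
            s += 0x06
        # Fix the high nibble the same way.
        if (s & 0xF0) > 0x90 or s > 0xFF:
            s += 0x60
        return s & 0xFF, 1 if s > 0xFF else 0

    a = [0x95, 0x12]   # $12.95: cents byte first, then dollars
    b = [0x49, 0x07]   # $ 7.49
    result, carry = [], 0
    for x, y in zip(a, b):
        byte, carry = bcd_add_byte(x, y, carry)
        result.append(byte)
    print([hex(v) for v in result])   # ['0x44', '0x20'], i.e. $20.44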
There is an urban legend that unified Cameroon (French/British) switched driving sides in stages: trucks one month, and cars the next.
I'm pretty sure it was a joke, but my grandfather (who died in '76) did work there at the time it would have happened.
As Roger Needham quipped, Multics was designed for the real-time processing of geological processes.
I'm still waiting for the pun.
To err is human, to moo bovine.