Comment Re:Hrrmm... (Score 1) 857
All signatures on legal documents should be signed in cursive.
Says who?
I don't use cursive. Therefore, any signature on any document that is in cursive is not my signature.
"Men, on the other hand, rarely use anything but a map. If I changed a street sign outside my apartment, my male friends probably wouldn't be able to find the place anymore."
Maybe I'm an exception, but I don't think that's true at all. I navigate entirely by landmarks. I don't even know the names of half the streets I travel on regularly. Furthermore, my mental map of the city is framed by our light rail system, major bus lines, and bike thoroughfares, not by the major roads carrying automobile traffic.
Or MAYBE you're just not a REAL MAN!
I jest, of course. Exceptions, rules and all that.
I'm willing to believe it's possible, with a caveat. In many artistic disciplines, the master may die without imparting all his knowledge to a student. When the student becomes the new master, he too later dies without passing on everything he knows. Thus, the knowledge base eventually dwindles.
This is one theory of knowledge transmission, and it deserves to be taken seriously; however, our current technological progress sits at the head of a four-thousand-year-long counterexample. Many students learn things that their masters never knew, and the overall state of the art advances. So while I think it's possible that Stradivarius knew more about violin-making than his students, it also seems very unlikely to me that we've never recovered his knowledge.
If the difference is in materials (as is usually claimed), well, that's certainly more plausible.
there appear to be no characterizing differences between the perceived sound from well-made orthodox instruments of any age when played by a skilled player
That's because they used the wrong speaker cables and missed out on the warm sound only pure gold provides.
Every prime number is a natural number, and every natural number is a positive (or non-negative, depending on which definition you choose) integer. "Positive prime" is redundant.
The "positive" part is not the redundant part... it is the "nonzero" part that is. You have started with "every prime number is a natural number", which is a false premise... you can't rely on wikipedia for everything.
More precisely, that definition taken from Wikipedia is closer to the one for an irreducible, not a prime.
A nonzero, non-unit element p in a ring is prime if whenever p divides a product "ab", p must divide one of the factors "a" or "b". A nonzero, non-unit element p is irreducible if whenever you write p = st, either s or t must be a unit (in the case of the integers, 1 or -1).
It just so happens that in the case of the integers, the concepts of prime and irreducible turn out to be equivalent, which results in endless confusion. This means that the "definition" of primes that people usually give is more correctly a "theorem". Anyhow, in the ring of integers, we have both positive *and* negative primes (e.g., 2 and -2 are both primes). In common speech, though, we restrict ourselves to natural numbers (as the Wikipedia article appears to do, sacrificing mathematical correctness for common usage).
So, as I said to start with, the "positive" part isn't redundant; it's just being more precise than people normally bother to be. However, yes, the "nonzero" part is redundant.
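To make the distinction concrete, here is a minimal brute-force sketch in Python (the helper name is_ring_prime and the small search bound are my own choices, not from the thread): it tests the ring-theoretic prime property over a finite window of integers, and shows that -2 passes just as 2 does.

```python
# Sketch: brute-force check of the prime property in the ring of integers.
# An element p is prime if it is nonzero, not a unit, and whenever p
# divides a product a*b, it divides a or b. Only a finite window of
# (a, b) pairs is checked, so this is an illustration, not a proof.

def is_ring_prime(p, bound=50):
    if p == 0 or p in (1, -1):   # primes must be nonzero non-units
        return False
    for a in range(-bound, bound + 1):
        for b in range(-bound, bound + 1):
            if (a * b) % p == 0 and a % p != 0 and b % p != 0:
                return False     # p divides the product but neither factor
    return True

print(is_ring_prime(2))    # True
print(is_ring_prime(-2))   # True: negative primes exist in the integers
print(is_ring_prime(6))    # False: 6 divides 2*3 but neither 2 nor 3
```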
Any IT dept that likes to keep working will just give him/her an identical unit, use FOG (or what have you) to clone the HD to the new unit, and upgrade (or, my recommendation, back up and reinstall) the original.
You're taking it to the extreme, ignoring all real-world factors.
Yes, as a thought experiment, the human brain, because it can (however inefficiently) perform basic arithmetic and logic operations, could emulate arbitrarily complex hardware running arbitrarily complex software, provided that all information about the hardware were available, correct, and humanly understandable; that the software were available in human-readable form; and that there were unlimited means to manually store data, unlimited time to compute, perfect computational accuracy, and an unlimited ability to focus on said task.
But the converse is true as well, I'm afraid. Not only can humans """THEORETICALLY""" emulate computers, but computers can, right now, in real life, accurately emulate a portion of a mammalian brain.
By the way, the idea you are describing is called the Church-Turing thesis. Sure, any computational device can theoretically emulate any other, but it's theoretical for a reason: not all external factors can be reproduced. If a computer does calculations based on yesterday's weather and I don't have yesterday's weather data, then those computations cannot be emulated.
From the Wikipedia article on Emulation: "In a theoretical sense, the Church-Turing thesis implies that any operating environment can be emulated within any other. However, in practice, it can be quite difficult, particularly when the exact behavior of the system to be emulated is not documented and has to be deduced through reverse engineering. It also says nothing about timing constraints; if the emulator does not perform as quickly as the original hardware, the emulated software may run much more slowly than it would have on the original hardware, possibly triggering time interrupts to alter performance."
http://en.wikipedia.org/wiki/Emulator
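Purely as a toy illustration of that theoretical point, here is a minimal sketch in Python (the machine encoding, the helper name run_tm, and the step cap are all my own assumptions, not from the article): one program stepping through the transition rules of a simple Turing machine, which is the sense in which the thesis lets any environment be emulated within any other. The max_steps cap is a nod to the timing caveat quoted above.

```python
# Sketch: a tiny Turing-machine interpreter. One computational system
# (Python) steps through the transition rules of another (the machine).
# The encoding below is my own invention; "_" stands for the blank symbol.

def run_tm(rules, tape, state="start", pos=0, max_steps=1000):
    # rules maps (state, symbol) -> (new_state, new_symbol, move)
    cells = dict(enumerate(tape))
    for _ in range(max_steps):   # cap steps: emulation guarantees nothing about speed
        if state == "halt":
            break
        symbol = cells.get(pos, "_")
        state, cells[pos], move = rules[(state, symbol)]
        pos += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# A machine that flips every bit until it reaches a blank, then halts.
flip = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}
print(run_tm(flip, "1011"))   # prints "0100_"
```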