## Comment Re:Common Sense (Score 1) 535

Capital, old chap!

(Capital P in Pound implies Sterling, since you didn't understand the first time. Nice try at covering up though!)

"conventional modes of democracy could be extinct within two decades"

At present, "conventional democracy" means a vote every 4-5 years (perhaps with mid-term or local elections halfway through), in which your input amounts to a single bit of information (if that!) about who leads for the next 4-5 years, during which politicians tend to drop their campaign promises.

Internet technology allows for finer-tuned democracy, yes, but if anything "election day" should be an annual day on which everybody physically goes to the polls and casts a secret ballot. Because although technology does allow secrecy (not necessary for all votes, but essential for some), the risk of back doors will always be greater than with a simpler, less technological procedure.

I'm in my forties now and want to be able to vote on issues, not parties. I'd also like to be able to vote for individuals with proven leadership qualities without their being beholden to a party. Not that I could have voted for Perot, being European, nor would I want his finger on the button any more than anybody else's; and at least Obama comes across as somewhat statesmanlike, even if his mantra of "Change" never really happened. But you should see the bunch of twits in Europe nowadays (on all sides of the political spectrum).

Almost as if we are forgetting what populism brought in the 1930s.

Hint: There is no amount of radiation that is "healthy" to be exposed to.

Oh dear, you actually do need a refresher course in mathematics.

What do you call an abelian group with an associative, distributive secondary operator and the power to corrupt mortals?

Answer here: http://www.irregularwebcomic.net/470.html

Apples and oranges are age 5 concepts of counting, at which point children aren't necessarily learning even to subtract yet. As a child I lived at #17, so #13 was two doors to the left, and #21 was two doors to the right. They were two doors away from my house, and from each other they were four doors! But my house was no doors from my house, it was (in more formal mathematical terms) O, the Origin, for me.
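The door-counting above is just distance on a number line with my own house as the origin. A throwaway sketch (assuming, as on my street, odd-numbered houses on one side, so one door is a step of 2):

```python
# Odd-numbered houses on one side of the street: each door
# changes the house number by 2, and "my" house is the origin.
def doors_between(a, b):
    return abs(a - b) // 2

print(doors_between(17, 13))  # 2 doors to the left
print(doors_between(17, 21))  # 2 doors to the right
print(doors_between(13, 21))  # 4 doors apart from each other
print(doors_between(17, 17))  # 0: no doors from my own house
```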

Next thing you will be telling me that Quaternions are a purely mathematical construct, with no physical analogue. Oh wait, how about Spacetime, you know, the natural universe we live in?

Now, defining Zero to be the equivalent of the empty set {} and then using the Peano axioms, THAT is a mathematical construct which can help us (mathematicians) to be rigorous (at least until Kurt F.ing Goedel comes along) without a direct physical analogue.

What confused you in your previous post is that the Romans had a perfectly good CONCEPT of zero (nullus) but lacked a notation for it, because they were (in CS terms) overloading their alphabet to do numbers too. Just as hexadecimal notation does; feed face?
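Hexadecimal overloads the letters A-F as the digits 10-15, which is why "feed face" is also a perfectly good number. A quick illustration in Python:

```python
# The letters A-F double as digits 10-15 in base 16,
# so some English words parse as valid numbers.
print(0xFEEDFACE)                       # prints 4277009102
print(int("feedface", 16) == 0xFEEDFACE)  # prints True
```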

The reason that calculus is so common a requirement (not that I did it in my CS diploma, but then I have an M.A. in natural philosophy) is that Euler's formula brings together many of the (non-discrete) mathematical topics. I'm not sure to what degree (ha!) multiple differentiation (let alone integration) is relevant to a CS student, but a sound mathematical grounding is most certainly to be expected, just as biology and chemistry are of medical students, language of law and arts students, and ouija board usage of economists.
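For reference, the formula in question, which ties the exponential function, the trigonometric functions, and the imaginary unit together, with the famous identity as the special case $x = \pi$:

```latex
e^{ix} = \cos x + i \sin x
\qquad\Rightarrow\qquad
e^{i\pi} + 1 = 0
```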

Furthermore, in a liberal (arts/science) degree, if you choose to be a science major of any kind, it would make sense that there is some sort of core curriculum which you are expected to be aware of at least, and where say a medical student might get away with slightly less on the maths front, I'd certainly hope they'd be able to understand that none/zero is one less than one in much the same way as one is one less than two.

Perhaps you are confused between ordinals and cardinals. It makes sense to say "I ate my first apple, then my second apple." It makes significantly less sense to then say "But before that, I ate my zeroth apple". If I have an apple, and you have an orange, then in the vector space of apples and oranges, I have (1, 0) and you have (0, 1). Those look remarkably different to me. However, if we both had 42 apples and 13 oranges, then the difference between our possessions would be NONE.
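The apples-and-oranges vector space above, sketched componentwise in Python:

```python
# Holdings as (apples, oranges) vectors; subtraction is componentwise.
mine  = (1, 0)   # one apple, no oranges
yours = (0, 1)   # no apples, one orange
print(tuple(m - y for m, y in zip(mine, yours)))  # (1, -1): different

both = (42, 13)  # we each have 42 apples and 13 oranges
print(tuple(a - b for a, b in zip(both, both)))   # (0, 0): NONE
```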

There's no ready analog (to zero) in the natural world.

None?

There'll be a Beowulf Cluster of these along soon!

2+2=4 is indeed a theorem of arithmetic, but it does not preclude it from being an axiom or the only member of a theory.

Ah, a little knowledge is a dangerous thing. What are these "+", "2", "=" and "4" things?

Over ZZ3 (integers modulo 3), 2+2=1.
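You can check that claim in one line: addition in Z/3Z is just ordinary addition reduced modulo 3.

```python
# Addition in Z/3Z: reduce every sum modulo 3.
def add_mod3(a, b):
    return (a + b) % 3

print(add_mod3(2, 2))  # prints 1, not 4
```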

When you learned to count (pre-school) you were actually learning what mathematicians call the successor function. Although the concept of zero was hard to grasp, not only in Roman times but even in the early Renaissance, these days the symbol "2" is defined to be the successor of the successor of "0", and "+" is defined by moving an s() from one side to the other until a "0" has been reached on one side, at which point it can be dropped. So "2+2" = s(s(0)) + s(s(0)) = s(s(s(0))) + s(0) = s(s(s(s(0)))) + 0 = "4". IIRC, 0 can be defined as {}, the empty set, and s(x) as x u {x}, or summat like that (not being rigorous, just lazy).
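That shuffling of s() from one side to the other can be acted out in a few lines of Python (a toy encoding, not rigorous, with successors as nested tuples):

```python
# Toy Peano arithmetic: 0 is the base token, every other number
# is a nest of successors, and "+" moves an s() from the right
# operand to the left until the right side hits 0.
def s(x):
    return ("s", x)

def add(a, b):
    while b != 0:          # 2+2 = s(s(0)) + s(s(0))
        a, b = s(a), b[1]  #     = s(s(s(0))) + s(0), etc.
    return a

two  = s(s(0))
four = s(s(s(s(0))))
print(add(two, two) == four)  # prints True: "2+2" reaches "4"
```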

Anyway, a theorem of set theory may turn out to be used as an axiom for arithmetic, and that in turn used as an axiom (or given) for say calculus. But that doesn't make "2+2=4" a theorem at any sensible level, not even a lemma, but rather the definition of the symbols being used.

It turns out that many of the axioms used in mathematics correspond to our natural understanding at an early level, and that in physics somewhat weird axioms can predict actual results, as in relativity and QM. When counting sheep jumping fences, integer arithmetic is enough. When counting cats in boxes, it isn't.

Unfortunately,

Hey! Which side are you on?

eldavojohn writes: *Some two thousand pages of Plato's works have now been uncovered to have a hidden meaning. According to the research, he hid a complex musical code to match his writings inside his works. From the article, 'Dr Kennedy spent five years studying Plato's writing and found that in his best-known work the Republic he placed clusters of words related to music after each twelfth of the text — at one-twelfth, two-twelfths, etc. This regular pattern represented the twelve notes of a Greek musical scale. Some notes were harmonic, others dissonant. At the locations of the harmonic notes he described sounds associated with love or laughter, while the locations of dissonant notes were marked with screeching sounds or war or death. This musical code was key to cracking Plato's entire symbolic system.' Thousands of years later, we continue to learn from Plato (PDF).*

and DOM, don't forget the DOM.

I've been trying to for years, and then you mentioned it, you insensitive clod!

stronghawk writes *"The creator of the Nickel-O-Matic is back at it and has now built a Turing Machine from a Parallax Propeller chip-based controller, motors, a dry-erase marker and a non-infinite supply of shiny 35mm leader film. From his FAQ: 'While thinking about Turing machines I found that no one had ever actually built one, at least not one that looked like Turing's original concept (if someone does know of one, please let me know). There have been a few other physical Turing machines like the Logo of Doom, but none were immediately recognizable as Turing machines. As I am always looking for a new challenge, I set out to build what you see here.'"*

All motherboards have em.

An anonymous reader writes with this quote from 1Up:
*"Trouble is brewing in Rapture. The recently released Sinclair Solutions multiplayer pack for BioShock 2 is facing upset players over the revelation that the content is already on the disc, and the $5 premium is an unlock code. It started when users on the 2K Forums noticed that the content is incredibly small: 24KB on the PC, 103KB on the PlayStation 3, and 108KB on the Xbox 360. 2K Games responded with a post explaining that the decision was made in order to keep the player base intact, without splitting it between the haves and have-nots."*

I scrolled down to see if there were any more relevant posts to reply to, but most of them also boasted about 80+ wpm.

I am by no means a touch typist, but I don't watch my keyboard either. So I correct a lot, and am about half your speed at best (say 50 wpm).

Still probably around 12 cps, but hitting Delete 3 times lowers the average, hehehe.

I still type faster than I can think, whether I am programming, translating, or writing for fun and pleasure. As the GP post said, any more is overkill for anything but data entry or transcribing.

As it happens, I didn't make many mistakes in the previous para, but I can regularly type stuff like: To be oare nto teo be, thatr ais the quzesition.

Thing is, when I'm typing text (using 9 fingers, not the right pinky for some reason, though I do sometimes use my left hand for control, e.g. thumb to C for copy), I am aware of my mistakes and often want to make changes for other reasons anyway. And when programming, I want to type two or three letters and then code-complete.

At the time of posting, 14% say they have written more than a million lines of code. Seriously, I don't think so.

I guess I'll total a million in just another 90 years of coding.

You are forgetting the masses of seriously shitty coders out there who pump out 2000 "lines" per day, 250 days per year, who can get there in just 2 years.
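The back-of-envelope arithmetic, spelled out:

```python
# 2000 "lines" a day, 250 working days a year, for 2 years:
lines_per_day = 2000
days_per_year = 250
years = 2
print(lines_per_day * days_per_year * years)  # prints 1000000
```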

I say 2000 lines because I can "create" that many in 8 hours no problem - it is only 4 lines per minute and I type at roughly 40 wpm when not thinking that much but still being aware of what I'm doing. But agreed, no way can a competent programmer do that every day!

Uncertain fortune is thoroughly mastered by the equity of the calculation. - Blaise Pascal