Ok, you see that you're wrong and trying to distract.
Hubris, dude. Hubris.
Everyone is mistaken from time to time; it's what you do when you realise it that characterises you.
Uh, pretty sure he would.
Sorry dude, but you're wrong about this. At some point you'll realise it and be horrified. Don't feel bad, we all grab things by the wrong end sometimes.
If I follow your logic, your hang-up is that "annual" implies 1 year, and raising something to the power of 1 is not raising it at all, and thus non-exponential. But you're wrong. Lemme give you an example: E=mc^2. ARGH: Einstein was wrong! If you measure c in "light years per year", then c becomes 1, and 1^2 is still 1, therefore the formula is misleading! Nonsense, right? Unfortunately, so is your argument.
You're confusing something which is exponential across a number of time periods with the size of each time period when measured in years. Stop getting so outraged that t=1, because it doesn't: t is the number of intervals, and 1 is the size of each interval when measured in years.
Primary school stuff.
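To make the parent's point concrete: with annual compounding the balance is P(1+r)^t, which is exponential in t, the number of intervals, regardless of the fact that each interval happens to be 1 year long. A minimal sketch (the function name and rate are mine, for illustration):

```python
def balance(principal, annual_rate, years):
    # Compound once per year: `years` counts intervals (that's the t),
    # while 1 year is merely the size of each interval.
    return principal * (1 + annual_rate) ** years

# The ratio between consecutive years is the constant (1 + r) --
# exactly what "exponential in t" means.
year9 = balance(100, 0.05, 9)
year10 = balance(100, 0.05, 10)
```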
This is a man standing too close to the forest to see the trees. He's right, but also completely wrong.
What is being taught is "computational thinking", not coding. Coding is just the conduit.
It's a skill that has application far beyond the keyboard. It's not about learning the syntax of a for-loop, it's about the epiphany that follows. Seeing a kid's face when they (all too rarely) get that they've become wizards and the sky is the limit is priceless. They are visibly empowered, and their view of their relationship with the world around them alters.
*That's* what it's about.
Agreed, the verbosity argument is bogus, otherwise the whole debate would consist of COBOL people on one side and APL people on the other. They are at opposite ends of the verbosity spectrum, and both languages largely suck, but for different reasons.
+1 to parent. Strong typing raises compile-time errors that otherwise would end up as pernicious run-time bugs. That means less time debugging and more reliable software. It's the total opposite of the sort of hippie "oh man, just grow the data structure however you feel at the time" approach of Python.
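To illustrate the parent's point in Python itself (function name is mine): a static checker such as mypy flags a badly-typed call before the program ever runs, whereas plain dynamic Python only blows up at run time.

```python
def total_length(words: list[str]) -> int:
    # mypy would reject a call like total_length(42) at check time;
    # without static checking, the bug surfaces only when this line runs.
    return sum(len(w) for w in words)

ok = total_length(["ab", "c"])  # fine: 3

try:
    total_length(42)  # type: ignore  # the "pernicious run-time bug"
    caught_at_runtime = False
except TypeError:
    caught_at_runtime = True
```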
Yeah, and it's also *tomorrow*
After a few back and fourth texts
Mmmm... couple of things here: 1) learn how to spell; it will do wonders for your credibility, especially if you live by text, and 2) learn not to finance things unless you have an ethical imperative for financing companies to profit at your expense. Yes, I know everyone does it, but everyone are stoopid.
Scratch is awesome, and I've worked with many dozens of kids on it.
Doesn't matter much which one, just learn to do some even trivial things in assembler. Then understand *this* is reality, and everything else is an abstraction.
For bonus points, then do it in hex without the benefit of an assembler to translate mnemonics into opcodes and calculate your relative addresses.
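For a taste of what that pencil-and-paper exercise involves, here's a minimal sketch (6502 chosen arbitrarily; the tiny opcode table and addresses are illustrative):

```python
# Hand-assembling two 6502 instructions, as you would by hand in hex.
# Opcodes: LDA immediate = 0xA9, BNE (relative) = 0xD0.
OPCODES = {"LDA#": 0xA9, "BNE": 0xD0}

def branch_offset(branch_addr, target_addr):
    # 6502 relative branches are signed offsets from the address of the
    # *next* instruction (branch_addr + 2) -- the bit you must calculate
    # yourself when working without an assembler.
    off = target_addr - (branch_addr + 2)
    assert -128 <= off <= 127, "branch out of range"
    return off & 0xFF

# LDA #$01 at $0600, then BNE looping back to $0600 at $0602.
code = [OPCODES["LDA#"], 0x01, OPCODES["BNE"], branch_offset(0x0602, 0x0600)]
```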
Passed what? A kidney stone? Another car?
Oh, you mean "died". I get it now.
also, start at 2m20s to watch the cows run away!
And at 4:20 as it lands, the cows run back again. Very suspicious.
Well, based on what I've seen in my time on this plant
You're living on a plant ??
My god, this place is bugged !!
Hey! Illegible, ill-informed, and incorrect. Nice trifecta.
The bigger the theory the better.