Literally never done before? This person perhaps isn't familiar with other large-scale computerized productions witnessed by millions of people. Space shuttle launches? How about massive light shows at concerts?
Get over yourself.
That aside, I hope it's a good show, gets more folks interested in art and technology, and keeps money flowing into those kinds of works.
North Korea is just waiting for the US to give up on the South so they can walk in and take over.
Wanted to chime in on this NK/SK comment: the US has 28,500 soldiers in Korea. South Korea has 640,000 active personnel and 2,900,000 in reserve. South Korea also has plenty of other allies besides the US, should the US ever decide to go into isolationist mode.
Don't fool yourself into thinking the US presence has anything to do with the stability of the Korean peninsula. I would say the impact of United States Forces Korea is negligible, and in fact it may be contributing to hostilities rather than keeping peace. Stability has a great deal more to do with China and Japan, both of whom have a great interest in keeping the peace in Korea.
I don't know about calculus, but doing formal proofs, yes.
Thank you.
As someone who went through a very "theoretical" CS program at a top 20, I am certain I was forced to spend WAY too much time doing calculus instead of exploring other areas of math. The core tenets of it are very important, but putting everyone through class after class of multi-var blah blah is just a waste of time. Most students didn't get a chance to take analysis or anything that would teach them about WHY the shit works or what the theory or point of it was. We just had to do course after course of symbolic manipulation that none of us would ever use.
More discrete math, more graph theory, more analysis, more formal proofs, more whatever else I never got a chance to explore, less calculus.
Economically, while many schools say that a main reason for legacy preference is to increase donations[6], at an aggregate (school-wide) level the decision to prefer legacies has not been shown to increase donations.
Sounded reasonable though!
are ya sure it hasn't just been retooled to become super_secret_function()
I don't think you've seen the iOS SDK.
I'd guess something more like [NSReallyInternalDeviceIdiomDetector superSecretFunction:host:port:withDelegate:inSection:byAppendingString:context]
For all the things Apple has done right and does well, clinging on to Objective-C is not one of them.
It'd be nice if you pointed out, you know, actual reasons rather than just make snide comments. I'm sure some knee-jerk Apple haters will vote you up though.
My issues with iOS development lie not with Objective-C, but with Apple's frameworks and libraries. It's frustrating to only have header files and not be able to check out what a method actually does when debugging. Fortunately, the documentation for their classes is top-notch. The objc runtime is also a pretty wild ride, but once you know your way around you can poke at it and at least find out where your messages are going. You can check out the source for the runtime here: http://opensource.apple.com/source/objc4/
Another issue, of course, is Xcode. I've switched to writing most of my iOS code in vim, building my code with the xcodebuild command. I still rely on Xcode to do things like add files to the xcodeproj and manage the build configurations. Xcode has a mind of its own: wacky completions, a completely fucked up undo buffer, strange locations for settings, and more frustrating joys. I would love to do away with it.
Check out the cocoa.vim plugin, and, while I'm at it, you can get vim pimped out for your local environment in minutes with Vimlander 2: The quickening. I've been test-driving my apps with Pivotal's Cedar framework.
The thing with ASCII is that it's easy to write on standard keyboards.
Why should the notations which we use to express our programs be limited to 'standard keyboards'?
I'm sure there could be decent schemes for writing alternate symbols with meta-keys and such. Learn a new keyboard layout, it won't kill you. Reminds me of folks refusing to learn a language other than C++/Java/whatever because they are afraid it'll cause them some irreparable mental damage.
For example, I'd love to use standard logic symbols to express statements in my day-to-day coding, why not? Well, because I'm writing C/Ruby. But hey, I'd like to see them available as an alternative, perhaps, not required.
Shooting this down because the keyboard we're all using in 2010 doesn't accommodate it well doesn't seem like the best way forward to me. Seems like the whole Ford 'faster horse' sort of thing. Take a longer view. Think about the possibilities. Maybe there's some cool things this would open up.
I don't think lines of code are taking up so much storage that we'd have any trouble moving to UTF-8, UTF-16, or any other encoding wider than ASCII.
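For what it's worth, several modern languages already accept non-ASCII source. Here's a minimal sketch in Python 3, which reads source files as UTF-8 by default and, per PEP 3131, allows Unicode letters in identifiers (the variable names here are just illustrations):

```python
# Python 3 source is UTF-8 by default, and PEP 3131 permits
# Unicode letters in identifiers, so Greek names just work.
π = 3.141592653589793
r = 2.0
area = π * r ** 2  # area of a circle, written with π directly

print(area)
```

The catch is that logic symbols like ∧ and ∨ are classified as math symbols rather than letters, so they still can't serve as names or operators in Python; languages such as Haskell and Agda, which allow user-defined Unicode operators, are closer to what's being asked for here.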
Have you tried the "Not giving a fuck" method?
Fair, and we could probably all do with a little less stress on ourselves.
Really, the main thing for the OP, I think, is just to make sure you're not trying to juggle remembering all the things you need to do.
For me, documenting what needed to be done freed my mind up from wondering frequently if I was remembering to do things. Saves you from that constant "Oh yeah, I was supposed to ______ today/tomorrow/yesterday" feeling.
That's really the main takeaway for me from GTD, not all the methods and contexts and everything, just lowering mental overhead.
Where there's a will, there's a relative.