Just my simple opinion, but Oracle only sees Java as a cow to be milked, not one to be nourished.
Right, sure, but there's certainly a fat profit in that. To use the COBOL example from the previous poster: IBM has been quietly upgrading COBOL and mainframe technologies for years. Sure, it doesn't have the spotlight that things like HTML5 and iOS (iPad, iPhone) have. One other thing to consider is that Oracle is heavily invested in Java because their own apps use a bunch of Java/J2EE technologies, for example Oracle Fusion and Call Center Anywhere. So Java won't get the fame and glory it once did, but it will remain a significant investment for them. It's a little disappointing to see all the stuff that *won't* be in the newer version of Java (1.7? I can't even keep track anymore), but having just started using Java 1.5, and being fairly impressed by annotations and their implications (who needs Spring? I can use Guice), I certainly hope I can continue to use Mac OS X as a development platform. Because ultimately, providing less-than-adequate support for stuff like this is not a good idea. What next? Deprecate the Apple gcc?
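To give a flavor of why annotations felt like such a step up from XML wiring, here's a toy sketch in plain Java. The @Inject annotation and TinyInjector class below are made up for illustration; Guice's real API is different, but it builds on the same reflection machinery:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Field;

// A hypothetical @Inject-style marker annotation, similar in spirit to
// what Guice provides. Retained at runtime so reflection can see it.
@Retention(RetentionPolicy.RUNTIME)
@interface Inject {}

class Logger {
    String log(String msg) { return "LOG: " + msg; }
}

class Service {
    @Inject Logger logger;  // marked for injection instead of XML wiring
}

public class TinyInjector {
    // Walk the target's fields and instantiate anything marked @Inject.
    static void inject(Object target) throws Exception {
        for (Field f : target.getClass().getDeclaredFields()) {
            if (f.isAnnotationPresent(Inject.class)) {
                f.setAccessible(true);
                f.set(target, f.getType().getDeclaredConstructor().newInstance());
            }
        }
    }

    public static void main(String[] args) throws Exception {
        Service s = new Service();
        inject(s);
        System.out.println(s.logger.log("wired without XML"));  // prints "LOG: wired without XML"
    }
}
```

Guice does essentially this, just with proper scoping, constructor bindings, and real error reporting, which is why a few annotations can replace a pile of XML configuration.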
Mac has a LOT of catching up to do before their package management is as nice as that of Ubuntu et al. Granted, it's better than the one on Windows, but that's not saying much.
Actually, for OS X, there is MacPorts. Personally, I like things like apt-get, but since that steers you towards downloading binaries, while MacPorts compiles from source, you get an application built exactly for your system. Anyway, the main point is that while Linux distros have pushed the envelope on making it easier to install software (I believe Red Hat was first, with the RPM standard, and others like SuSE followed), I'd say it's just as easy on the Mac. But golly, with this "deprecated" business, I'm just as cautious as everybody else here. At the very least, Apple should *communicate* things of this nature, so you don't have a bunch of developers left guessing.
This is because they don't want people developing Android apps on OS X
No worries, this just means Apple's work is done: they've contributed everything back to the Sun/Oracle JVM, and we will all happily run the Oracle JVM when it comes out.
One thing you are seeing now is the proliferation of versions of Android out there. In other words, maybe Google is making the same mistakes Microsoft did with "DLL hell", only much worse. This would seem to me to make it difficult for the third party vendors out there. With Apple, I have to test for iOS 4.1, 4.0, 3.2, 3.1, maybe iOS 2.0. With Android, throw in more versions, and more hardware, and you've got some additional complexity.
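For what it's worth, one standard mitigation for this kind of fragmentation is defensive coding: probe at runtime for APIs that only exist on newer versions, and fall back otherwise. A minimal sketch of that pattern in plain Java (using String methods as stand-ins for version-dependent platform APIs; the class and method names here are invented for illustration):

```java
import java.lang.reflect.Method;

public class ApiProbe {
    // Try to call a zero-argument method by name; report a fallback when
    // the running platform doesn't have it -- the crux of version skew.
    static String callOrFallback(Object target, String methodName) {
        try {
            Method m = target.getClass().getMethod(methodName);
            return "available: " + m.invoke(target);
        } catch (NoSuchMethodException e) {
            return "missing: using fallback";
        } catch (Exception e) {
            return "error: " + e;
        }
    }

    public static void main(String[] args) {
        System.out.println(callOrFallback("hello", "length"));      // prints "available: 5"
        System.out.println(callOrFallback("hello", "frobnicate"));  // prints "missing: using fallback"
    }
}
```

It works, but every such probe is one more code path to test on every OS version and device, which is exactly the extra complexity being described above.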
Another interesting advantage of iOS is that Apple doesn't have to convince Linus, as Google does, to make a change to Linux to support devices vastly different from the typical hardware Linux runs on, from big iron like IBM mainframes, to powerful Unix servers, to laptops. There was a fascinating thread on the Linux kernel mailing list a while back about Linux power management, all about sleep mode, etc. Fascinating in that it gives insight into the tremendous amount of thought that goes into what might seem a trivial problem; once you realize how such a change might impact other systems, it isn't so simple. Is this due to Linux's monolithic kernel, vs. the Mach kernel used in iOS? Just a thought. Anyway, it occurred to me reading that thread that Apple has a significant advantage in not having to convince a third party to make a change like this.
The course material is very good too, such as the lecture presentations and the assignments. In the '09 semester I believe they did a Twitter-style app; in the Spring 2010 semester (which you can also download from iTunes) they do a Flickr app. They bring in various speakers, including Apple employees working on the various supporting libraries.
The only minor quibble I would have is that the Xcode app has changed from the version used in the course, so sometimes you can't follow the instructions exactly.
Given the OP's background in C and hardware, I'd agree with the folks on this thread who suggest going the iOS/Android/Flash route vs. using a web application. With that background, it would seem easier to use a GUI framework like Cocoa than to figure out a web framework. And this gets to the most important point: what are his requirements? The school uses iPads, so he probably has some ideas for some type of application that would benefit the school. This is probably the best way to learn: scratch an itch and work from the top down, rather than learning every nuance out there, since these frameworks and platforms can be huge and complex.
And even if you do use something like Qt for your app, not a lot of people have the time, money or resources to debug the app across multiple OS's, and a jillion or so phone models, all with slightly different versions of these OS's, with different screens, buttons, and capabilities.
Of course, and that's the whole reason to use something like Qt, or Java (Android), or Adobe Flash/Flex, SWT (eclipse.org), etc.
The whole philosophy behind Qt is that you let Trolltech, or Nokia now I guess, handle all the fun stuff of getting it to work across multiple platforms. Sure, there are bugs, just as there would be if you were using Adobe Flash or any other x-platform kit; same thing with Google's version of Java (Android). Right now I'm using BOUML, a UML modeling tool built with Qt. Runs on Linux (I think the author does most of his work in that environment), but also Windows.
He has not acquired a fortune; the fortune has acquired him. -- Bion