The main problem, however, is that all your transactions will have started on a Web 2.0 site written in PHP on some old x86 machine before they are shipped off to the highly reliable platform. So it's actually quite useless in practice to have this capability.
The only companies using mainframes that I know of have been using them since the 1960s. Benchmarks are irrelevant; legacy software is the prevailing driver for keeping the mainframe in house.
Well, how open are the apps? This has nothing to do with open versus closed source (which applies only to the applications) but with the screening process for applications. One of the reasons open source is deemed more secure is that if a bug is found, you will be publicly flogged for doing such a stupid thing. App authors, however, are pretty anonymous (even on a legitimate app store).
The real problem, I think, is the combination: allowing outgoing calls plus internet connectivity is a fishy mix. However, even banking apps require these two privileges, here in The Netherlands at least, so it's really hard for users to validate the necessity of these privileges. I'm not sure the OS can help here, except by giving users the ability to disable a requested privilege for an application (the application wants internet, but it's a single-player Tetris clone: yeah, right). Even better: only allow 'dangerous' privileges (mainly, the ones touching the phone's primary functions) for signed & verified applications.
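To make the fishy combination concrete: on Android, apps declare these privileges up front in their manifest. A hypothetical single-player Tetris clone requesting both would look something like this (package name invented for illustration; the two permission names are real Android permissions):

```xml
<!-- AndroidManifest.xml of a hypothetical single-player Tetris clone -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
          package="com.example.tetris">
    <!-- Why would an offline puzzle game need to phone home AND place calls? -->
    <uses-permission android:name="android.permission.INTERNET" />
    <uses-permission android:name="android.permission.CALL_PHONE" />
</manifest>
```

The user only sees this list at install time, take it or leave it, which is exactly why per-privilege toggles would help.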
I wouldn't know how Visual Studio is these days, but some developers I know say that Eclipse beats VS2010 in most respects. I do know Eclipse and Netbeans fairly well, and I'd say that Eclipse out-of-the-box is nothing compared to Netbeans. With plugins they are comparable, but the GUI builder of Netbeans beats all the ones I've used so far. Also, the Maven integration is much better in Netbeans than in Eclipse. So it depends on what features you want to show your coworkers; I'd say give it a try.
Looking back at my CS university degree, I think the most valuable courses I had were those in computer structures (i.e. what's an IRQ, DMA, etc.), programming language design, CPU design, realtime systems, mathematical logic and a few others. I did do an OO course, and it was immediately clear that OO is just a thin layer on top of an existing language design, so dropping that and teaching the underlying structures would be a good thing (universities should teach abstract concepts; if you want hands-on tutorials, go to some college). As to dropping math: it depends. There's good math (coding theory to explain checksums and ciphers, for example) and less appropriate math (differential equations, while useful, are not really needed to explain concepts related to CS). On the other hand, giving students more abstract concepts to think about is not bad per se, but it should only be added if we have nothing more useful to say. So dropping OO and replacing it with, say, differential equations, would make sense imho.
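As a tiny illustration of the "good math" kind: even the simplest checksum, folding all bytes together with XOR, shows why coding theory is worth teaching. A sketch (my own toy example, not from any course):

```python
def xor_checksum(data: bytes) -> int:
    """Fold all bytes together with XOR.

    Any single-bit error flips the checksum, so it is detected;
    coding theory tells you exactly which error patterns slip
    through (e.g. two errors in the same bit position cancel out).
    """
    c = 0
    for b in data:
        c ^= b
    return c

msg = b"hello"
corrupted = b"hellp"  # one byte changed

print(xor_checksum(msg))                        # checksum of the original
print(xor_checksum(msg) != xor_checksum(corrupted))  # the error is detected
```

Real protocols use CRCs instead of plain XOR for exactly the reason in the docstring: the math predicts, and widens, the class of detectable errors.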
The point being that I will pay extra for a GPU that I will not use. I want less hardware, not more, and I'd like to lose the complexity too. Drop the second graphics chip and the price along with it, instead of including a pricey extra graphics chip, raising the price even more, and then letting me switch it off.
I tried looking for a Sandy Bridge laptop with a 15" screen at 1920x1200 using the built-in graphics, but it seems vendors now treat a power-slurping discrete GPU as a luxury that you must have if you want a decent screen. I don't game, nor do I have any need for CAD/CAM-like applications; I just need a decent resolution/dpi on my laptop, and integrated graphics would make the machine cheaper and less power-hungry, so ideal for developing. Alas, I will probably end up with some Quadro or other high-end GPU just because I want a normal screen.
Well... if he's granted the 2.3b by the judge, it shouldn't be too hard to seize 2.3b in assets from the PRC on US soil. I wonder what the sound of 500 million marching Chinese soldiers makes.
For now yes. The next MacOS release will require signed applications and guess what.... only Steve gets to sign.
We got a talk about Silverlight in 2007 from some MS exec telling us that this would be the next best thing since sliced bread. When I asked some awkward questions about continued multi-platform support, both the MS exec and internal management told me to shut up and said that the 'community should step in' in the Linux case (Moonlight). In 2008 they launched their Silverlight app and not all customers could access it (basically, none could, due to bugs in the app, but after those were fixed, at least the small amount of customers who went through the hassle of installing Silverlight could access it). Some customers were never able to access the application at all (due to Silverlight issues on their platform or the absence of Silverlight altogether). And now MS finally finds out that they cannot deliver anyway, in one of their usual 180-degree turns. Oh, how I'd love to do that meeting again...
Am I getting old, or is everyone's memory that bad?? The gummi bear attack was already shown in 2002: http://www.theregister.co.uk/2002/05/16/gummi_bears_defeat_fingerprint_sensors/