Modularization is what Java applications (well, backend servers powering overly complex enterprisey apps) need, and it should be achieved through easy-to-use OSGi tools instead of yet another (Sun|Oracle) screwup mimicking an "oss standard".
I think you're missing one of the main points of Jigsaw - which is modularizing the platform, not the application. This is especially important if Java is to get back into the embedded space, where JavaME and CDC are so antiquated it's just not funny any more. Having a range of well defined platform profiles which span everything from headless embedded devices up to a full enterprise stack (while using the same underlying codebase) would be a major step forward. Personally, I don't care what the implementation details are - the changes aren't going to stop anyone from using OSGi to modularize their applications if they want to.
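To make the platform-profile idea concrete, here's a rough sketch of the kind of module descriptor that platform modularization implies. The syntax follows the `module-info.java` form that eventually shipped as JPMS in Java 9; the application module name is illustrative, not from any actual Jigsaw draft:

```java
// module-info.java -- illustrative sketch only; the application module
// name is hypothetical. The idea: a headless embedded profile only has
// to provide the platform modules an app actually requires, while a
// full enterprise stack ships many more -- same codebase underneath.
module com.example.sensorapp {
    requires java.base;     // core classes; present on every profile
    // requires java.sql;   // only available on a fuller profile
}
```

The payoff is that a runtime image can be assembled from just the modules a given device profile needs, instead of every device carrying the whole platform.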
It's a great idea, but execution is the only thing that matters, and I just don't see them pulling it off. Who is going to teach these kids programming?
The kids can teach themselves. That may seem like a rather flippant comment, but I was impressed that Gove was aware of and referenced Scratch. One of the nice things about Scratch is that it comes with a whole bunch of 'self directed' learning materials which encourage hands on learning. It means you don't need a teacher with a C.S. degree to run an introductory programming course, just one who is sufficiently technically literate to get to grips with Scratch themselves.
The best choice would be if you could incorporate those algorithms into your hardware. Can you add a small DSP to the hardware? That doesn't just protect your code; it may also make your hardware easier to use (fewer software dependencies). On the other hand, that way you won't get any improvement from the community.
I agree with this 100% - to the extent that it's the approach I've decided to take with my own startup. If your 'secret sauce' would benefit from real-time performance or hardware acceleration (FPGA or DSP), then proprietary firmware plus an open source host application stack is a great combination. The open source benefits wouldn't come from other people hacking on the core algorithms anyway - the main justification is to make it as easy as possible for other people to adapt and extend the technology to meet their own needs.
When I read this article I had flashbacks to the spurious crap that people used in ye olde Internet bubble. Or maybe the CDO credit bubble. In short, making arbitrary valuations by looking at second or third order artifacts and completely ignoring the value of the underlying thing.
What makes a good patent is the exact opposite of what these guys suggest. The membership of a patent 'thicket' that they regard as indicating patent quality is really an artifact of the way in which a single potential invention now gets salami-sliced into the maximum number of applications. This allows the corporation which owns the patents to brag about the size of its patent pile, it allows the employees who wrote the patents to maximize the number of patent bonuses they get, and it obviously results in the greatest number of billable hours for the patent lawyers. In short, it's a win-win-win!
In reality, the most valuable patents should be ones that are as unrelated as possible to anything that went before and which stand completely on their own merits. Patents which any expert would look at and say 'I've never seen anything quite like that before'. However, making that judgment call requires that you actually analyze every patent in the portfolio in detail. Just as I'm sure the bankers carried out a detailed analysis of every underlying debt when they were trading CDOs...
Concept-wise Symbian is a great system, but frankly, the SDK is a pain in the ass.
Spot on - I wonder how many developers they lost to Apple/Android just because they couldn't get their act together with the development environment.
I guess they did this already for a low footprint kernel (N900, N850, N770,...)
I wouldn't really class Maemo/Meego as low footprint - more like a full Linux workstation in your pocket. There is a big gap in capabilities between the deeply embedded open source OS platforms like eCos and something like Linux. There are proprietary solutions which fill this gap, but Symbian was probably the most promising open source option - especially if SymbeOSE had taken off.
Nokia does invest time and money in open source. It was Nokia which moved Qt from GPL to LGPL and still invested a lot of effort in developing it further, encouraging others to use this framework.
Historically that has been true, but I'm not confident that's a reliable predictor of the future! On the upside, I've just found the Sourceforge dump of the last EPL Symbian release, so as an open source project it's not quite dead yet...
S40 is not symbian based (see http://en.wikipedia.org/wiki/Nokia_Series_40#Operating_system ).
Thanks for pointing that out - my post was a bit ambiguous. I meant to say 'Migrating to Symbian is still a much better option', which is what a lot of people pre-Elop assumed was the obvious upgrade path.
So either way, using Linux or Symbian, the OS needs to be adapted to the S40 hardware. Also, with something like RT-Linux it should be possible to run the protocol stack on the same CPU.
I think we can safely say that any new hardware will be adapted to the OS, rather than the other way round. I'm also coming round to the idea of Nokia spending a whole bunch of cash on building a commercial quality, low footprint Linux distribution with proper real time support. I could use one of those myself - and thanks to the GPL, the source code will have to remain open this time.
Why not just keep updating/upgrading S40?
Short answer - because Nokia senior management have now completely lost the plot. Symbian is still a much better option at the low end because underneath all the shiny stuff is an RTOS designed specifically to run on resource constrained devices. Proper real time capabilities were baked into the current Symbian kernel specifically so that a single processor could be used for both the protocol stack and the applications. As someone pointed out earlier, other vendors pay good money to use proprietary RTOS platforms like Nucleus for their low end phones because they deliver the same benefits.
Putting a full Linux workstation in your pocket in the form of the N950 is cool - and I wish they'd let me buy one. But this is a different market, and it's not one where using Linux makes a hell of a lot of sense.
A Londoner, when asked by a television reporter: Is rioting the correct way to express your discontent?
"Yes," said the young man. "You wouldn't be talking to me now if we didn't riot, would you?"
The TV reporter from Britain's ITV had no response. So the young man pressed his advantage. "Two months ago we marched to Scotland Yard, more than 2,000 of us, all blacks, and it was peaceful and calm and you know what? Not a word in the press. Last night a bit of rioting and looting and look around you."
I guess they must have been marching on Scotland Yard to protest at the lack of policing in their area and that the police were taking too much of a 'softly softly' approach on gun and gang crime, then. Because the net effect of the riots and the media coverage is going to be an increased police presence, possible increases in police powers and wide support in the general population for the 'robust' use of those powers. I'm sure that's what he and his fellow 'demonstrators' would have wanted.
The ASA acts if someone complains. Maybe nobody complained about Apple.
Maybe nobody complained about Apple because there wasn't much advantage in it, whereas now there just happens to be review of UK IP law on the cards, in order to promote UK technology innovation. And maybe we can all now point to a UK technology company which is being put at a disadvantage by the arse-backwards UK copyright laws. And maybe somebody has been playing the ASA in order to highlight this stupid situation at just the right time. Just a thought...
In every hierarchy the cream rises until it sours. -- Dr. Laurence J. Peter