Operating Systems

Symbian Introduces Open Source Release Plan

volume4 brings news that David Wood of the Symbian Foundation has made a post detailing their plans for a release schedule, with new versions due out every six months. We discussed Nokia's acquisition of Symbian for the purpose of open sourcing the popular mobile OS last year. Quoting: "There's a lot of activity underway, throughout the software development teams for all the different packages that make up the Symbian Platform. These packages are finding their way into platform releases. The plan is that there will be two platform releases each year. ... Symbian^2, which is based on S60 5.1, reaches a functionally complete state at the middle of this year, and should be hardened by the end of the year. This means that the first devices based on Symbian^2 could be reaching the market any time around the end of this year — depending on the integration plans, the level of customisation, and the design choices made by manufacturers. Symbian^3 follows on six months later — reaching a functionally complete state at the end of this year, and should be hardened by the middle of 2010."

Comment Re:No surprise (Score 1)

That's because C deals with how computers actually think.

No. C presents a highly idealised representation of how computers actually think. According to C, there are two types of data storage: disks and memory. One is accessed through fopen() and its kin, the other through malloc() and its kin.

However, on any modern OS, a read() on a file opened with fopen() may well return data that was residing in memory in a file system cache, while a call to malloc() may well hand back a pointer to data that's actually residing on the hard drive in the swap file. Throw L1 and L2 caches into the mix, and we realise that we are writing code against an imaginary, idealised machine. I'd suggest that even by the time we reach 'C', we're in the realm of 'nothing at all to do with how the processor works'.

So why not keep going, and build newer storage models that don't make an arbitrary division between 'memory' and 'disk', and certainly don't force these upon the end-user?

Some of the nicest programs I've used recently, such as Adobe Lightroom, have done away with the concept of a 'Save' button entirely - changes the user makes to data are immediately reflected in the SQLite datastore. The end user, who's not a programmer, doesn't need to maintain a mental model of memory, filesystems etc.


"Netbooks" Move Up In Notebook Rankings 139

Ian Lamont writes "For the first time, a list of popular notebook reviews shows three 'netbooks' in the top 10. The netbooks use Intel's Atom processor. The site's editor says there has never before been more than one netbook in its monthly ratings. The reason for the netbooks' sudden popularity no doubt relates to their price and basic functionality, but there's a catch. Despite calling Atom a 'high-performance' chip, Intel cautions people not to confuse netbooks with notebooks, as netbooks will be unable to take on video editing or other processor-intensive tasks. This leads to the question of how netbooks will be able to handle demanding Web apps — or whether Web apps will have to be slimmed down to accommodate millions of netbook owners."

Submission + - Paul Graham finally releases arc...

alyosha1 writes: Paul Graham has spent many years writing insightful essays about language and software design. He also wrote the first implementation of Yahoo! Stores, and credits his success to the fact that he wrote it all in Lisp.

Now, after several years of anticipation, he has finally released arc, his Lisp-like language, to the world. Designed to be a '100-year language', will this be a breakthrough in language design, or an interesting toy that will quickly fade into obscurity?
