"We wouldn't have swallowed this at NeXT", said Ross perorally.
And if Slashdot became (e.g.) 90% articles about Justin Bieber and other teen heart-throbs, would you respond similarly to people expressing their discontent?
Not unless they also switch to Beta. I have my priorities straight.
Treating them as their own "special" category of evil people only strengthens their cause.
People not from Australia may not realise that we already do this with criminal gangs that also happen to like motorcycles.
I'm glad I'm not the only one who noticed just how unlike the real Caliphate this is.
Incidentally, QNX has an interesting design in this respect, in that it maps the source buffer (a page at a time, IIRC) into the address space of the receiving process, and does the copy directly. Or it might map the destination buffer into the address space of the sending process; not sure about that. This allows messages to be arbitrary-length.
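Not the actual QNX mechanism (which happens inside the kernel's message-passing path), but a rough user-space analogy: the sender's buffer is backed by pages the receiver can attach to directly, so the payload crosses "address spaces" with a single copy, and the message length isn't bounded by any fixed mailbox size.

```python
from multiprocessing import shared_memory

# "Sender" side: put the message in pages that another process could
# attach to -- the analogue of the kernel mapping the source buffer
# into the receiver's address space.
msg = b"arbitrary-length message"
src = shared_memory.SharedMemory(create=True, size=len(msg))
src.buf[:len(msg)] = msg

# "Receiver" side: attach to the same pages by name, then do the one
# direct copy out of them. No intermediate kernel-buffer copy.
dst = shared_memory.SharedMemory(name=src.name)
received = bytes(dst.buf[:len(msg)])

dst.close()
src.close()
src.unlink()
```

Here both ends are in one process for brevity; with two real processes the attach-by-name step is what stands in for the kernel's temporary mapping.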
Monoliths, e.g. Linux, don't have IPC latency [...]
In the case of non-memory-mapped I/O (e.g. anything that goes through a file descriptor), the major cost isn't context switching, it's copying data. Microkernels have to transfer all this data between address spaces. Macrokernels, on the other hand, have to transfer it between user space and kernel space. It's basically the same operation with basically the same cost, but it falls under a different heading.
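A rough user-space illustration of the two headings, using plain file I/O: `read()` copies data from the kernel's page cache into a user buffer (the monolithic-kernel copy described above), while `mmap` maps the same page-cache pages into the process's address space, so the I/O itself involves no copy.

```python
import mmap
import os
import tempfile

# Put some data in a file so we can fetch it back two ways.
fd, path = tempfile.mkstemp()
os.write(fd, b"x" * 4096)
os.lseek(fd, 0, os.SEEK_SET)

# Path 1: read() copies from kernel space into this user-space
# buffer -- the per-byte cost the comment is talking about.
copied = os.read(fd, 4096)

# Path 2: mmap maps the page-cache pages directly into our address
# space; nothing is copied until we actually touch the data.
m = mmap.mmap(fd, 4096)
mapped = m[:4096]  # this slice copies, but the I/O path did not

m.close()
os.close(fd)
os.unlink(path)
```

In a microkernel the same payload would instead be copied (or mapped) between two user-space processes, which is why the totals come out so similar.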
How is the Objective-C compiled and run on Dalvik? Are you doing: Objective-C -> LLVM -> Dalvik bytecode?
It isn't. It runs natively via the NDK.
I maintain the GNUstep / Clang Objective-C stack. Most people who use it now do so in Android applications. A lot of popular apps have a core in Objective-C with the Foundation framework (sometimes they use GNUstep's on Android, more often they'll use one of the proprietary versions that includes code from libFoundation, GNUstep and Cocotron, but they almost all use clang and the GNUstep Objective-C runtime). Amusingly, there are actually more devices deployed with my Objective-C stack than Apple's. The advantage for developers is that their core logic is portable everywhere, but the GUIs can be in Objective-C with UIKit on iOS or Java on Android (or, commonly for games, GLES with a little tiny bit of platform-specific setup code). I suspect that one of the big reasons why the app situation on Windows Phone sucks is that you can't do this with a Windows port.
It would be great for these people to have an open source Swift that integrated cleanly with open source Objective-C stacks. Let's not forget that that's exactly what Swift is: a higher-level language designed for dealing with Objective-C libraries (not specifically Apple libraries).
Objective-C is a good language for mid-1990s development. Swift looks like a nice language for early 2000s development. Hopefully someone will come up with a good language for late 2010s development soon...
It wasn't just about the interface. People tend to forget what an absolutely horrible job search engines did of intelligently ranking the sites you wanted to see.
I find it pretty easy to remember - I go to Google today.
The UI was what made me switch both to Google originally and from it some years later. When I started using Google - and when Google started gaining significant market share - most users were on 56Kb/s or slower modem connections. AltaVista was the market leader and they'd put so much crap in their front page that it took 30 seconds to load (and then another 20 or so to show the results). Google loaded in 2-3 seconds. The AltaVista search results had to be a lot better to be faster. I switched away when they made the up and down arrow keys in their search box behave differently to every other text field in the system.
There's a lot more to government than military intelligence gathering and law enforcement (although it would be a good idea for someone to remind most current governments that those are two things, not one). And most government projects end up spending insane budgets. This isn't limited to the US. It amazes me how often government projects to build databases that store a few million records and handle a few tens to a few thousand queries per second (i.e. the kind of workload that you could run with off-the-shelf software on a relatively low-spec server) end up costing millions. Even with someone designing a pretty web-based GUI, people paid to manually enter all of the data from existing paper records, and 10 years of off-site redundancy, I often can't see where the money could have gone. Large companies often manage to do the same sort of thing.
The one thing that the US does well in terms of tech spending is mandate that the big company that wins the project should subcontract a certain percentage to small businesses. A lot of tech startups have got their big breaks from this rule.