Comment Re:I guess they realised... (Score 1) 105

The Wayland API is nothing like X11 except in the broad concepts that all display / input APIs share - connecting to a display, listening for input events, creating a drawing area and so on. It's just that Wayland's equivalents map onto modern concepts such as GPU surfaces and compositing, so that rendering is as efficient as possible.
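
For a flavour of the shape of the API, here's a minimal client sketch in C - just the connect/registry boilerplate from wayland-client, with error handling and the event loop trimmed:

    #include <stdio.h>
    #include <wayland-client.h>

    int main(void) {
        /* Connect to the compositor socket named by WAYLAND_DISPLAY. */
        struct wl_display *display = wl_display_connect(NULL);
        if (!display) {
            fprintf(stderr, "cannot connect to Wayland display\n");
            return 1;
        }

        /* The registry is how a client discovers globals such as
           wl_compositor (to create surfaces) and wl_seat (for input). */
        struct wl_registry *registry = wl_display_get_registry(display);
        /* ... add a wl_registry_listener, bind wl_compositor,
           create a wl_surface and attach GPU buffers to it ... */

        wl_registry_destroy(registry);
        wl_display_disconnect(display);
        return 0;
    }

Everything past that point is buffers and events - there is no server-side drawing API at all.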

As for the primitives, nobody has ever said they slow modern clients down. The point is that clients don't even use the primitives any more, and the same story applies to most of the rest of X11. It has a 1980s, 2D-centric, damage-based view of the desktop, and extensions are used to fool it into supporting surfaces and composition. But those extensions are workarounds whose design is compromised by the underlying architecture, so they are slower and less efficient than they could be.

Hence the push for Wayland. It will ultimately lead to a more lightweight and responsive desktop. Fedora 23 will be released soon and GNOME 3 should be feature complete for Wayland, and Fedora 24 might flip the switch and make Wayland the default. None of this stops people using X11 if they want, so I don't see the problem.

Comment Re:I guess they realised... (Score 1) 105

X11 isn't perfect. Nobody's ever argued that. It's just that nobody's really asking for a replacement, and if they were, they wouldn't be asking for Wayland. X11 is an extraordinary piece of technology; it takes some gall to claim everyone should just throw it out and replace it with a ground-up rewrite that adds no new features and doesn't support the major features X11 is famous and loved for.

I think you'll find lots of people are asking for a replacement, starting with the people most familiar with X11. And, more generally, anybody who wants Linux to be able to host a modern, responsive desktop experience without suffering, for no reason whatsoever, the latency and other bottlenecks of an arcane and mostly obsolete architecture.

Comment Re:I guess they realised... (Score 1) 105

Exactly. X11 was designed for a different world built around clipping, damage, rendering primitives, immediate-mode rendering and so on. It has no concept of compositing or GPU surfaces, and thus a raft of extensions has appeared to support those things. But of course those extensions are still hampered and constrained by a brain-damaged pipeline which still thinks in the old way.

Comment Re:VR is going to land with a thud (Score 1) 174

I play IL-2 Sturmovik: Cliffs of Dover, which has some amazing cockpit models, and I can't help but think what it would be like if I were actually looking around with my head instead of with some stupid hat control. But sims are a niche. Not everyone likes flying planes around (or trains, trucks, tractors etc.). And even among those who do, only a fraction stump up for a stick, let alone a peripheral costing $350 and requiring high-end hardware.

So I don't see that sims would save the tech. Nor do I expect Oculus would be happy if it launches with great fanfare and ends up being mostly used by someone driving a tractor on a virtual farm.

Comment VR is going to land with a thud (Score 4, Insightful) 174

There is no denying VR sounds cool. In some cases it might actually be cool - I'm thinking particularly of racing / flight / space sims where you sit in a cockpit and the range of movement in the game roughly corresponds to real life: you sit in the game, you sit in real life; you have buttons and controls in the game, you have buttons and controls in real life.

But for other kinds of game I really don't see the benefit. Yes, it could be used for first-person shooters (for example), but then the game has to somehow reconcile a character running, spinning, jumping, aiming, shooting, standing, crouching and throwing stuff with a player who, in real life, is sat on a couch. It's likely to be extremely disorientating and puke-inducing.

And aside from FPSs, what can we expect? Probably some lame jump-scare horror games. Probably some tabletop-style games. But nothing that particularly justifies the experience. I bet most games will work as well, if not better, in 2D.

The strange part is that there are at least three major efforts to do VR, plus a number of smaller ones, and they'll end up cannibalizing what market there is. It's going to be a bloodbath.

Comment Re:700 ms latency, though... (Score 1) 58

And the latency is because the satellites are in geostationary orbit, a horribly long way away - signals have to travel some 35,000 km out to the satellite and back again to reach a base station.

The alternative is lots of satellites in low Earth orbit, with one coming into range as another leaves and some kind of data relay mechanism for sending data to a base station. It's a more complex solution, but latencies would be much lower and it would probably scale better. The same satellites could even be used to service parts of Africa and South America.
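
The physics is easy to sanity-check. A quick back-of-the-envelope sketch in C (the GEO altitude is the standard published figure; the LEO altitude is a guess at a typical proposed constellation):

    #include <stdio.h>

    int main(void) {
        const double c_km_s = 299792.458; /* speed of light, km/s */
        const double geo_km = 35786.0;    /* geostationary altitude */
        const double leo_km = 1200.0;     /* hypothetical LEO altitude */

        /* A request/response crosses the link four times:
           user -> satellite -> base station, then back again. */
        printf("GEO round trip: %.0f ms\n", 4.0 * geo_km / c_km_s * 1000.0);
        printf("LEO round trip: %.0f ms\n", 4.0 * leo_km / c_km_s * 1000.0);
        return 0;
    }

That comes out at roughly 477 ms for GEO before any processing or queuing - which is why the observed figure lands around 700 ms - versus about 16 ms of propagation for LEO.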

Comment Of course it increases security (Score 1) 317

Chip and PIN devices are more secure than magnetic stripes in a number of ways.

Buying something with a magstripe normally involves swiping the card in a reader and scrawling a signature onto a screen. Theoretically the cashier might ask for ID or compare the signature to the card, but they rarely do. And the cashier might even be in cahoots with the thief, knowing the card is stolen, and not do any check at all. On top of that, the merchant might store transaction details insecurely, or their software may be hacked. And in some scenarios, such as bars and restaurants, the card might be taken out of the sight of the customer, which increases the risk of it being skimmed. All of these are major vulnerabilities that thieves have been known to exploit.

A chip and PIN reader means that the card holder must authenticate themselves before proceeding. That stops someone from picking up a card, or cloning one, and being able to use it without the PIN. And authentication is to the payment processor, not to the store or cashier, so it's not possible to bypass the check. It also means the store never captures the full card details (they only get partial info and a payment authorization code), so hacking the store does not put the details at risk. And chip and PIN devices are portable, so payments in bars and restaurants can be made in the presence of the customer, making the card less likely to be skimmed.

So yes, it closes some very obvious security holes. Is it perfect? Of course not, but it's a hell of a lot better than a magnetic stripe. It's a damned shame that it's taken the US so long to even switch to chip and PIN. The next step would be to get rid of the magnetic stripe altogether, but I expect we can look forward to years of lobbying by ATM operators and banks about how this couldn't possibly be done.

Comment Should have done this from the get-go (Score 1) 127

BlackBerry should have gone with Android from the get-go. They could have produced a security-hardened version of Android where personal apps and business apps resided in separate personas, protected from each other. That would have been a strong and compelling option for companies that wanted BYOD without the risk. They could have thrown in their own software stack and front end too, and they would have made a lot of money.

Now they're belatedly bringing out devices that run Android, but it's basically just Samsung and KNOX under the covers. It's probably too little, too late. I wouldn't be surprised if Samsung buys BlackBerry outright and uses the brand to sell a bunch of security-hardened phones.

Comment Re:Can anyone explain in actual meaningful terms? (Score 1) 143

I wish Google would support a universal ABI in their NDK using bitcode. Apps that ship native shared libs (i.e. most games) are faced with either bundling the libs compiled for every architecture into a single package (bloat), or producing a separate package per architecture (hassle). And of course if a new architecture comes along, it means repackaging the app yet again.
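
To make the bloat concrete, this is roughly what the native lib layout inside a "fat" APK looks like today - one copy of every shared library per supported ABI (the library name here is made up for illustration):

    lib/armeabi-v7a/libgame.so
    lib/arm64-v8a/libgame.so
    lib/x86/libgame.so
    lib/x86_64/libgame.so
    lib/mips/libgame.so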

It'd be much nicer to bundle up a single shared library that was compatible with any architecture. It could be translated into a native lib when the app is downloaded or installed on the device, or it could simply execute via a JIT.

The weird thing is that Android already contains an LLVM compiler for RenderScript, but not for the NDK.

"Falling in love makes smoking pot all day look like the ultimate in restraint." -- Dave Sim, author of Cerebrus.