
Comment: Re:What for? (Score 5, Interesting) 110

by TheRaven64 (#47915739) Attached to: Why Apple Should Open-Source Swift -- But Won't

I maintain the GNUstep / Clang Objective-C stack. Most people who use it now do so in Android applications. A lot of popular apps have a core in Objective-C with the Foundation framework (sometimes they use GNUstep's on Android; more often they use one of the proprietary versions that includes code from libFoundation, GNUstep and Cocotron, but they almost all use clang and the GNUstep Objective-C runtime). Amusingly, there are actually more devices deployed with my Objective-C stack than with Apple's. The advantage for developers is that their core logic is portable everywhere, but the GUIs can be in Objective-C with UIKit on iOS or Java on Android (or, commonly for games, GLES with a tiny bit of platform-specific setup code). I suspect that one of the big reasons the app situation on Windows Phone sucks is that you can't do this with a Windows port.
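
A minimal sketch of that split, with entirely hypothetical names: the portable core is compiled with clang against the GNUstep Objective-C runtime and exposes a plain-C entry point, which a thin shim written against the standard JNI C API forwards to the Java GUI on Android:

    /* jni_bridge.c: thin shim between the Java GUI and the shared core.
     * core_fetch_summary() is a hypothetical C-linkage function
     * implemented in the Objective-C core (built with clang against the
     * GNUstep runtime); everything else here is standard JNI. */
    #include <jni.h>
    #include <stdlib.h>

    extern char *core_fetch_summary(const char *account_id);

    JNIEXPORT jstring JNICALL
    Java_com_example_app_CoreBridge_fetchSummary(JNIEnv *env, jclass cls,
                                                 jstring accountId)
    {
        const char *id = (*env)->GetStringUTFChars(env, accountId, NULL);
        char *summary = core_fetch_summary(id);  /* portable core logic */
        (*env)->ReleaseStringUTFChars(env, accountId, id);
        jstring result = (*env)->NewStringUTF(env, summary);
        free(summary);
        return result;
    }

The iOS build links the same core and calls core_fetch_summary() directly from the UIKit front end; only the shim differs per platform.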

It would be great for these people to have an open source Swift that integrated cleanly with open source Objective-C stacks. Let's not forget that that's exactly what Swift is: a higher-level language designed for dealing with Objective-C libraries (not specifically Apple libraries).

Objective-C is a good language for mid-1990s development. Swift looks like a nice language for early 2000s development. Hopefully someone will come up with a good language for late 2010s development soon...

Comment: Re:If there was only one viable choice ... (Score 1) 116

by TheRaven64 (#47915717) Attached to: Court Rules the "Google" Trademark Isn't Generic

It wasn't just about interface. People tend to forget how search engines did an absolutely horrible job of intelligently ranking the sites you wanted to see.

I find it pretty easy to remember - I go to Google today.

The UI was what made me switch to Google originally and away from it some years later. When I started using Google - and when Google started gaining significant market share - most users were on 56Kb/s or slower modem connections. AltaVista was the market leader, and they'd put so much crap on their front page that it took 30 seconds to load (and then another 20 or so to show the results). Google loaded in 2-3 seconds. AltaVista's results would have had to be a lot better to justify that wait. I switched away when Google made the up and down arrow keys in their search box behave differently to every other text field in the system.

Comment: Re: Government's a crappy investor (Score 2) 52

by TheRaven64 (#47915703) Attached to: Funding Tech For Government, Instead of Tech For Industry
My 'precious electronic toys' use about a tenth of the power that the ones I was using a decade ago for the same purpose did. Even lighting power consumption has dropped. My fridge, freezer and washing machine are the big electricity consumers in my home - efficiency has improved there, but nowhere near as fast as for gadgets.

Comment: Re:Tricky proposition (Score 1) 52

by TheRaven64 (#47915695) Attached to: Funding Tech For Government, Instead of Tech For Industry

There's a lot more to government than military intelligence gathering and law enforcement (although it would be a good idea for someone to remind most current governments that those are two things, not one). And most government projects end up with insane budgets. This isn't limited to the US. It amazes me how often government projects to build a database storing a few million records, serving a few tens to a few thousand queries per second (i.e. a few GB of data at typical record sizes - the kind of workload that you could run with off-the-shelf software on a relatively low-spec server), end up costing millions. Even with someone designing a pretty web-based GUI, people paid to manually enter all of the data from existing paper records, and 10 years of off-site redundancy, I often can't see where the money could have gone. Large companies often manage to do the same sort of thing.

The one thing that the US does well in terms of tech spending is mandate that the big company that wins the project should subcontract a certain percentage to small businesses. A lot of tech startups have got their big breaks from this rule.

Comment: Re:why? (Score 1) 180

by TheRaven64 (#47907301) Attached to: Oculus Rift CEO Says Classrooms of the Future Will Be In VR Goggles
Add to that, about 10-20% of the population get motion sick using the kind of VR in Oculus Rift (myself included - I can use it for 2-5 minutes, depending on the mode). It's ludicrous to imagine building a school that would exclude 20% of the potential pupils on some random criterion. You might as well make schools that didn't let in gingers...

Comment: Re:intel atom systems keep 32 bit systems around (Score 1) 129

by TheRaven64 (#47907067) Attached to: Chrome For Mac Drops 32-bit Build
Apple already ships 64-bit ARM chips and a lot of other vendors are racing to do so. The Android manufacturers that I've spoken to want 64-bit for the same reason that they want 8-core: it's a marketing checkbox, and they don't want to be shipping a 32-bit handset when their competitor is marketing 64-bit as a must-have feature. ART is in the top 10 worst-written pieces of code I've had to deal with and is full of casts from pointers to int32_t (not even a typedef, let alone intptr_t), but it should get a 64-bit port soon.
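
For the curious, this fragment (illustrative only, not ART code) shows why those casts matter: on an LP64 platform a pointer is 64 bits, so forcing it through int32_t silently throws away the top half, while intptr_t round-trips correctly:

    /* Truncating a pointer via int32_t loses the high 32 bits on 64-bit
     * platforms; intptr_t is always wide enough to hold a pointer. */
    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        int value = 42;
        int *p = &value;

        int32_t  bad  = (int32_t)(intptr_t)p; /* truncates on 64-bit */
        intptr_t good = (intptr_t)p;          /* always round-trips  */

        printf("original:  %p\n", (void *)p);
        printf("truncated: %p\n", (void *)(intptr_t)bad);
        return (int *)good != p;  /* exits 0: the intptr_t cast is safe */
    }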

Comment: Re:The ones I witnessed... (Score 1) 129

by TheRaven64 (#47907057) Attached to: Chrome For Mac Drops 32-bit Build
64-bit is here for a while. A lot of modern '64-bit' CPUs only support 40-bit physical addresses, so are limited to 'only' 1TB of RAM. Most support 48-bit virtual addresses (the unused top bits must be copies of bit 47, so all 1s or all 0s depending on whether you've got a kernel or userspace address), limiting you to 'only' 128TB of userspace virtual address space. If RAM sizes continue to double once every year, then it takes another year to use each bit. We currently have some machines with 256GB of RAM, so are using 38 bits; at that rate, 64 bits will last another 26 years. RAM increases have slowed a bit recently though. 10 years ago, you always wanted as much RAM as possible because you were probably swapping whatever you were doing. Now, most computers are happy with 2GB for programs and the rest for buffer cache. As SSDs get faster, there's less need for caching, but there might be more need for address space as people want to be able to memory map all the files that they access...
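
The back-of-the-envelope arithmetic above, in executable form:

    /* Counts the address bits a given RAM size consumes and how many
     * yearly doublings remain before 64 bits run out. */
    #include <stdio.h>

    int main(void)
    {
        unsigned long long ram = 256ULL << 30;  /* 256GB */
        int bits = 0;
        while ((1ULL << bits) < ram)
            bits++;
        printf("256GB needs %d address bits; %d doublings left to 64.\n",
               bits, 64 - bits);  /* prints 38 and 26 */
        return 0;
    }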

Comment: Re:It's not Google's fault. It's Mozilla's. (Score 1) 129

by TheRaven64 (#47906847) Attached to: Chrome For Mac Drops 32-bit Build

The real problem for Firefox is not the interface changes that people like you whine about, it's mobile. Now 30% of traffic is mobile and Firefox doesn't have an app for any Apple mobile devices and is effectively excluded from Android by Google's Microsoft-like illegal anti-competitive licensing deals with manufacturers (you can get the app, but it's not preloaded and only a few geeks ever would).

Huh? It's in the Google Play store and is no harder to install than any other app. Once it's installed, the first time you click on a link from another app you're asked to choose the app that will handle links. I fall into the geek category (and so installed it from F-Droid, not Google Play), but found it trivial to switch to Firefox on the mobile. I mostly did because Chrome has spectacularly bad cookie management and I'd been trying to find a browser that did it better. Early Firefox ports were just as bad, but now it's quite nice, and with the Self-Destructing Cookies add-on it does exactly what I want.

The mobile is actually the only place I use Firefox...

Comment: Re: No, no. Let's not go there. Please. (Score 1) 857

by TheRaven64 (#47902613) Attached to: Why Atheists Need Captain Kirk
I recall reading some years ago that there are two kinds of atheists:
  • Those that disbelieve all religions.
  • Those that disbelieve all except one religion.

For some reason, people in the second category describe themselves as 'religious'. And yet you'll be hard-pressed to find, for example, a Christian who applies the same standard of evidence to the non-existence of the Norse, Egyptian, Greek or Hindu gods that he demands an atheist from the first category provide for the non-existence of the Abrahamic god.

Comment: Re:Shortest version (Score 1) 326

by TheRaven64 (#47847237) Attached to: Stallman Does Slides -- and Brevity -- For TEDx
Talking about open-source businesses is missing the point entirely. Most businesses that are successful as a result of open source (or Free Software, for the RMS-style folks), or that contribute significantly to open source, are not 'open-source businesses' any more than companies that use Windows and Office are 'closed-source businesses'. The difference is that one category of business realises that writing software is expensive and copying software is trivial, so it spends its investment in the software parts of its infrastructure on paying people to write software (typically customising and improving existing projects), whereas the other pays someone for copies of software and hopes that this will give them an incentive to produce software that's more like what they want.

Comment: Re:Amiga (Score 1) 169

by TheRaven64 (#47832899) Attached to: Steve Ballmer Authored the Windows 3.1 Ctrl-Alt-Del Screen

You're comparing apples and oranges as far as the technical details. I'm saying Win 3.x let me continue when it saw problems, and NT could also do that.

Not really. The kind of situations where Windows 3.x let you try to continue, Windows NT just handles transparently. In Windows 3.x, with cooperative multitasking, a single application can refuse to relinquish the CPU. If this happens, you have three choices (outlined by the dialog box):

  • Just wait and see if it eventually recovers.
  • Kill that application and hope that it isn't holding any handles that other processes need to be able to do useful work.
  • Restart the entire computer.

In a system with protected memory and preemptive multitasking, an application that refuses to relinquish the CPU will just have its priority downgraded and the only thing that you'll notice is the CPU getting warm. Eventually, you may choose to kill the program, but it never affects system stability.
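
You can see the difference for yourself with a trivial demonstration (any C compiler on a modern OS will do):

    /* Pegs one core forever. Under preemptive multitasking the scheduler
     * simply deprioritises it and the rest of the system stays responsive;
     * under Windows 3.x-style cooperative multitasking, a loop like this
     * in a GUI app would freeze the whole desktop. */
    int main(void)
    {
        for (;;)
            ;  /* never yields; the kernel preempts it anyway */
    }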

I'd like to have the *option* to continue to save my work even if there was a chance of data corruption. For example, take the common NT blue screen IRQL_NOT_LESS_OR_EQUAL. The fact that my buggy network driver tried to access paged memory in the wrong sequence is miles away from catastrophic. And it certainly doesn't take priority over something I've been working on for hours. IRQ 0 is me, motherfuckers!

It means that there's a high probability that something has damaged some kernel data structures. If you continue, there's a good chance that this corruption will spread to the buffer cache and you'll end up writing invalid data to disk. If you kill the system, the corruption is limited to the RAM.

Comment: Re:"Stuff that matters" (Score 2) 169

by TheRaven64 (#47832833) Attached to: Steve Ballmer Authored the Windows 3.1 Ctrl-Alt-Del Screen
Agreed on Chen's blog, but the summary is horrible. This message hasn't been part of Windows since Windows 95 (which introduced preemptive multitasking to the Windows world, so a single application could no longer freeze the system trivially), so the odds are that if you used Windows in the last two decades you've never seen this notice...

Comment: Re:Silly (Score 5, Insightful) 448

by TheRaven64 (#47826173) Attached to: Could Tech Have Stopped ISIS From Using Our Own Heavy Weapons Against Us?
The idea is to have a timer that would automatically disable the equipment unless it received an enable signal, either from a satellite or from removable media. It's possible to make such a system that is, at the very least, very difficult to tamper with. Many of the systems on tanks and so on are computer controlled, and if the computers stop working then the equipment is a lot less valuable. The goal of such systems is similar to that of crypto: it's not to prevent the enemy from ever using the tanks that they've stolen, it's to prevent them from using them quickly. If you have a few weeks to bomb the stolen equipment before it can be used, and the enemy has to invest a lot of high-tech resources into cracking the systems, then that's probably good enough.
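
A minimal sketch of that dead-man's-timer idea (all names hypothetical, and the hard part - tamper-resistant key storage and signature verification - is assumed): the equipment only operates while it holds an unexpired, cryptographically signed enable token:

    #include <stdbool.h>
    #include <stdint.h>
    #include <time.h>

    struct enable_token {
        uint64_t expires_at;     /* UNIX time at which the lease ends */
        uint8_t  signature[64];  /* covers expires_at; checked below  */
    };

    /* Assumed to verify the signature against a key fused into hardware. */
    extern bool verify_signature(const struct enable_token *tok);

    /* Control systems call this periodically; without a fresh token from
     * satellite or removable media, the lease expires and the equipment
     * shuts itself down. */
    bool equipment_may_operate(const struct enable_token *tok)
    {
        if (tok == NULL || !verify_signature(tok))
            return false;                         /* no valid lease */
        return (uint64_t)time(NULL) < tok->expires_at;
    }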
