Given the trouble Patrick had squeezing a full DB dump of Wikipedia down to 2 GB (to fit in the App Store), I find it impossible to believe that the 162 MB MDict files I've found so far for Wikipedia contain anywhere near the full text (which Patrick's app does).
Automatic Updates will install it silently, including changing your browser. Running Windows Update manually will prompt you first.
My roommate has a wireless Mighty Mouse, and it never right-clicks properly for me.
Take your finger off the left side. It's touch sensitive: if you click with a finger on the left, it's a left click; if on the right, it's a right click. If both, it treats it as a left click (to prevent confusion for users who don't know the difference).
Click with just one finger on the mouse, and I bet it'll work just fine.
I've had the issue you've described on my iMac; better yet, I discovered how to quickly and easily reproduce it, and was about to start trying to track down the issue to figure out how to stop it. Bad timing (an IT emergency) and then a power failure meant that by the time I had a chance to sit down and figure it out, the batteries in my mouse had died anyway.
I'm hoping that one day I'll stumble across some reliable reproduction method again and be able to figure out what the problem is and stop it.
Or, put another way, Sun released ZFS code under an open-source license, and that should be good enough, but the GPL is too focused on rigid adherence to a strict set of rules, and is thus incompatible with many open-source licenses, including Sun's.
How is it that FreeBSD, for example, got Dtrace support included, but Linux can't? Oh, that's right, it's Sun's fault somehow.
And are fixed storage. You obviously haven't R'ed TFA or you'd see that this is removable storage, like DVDs or Blu-ray.
These are essentially DVDs that store around 450 GB for $45. Even Blu-ray discs run about $0.50/GB.
That's a lot cheaper, and even if they take so long to come out that BD discs are down to $0.05/GB, $0.10/GB for ten times the storage will still definitely be affordable. These could be great backup solutions for homes or servers, depending on the write speed.
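Just to sanity-check the math, a quick sketch using the figures above (the $45 / 450 GB price is the article's claim, the Blu-ray $/GB is my estimate):

```python
# Back-of-the-envelope cost-per-GB comparison.
holographic = 45 / 450          # dollars per GB -> 0.10
bluray = 0.50                   # dollars per GB, rough current price

print(f"holographic: ${holographic:.2f}/GB")
print(f"blu-ray costs {bluray / holographic:.0f}x more per GB")
```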
RMS argues against having anything that is not under your direct control (or cannot be brought under your direct control). I wonder how he computes?
Does he have the source code to his BIOS? And to that of his video card, DSL modem, and cellphone? Does he host his own website, routing his packets using open-source routers that run only Linux?
Sure, all of this is likely possible to some extent, but not entirely. Should we avoid software as a service and do everything ourselves? I want a good issue tracking system. Lighthouse is pretty good. Github's new system is pretty good. All the open-source systems out there are pretty awful. Trac is awful. RT is awful. It's all junk.
I use a Mac. I'd use Linux, but it doesn't do what I want. It's not up to snuff. At my last job, we all used Linux on the desktop (we were essentially a team of sysadmins), and you know what? Not a week went by without someone spending an entire day 'fixing' their broken Fedora machine because some minor Xorg point update had broken something, or because their yum database was corrupt and they couldn't upgrade their system. I had a button in my taskbar that ran 'killall -9 soffice.bin' because OpenOffice kept locking up on my machine (but not on anyone else's).
Open-source is great, and I use it whenever the benefits outweigh the drawbacks, but all I see lately is RMS talking about how everything should be free, but not helping to make good things free or free things good. Until he finally grounds himself in reality, I'm not interested anymore.
The same thing happens in World of Warcraft. A new level-70 raid instance is made available, and all the high-level, best-gear-available guilds on the server start hitting it, doing it as much as they can, trying desperately to be the first to beat it.
Eventually, one group beats it, and then there's a cascade. The second group finishes it, maybe faster. Then the third. Then 60 people. Then 40. Eventually you have three hotshots essentially solo'ing something that used to be nearly incomprehensible.
In a lot of cases, it's just a progression of knowledge and skill. Once you know exactly what needs doing, you can refine it further and further, hone the edge sharper and sharper, until you can make one swift stroke instead of the dozens it once took.
It's kind of amazing really. If you think about it, the scope of Geocities was likely huge, requiring what must have been, for the time, a colossal amount of bandwidth and hardware to handle the traffic being served by all those users.
Now, however, since it's largely static pages with some minor ad munging, you could probably serve the entirety of their content from a single server, largely from memory, without a lot of fuss.
We've come a long way from Geocities' (almost) static pages in 1995 to our current 'request per user per second' dynamically updating AJAX-enabled user-generated socially-networked drop-shadow rounded-corner web-font lifestyle. Time marches ever onward, and while there's something to be said for simplicity, it's hard to fathom a website that doesn't change any time I do something with it.
Funny, I know, but it's not far off: Acrobat only bugs me about updating when I'm about to try doing something else. 'I know you said you wanted to see this PDF, but wouldn't you be happier waiting 10 minutes for a software update instead?'
Acrobat needs some way to download updates in the background, ask a simple yes/no when you launch it, and then actually apply them later, once you're done working.
Then again, most apps need to do things like that.
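The flow I'm describing (download in the background, one yes/no prompt at launch, install on quit) could be sketched roughly like this; `UpdateManager` and its hooks are hypothetical names for illustration, not any real app's API:

```python
import threading

class UpdateManager:
    """Sketch: fetch an update while the user works, defer the install."""

    def __init__(self):
        self.downloaded = None      # update payload, once fetched
        self.apply_on_exit = False  # user said yes at startup

    def download_in_background(self, fetch):
        # Fetch runs on a daemon thread so it never blocks the user.
        def worker():
            self.downloaded = fetch()
        self._thread = threading.Thread(target=worker, daemon=True)
        self._thread.start()

    def prompt_at_startup(self, ask):
        # A single yes/no question; the actual install is deferred.
        if self.downloaded is not None and ask():
            self.apply_on_exit = True

    def on_exit(self, install):
        # Apply only after the user is done working.
        if self.apply_on_exit and self.downloaded is not None:
            install(self.downloaded)
```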
As far as medical radiology goes, a pencil-thin beam would be nice for added precision, but also for dramatically reducing the radiation dose. My local hospital has stopped giving me CT scans because I've had so many in the past (out of necessity) that they don't want to fry me any more than necessary.
Replacing the emitters in a CT scanner, which essentially spray you with radiation and rely on carefully placed sensors to capture the lines of sight they need, with a directed, low-power beam that irradiates only the cells that actually need it would dramatically reduce the dose patients receive.
I didn't overclock my sink and bling it out with tube lighting and a giant plexiglass window just so I could settle for a measly 100 gpm. I want power, and dammit, I'm going to get it, no matter how much those pipes cost!
Part of the efficiency is the arrows, so the data knows which direction it should go. Otherwise it gets confused and just goes round and round in circles. You can save some money by drawing arrows on the cables you already have. I've done it on all the cables in our office building, and the tests don't show it, but it FEELS faster!
How does this reduce the number of certificates required? It might reduce the number of copies of the certificate, but you still need either one certificate per subdomain, or one wildcard certificate per domain.
I'll grant that it makes certificate management simpler, but not significantly so; it really only saves a couple of minutes each year.
I personally don't like the concept of fail2ban because it permanently adds IP addresses to your ban list. Since most of these IPs are dynamic, keeping them banned forever serves no useful purpose. I'd prefer a system that bans an IP temporarily.
fail2ban temporarily bans IPs. It removes them after a configurable time limit.
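For reference, the relevant knobs live in fail2ban's `jail.local`; something roughly like this (the values here are illustrative, not recommendations):

```ini
# jail.local
[DEFAULT]
bantime  = 600    ; seconds an IP stays banned (10 minutes)
findtime = 600    ; window in which failures are counted
maxretry = 5      ; failures within findtime before a ban

[sshd]
enabled = true
```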