The DOI link is more than just a shortener. It's supposed to guarantee a permanently available link, while your second link might not. DOIs are commonly included in references now instead of normal http links.
There are some bad assumptions in here... The $10 a month plan (for a single app) is only good for one year, and you can only get it if you own a previous CS6 product. (Plus you have to sign up for a full year to get it.) So the price will jump to $20 a month after the first year... maybe. That's assuming $20 a month is what they'll be charging in year two. And you can only "easily drop it" assuming you don't have to sign up for a multi-year contract to get that $20-a-month offer. You have no idea what Adobe will charge in the future or what the conditions of a contract will be. So if you're lucky, you'll get 33 months out of $540 on Illustrator.
I concede the pricing is not $10/mo. The article's one sentence on it was very misleading. This changes the attractiveness considerably. 2 years and 3 months of subscription (at $20/mo) in exchange for the current retail price is the appropriate comparison, and that just doesn't sound very good at all.
Even if you kept up with every release, and each release came every 2 years, you'd pay $600+$250*3 = $1350 for 8 years at full Adobe pricing. A subscription costs $1920. So the single app pricing, at least for this product, looks pretty poor (not as poor as the $50/mo the summary implies, of course).
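For anyone checking the math, here's a quick sketch of that comparison (using the figures above: a $600 initial purchase, $250 upgrades every two years, and $20/mo subscription pricing; all of these are the thread's numbers, not Adobe's official quote):

```python
# Rough 8-year cost comparison: buying boxed upgrades vs. subscribing.
# Figures from the thread: $600 initial box, $250 per upgrade every
# 2 years (so 3 upgrades over 8 years), $20/mo single-app subscription.
years = 8
boxed = 600 + 250 * (years // 2 - 1)   # initial buy + 3 upgrades
subscription = 20 * 12 * years          # $20/mo for 8 years

print(boxed)         # 1350
print(subscription)  # 1920
```

So even a buyer who upgrades on every release cycle comes out ahead of the subscription by a few hundred dollars over that window.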
It's only $10 a month if you owned CS3 or better previously. If you didn't, it's $20 a month.
It seems that's true - TFA was misleading here. Adobe's site makes the discounted single app pricing clear. In addition, it seems pretty clear that the discount pricing applies for only one year of the subscription, and then you pay the same as everyone else.
It's a shame, really. $10/mo is tempting for me, but for $20/mo I'll stick with the tools I have (e.g. Inkscape).
Except it says in the article that minimum subscription period is a year. So you're still paying full retail for the software, you're just only getting it for a year.
$50/mo if you pay for a year, $75/mo if you don't.
Also, the $50/mo is for the entire suite (retail $2500). You pay $120/year for individual products.
This means that before too long, anyone who wants an up-to-date version of Photoshop won't be able to buy it – they will have to pay $50 per month (minimum subscription term: one year).
This pricing seemed off. Sure enough, TFA:
For those who don't want the entire suite, Adobe offers subscriptions to individual programs. And now they're cheaper, down from $20 a month to $10 a month, Morris said.
So if you want Photoshop, Illustrator, etc. etc., the suite will be $50/mo. If you only want Photoshop, it's $10/mo. Furthermore, if you really only need software for a month, you can rent the suite for $75.
I can't say I'm a big fan of subscription only (even MS is keeping some purchase options for Office), but pricing like this does create some winners (besides Adobe). Short term projects, for example, may benefit from being able to purchase what was a $2500 package for only a month or two at $75/month. The losers, of course, are those that purchase upgrades infrequently and use their software for years.
Frankly, I'm tempted by $10/mo for Illustrator. The retail box of CS6 is $540, and I have no product from which to upgrade. So for the cost of the boxed version (with its potential resale or upgrade value factored in), I get 4 1/2 years of use of the latest version. One key difference is I can easily drop it after 1 year (and $120), if I don't need it any more. Still, I understand how abandoning box sales will make some people unhappy.
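To make that "4 1/2 years" concrete, the break-even works out like this (a sketch assuming the $540 box price and the discounted $10/mo rate hold indefinitely, which per the rest of the thread they may not):

```python
# How long the CS6 Illustrator box price lasts at the discounted
# $10/mo subscription rate quoted in the thread.
box_price = 540   # retail boxed CS6 Illustrator, USD
monthly = 10      # discounted single-app rate, USD/month

months = box_price / monthly
print(months)        # 54.0 months
print(months / 12)   # 4.5 years
```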
Replacing 40g of HFCS with 40g of sugar significantly changes the taste. Mountain Dew Throwback tastes way better.
I'm pretty sure they changed the formulas on the Throwback series. European/Mexican Coke tastes better than U.S. Coke, for example, but they're still fairly close. Mt. Dew Throwback tastes more different. Pepsi Throwback is, in my opinion, remarkably different (a bit weaker, actually).
It makes sense when you think about it. Some people don't mind the taste of HFCS, and if they tried a simple HFCS->sucrose switch, they wouldn't notice much of a difference. By mucking with the formula a bit (emphasizing a light body, among other things I'm not capable of describing), it's clear to almost everyone that Throwback is a different product.
I have two NASes, one at home and one off-site. I've recently learned that when a drive fails, in order to keep using your NAS, you have to have spare drives on hand. Even if you report the failed drive to the manufacturer immediately, it takes time for the new drive to arrive. In that time, your data is unprotected by the redundancy of RAID unless you have a spare drive to take the place of the failed one. Otherwise, it's best if you take the NAS offline or use it read-only.
Of course replacement drives take time to arrive. If uptime is essential, you should configure your RAID to have a hot spare. Then any two drives can die sequentially. I have no problem using an array in degraded mode so long as the data is backed up elsewhere...and obviously any data of importance should be backed up elsewhere.
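For Linux software RAID specifically, a hot spare can be declared when the array is created; here's a minimal sketch with mdadm (the device names are placeholders for whatever drives you actually have):

```shell
# Build a RAID-5 array from three members plus one hot spare.
# If any member fails, md rebuilds onto the spare automatically,
# so the array spends as little time as possible in degraded mode.
mdadm --create /dev/md0 --level=5 --raid-devices=3 \
      /dev/sdb /dev/sdc /dev/sdd \
      --spare-devices=1 /dev/sde

# A spare can also be added to an already-running array:
mdadm --add /dev/md0 /dev/sde
```

Most NAS appliances expose the same idea through their web UI as a "hot spare" option.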
Once you start getting into massive disk arrays, it starts to pay to have spares sitting around. For a home or small office environment with a handful of drives, you're better off using any spare drives for USB backup, or getting a unit with enough bays to keep the spare drive online as a hot spare. Big datacenters have people around all the time to manage disk failures, but you might be off on vacation.
These things are built for file serving, and it's about as easy as it gets to set up. They also package all sorts of stuff as add-on services, though I don't personally use DNS. My complaint with the home-designed versions in the past is that they skimped on RAM, making them less useful for any kind of real server application. The higher end models like the 1512+ do better, and for just DNS and file serving it should be more than sufficient. Don't expect it to compete with a $1500 server in terms of computational performance, obviously, but it should be able to pretty much max out the drives' performance.
I had a drive die on my personal NAS, and the process went exactly as it should: it emailed me saying there might be problems; I ran an extended SMART test via the GUI to double-check; I obtained an RMA for the drive and installed the replacement when it arrived; and the array rebuilt onto the new drive without incident.
iOS seems to have HTML5 local storage available, Facebook just chooses not to use it.
This is definitely true. The Financial Times native app got replaced with a nearly-equivalent HTML5 "app". Safari asks for permission to store extra data locally, and then it generally feels pretty responsive (relative to other news apps, which in all honesty feel a bit bloated). I'm not sure how compatible it is with other platforms, though - it might have a bunch of really ugly ipad-specific hacks behind it, who knows.
SSDs and hard drives fail in different ways, so it doesn't make much sense to me to combine them into one physical unit. Having both in one system does make a lot of sense, however, and making intelligent use of them isn't all that hard.
Put your OS and basically all applications on the SSD. RAM is cheap, so unless you're doing something unusual you should not be hitting the SSD for swap. Documents and other small but important data can go on the SSD as well. Larger media, like movies, music, and large photo collections, go on the hard drive. The hard drive can act as the first backup for the SSD as well (but not the only backup, of course). I get that companies like Seagate want to have software figure out an optimal mix of where to store data based on usage, but I'm not sure that's such a huge advantage. SSD lifespan can be extended by reducing writes, and storing mostly applications there can really cut down on those, versus using it as a large cache.
On a desktop, having these as separate physical devices is straightforward and very useful. If one starts to die (likely the hard drive), it can be replaced without affecting the other. An added bonus is that either the SSD or the HD could be upgraded separately as you need or as components become cheaper.
On a laptop, things are trickier. Most modern laptops only have one hard drive slot, but it wouldn't be hard to keep a traditional hard drive slot and include, say, 64 GB of SSD on a small chip. Apple does this with most of their Macbook line now; an unfortunate side effect is that proprietary sizing or connectors make third party replacement more difficult, but there's no reason that your standard non-Apple companies have to go that way. There are already several SSDs in the 1.8" form factor, which should be reasonable to fit alongside the standard 2.5" hard drive form factor. A setup like this would be much better than a hybrid disk with a measly 4GB of flash; you're better off making greater use of suspend on your laptop and spending a little more to bump up your RAM.
I think this is because most apps use stock widgets and at least some stock icons, and those are drawn at native resolution. More importantly, all text is also drawn in native resolution. So a bunch of low-res upscaled images scattered around really stands out.
My point is that some apps seem to look worse on an ipad 3 than on an ipad 2, as if the pixel doubling wasn't quite right for some reason. Honestly I haven't put the same app side by side on two devices to see, so it's possible that as you suggest it's all in my head.
The bottom line is that at the distances people view their desktop monitors, they don't want the buttons, graphics, and text to be any smaller than they appear on roughly 80-100 PPI screens. Give all but the most recent applications and operating systems a 130 PPI screen, and there will be UI items that don't scale. Some UI items, like text, will, leaving the interface feeling out of proportion. Higher PPI screens are available on laptops because 1.) some people demand the implied screen real estate and 2.) laptop screens are generally closer to your face. I won't discount the whole 1080p standard putting a natural cap on many screens these days (though my 24" BenQ is one of a dwindling set of 1920x1200 panels).
This is why when Apple put high PPI screens in the iphone and ipad, it doubled the PPI. Existing apps would look the same*. Apps can trivially use perhaps the single greatest feature of high PPI: crisper text with less dependence on antialiasing to mimic smooth curves.
And it's why, I suspect, if Apple does release Macbooks this year with "retina" displays, they will be double the PPI. While Mac OS X in theory supports a reasonably scalable UI, applications may not. And web browsers will want to operate as if they're rendering in the lower PPI, though rendering text and non-bitmapped elements at the higher resolution. Eventually (maybe next year), we'll see expensive Apple Cinema displays doing the same thing. And there will be the normal competitors (especially Dell).
But until recently, a 150 or 200 DPI LCD was crazy expensive to produce. Judging from the ipad 3, it also takes significantly more backlight capacity (provided by very bright LEDs in that case). We're just now entering a stage in which there are rumors about the 11" and 13" Macbook line getting them, maybe the 15". It will be a while before the 27" and 30" panels can be produced at a price people are willing to pay. That said, I'm holding off buying any more monitors or replacing my T series Thinkpad until they're available. I'm hoping I don't have to wait past 2013.
* Or at least ought to. Some apps on a third gen ipad will actually look fuzzier than they should.
yeah cause somebody who works exhausting menial labor for 8-12 hours a day is coming home to extrude noodles in their combo bathroom/kitchen sink, using fresh eggs from a grocery store they had to hop 3 buses to get to.
I'm not going to deny the existence of "food deserts" consisting of areas with few choices of fresh fruits and vegetables. However, eggs are ubiquitous. Places like CVS and 7-11 tend to carry them, and they're all over. You pay a small premium over a large grocery store price ($3-4 instead of $2-3 per dozen), but they're still a good value for their protein content. Making one's own egg noodles has to fall pretty far down on the list of cost-saving measures, though they are tastier than dry noodles and cheaper than refrigerated store-bought noodles.
I think the biggest barrier to healthy eating (even at low income levels) is knowledge and some practice. Master a handful of simple recipes, especially those which make leftovers for a few days, and you won't need to spend much time or money (either in food or equipment) to be healthier than most Americans. It's an up-front cost of time, sure, but it's doable in several evenings. Given that Americans watch 30+ hours of TV each week, I think most of us have that time to spare.
Over the last several years the World Bank has been moving in this direction. It used to be that most of the World Development Indicators required a subscription. This is an extremely detailed country-level database of everything from GDP and prices to infant mortality and refugee populations. Now, the database in its entirety is free and it has been loaded into statistical applications like Stata and made available by these folks (along with other World Bank datasets).
It's been my experience that the academic-oriented economists at the World Bank try to disseminate their work as widely as possible. Still, centralized repositories for datasets and code under a reasonable CC license should only make this easier. In economics, potential journal articles tend to spend months or years as working papers while they undergo the referee process; this means that keeping up with the latest research depends less on access to expensive journals (compared to other disciplines) and more on being able to easily find the latest version of a working paper. It's still a long way from being able to cut out the for-profit journals, though some open-access journals do exist.
And that's because of the simple fact that a 44.1 KHz sample rate isn't fast enough to catch the details of sounds
That's not a limitation of FLAC as a codec; you can certainly generate higher sample rate files if you want to.
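The quoted claim also fails on its own terms: by the Nyquist-Shannon sampling theorem, a 44.1 kHz sample rate perfectly represents every frequency below half that rate, which already exceeds the ~20 kHz upper limit of human hearing. A quick sanity check:

```python
# The highest frequency a sample rate can represent (the Nyquist
# frequency) is half the sample rate.
sample_rate = 44_100          # CD audio, Hz
nyquist = sample_rate / 2
print(nyquist)                # 22050.0 Hz -- above the ~20 kHz
                              # ceiling of human hearing
```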