I'm not sure I'd write off the USPS. Their parcel business seems to be growing rapidly, with very competitive prices for small flat-rate boxes.
In Canada, I greatly prefer Canada Post to UPS or any of the other corporate carriers like DHL. I've had carriers stick my packages in the mailbox (which is fine) and fake a signature (which is not). They ring the doorbell and walk away without leaving a note to tell me they were there. If I'm home, I have to run to the door to have any chance of catching them. Sometimes they leave my packages sitting on my front doorstep (I live downtown on a busy street). With UPS and DHL, sometimes I have to go out to the suburbs to pick up a package I missed; sometimes they leave it at a local location.
With Canada Post, though, it's almost always smoother and more pleasant. If I'm not home the package either goes into the mailbox or I get a note to pick it up the next day someplace local.
I think the big shopping mall anchor stores (Macy's, JC Penney, etc) are all likely to fail in the next 20 years. Sears is already a dead man walking, Penney's is close and the others are living on borrowed time.
This will be interesting. I'm not sure that *all* of them will fail, although I think most of them will. As tepples said, fitting rooms are pretty damn useful. However, the stores that survive might be smaller, clothing-specific stores like Old Navy or Eddie Bauer.
Sears has been horribly mismanaged for a long time. Even if department stores were healthy I think they'd be in trouble.
Just give us the same thing that got rid of them the last time around. DDT works.
Bedbugs were apparently resistant to DDT by the 1950s.
There's absolutely no reason I should be footing the bill for a service I have no intention of using.
You realize that a caching appliance for a heavily-used service like Netflix could save an ISP bandwidth costs, right? Presumably more than enough to offset the cost of switch ports, rack space and electricity.
Lord knows where they develop the film, though. (Unless setting up your own darkroom is a hipster fad I've overlooked.)
You don't need a darkroom to develop film and scan it. You just need a changing bag, which is basically a black bag with arm holes. It's designed to keep the light out while allowing your hands to work with whatever's inside. It takes a little practice, but it's easy enough to wind the film around a reel and put it inside a light-proof canister. From there you just pour in whatever developer you're using through a tiny hole at the top.
What you would need a darkroom for is making prints from your negatives. I have actually never done that.
Two items: 1) They were smelting iron in the Great Lakes region of Africa (Rwanda) thousands of years ago, so it is possible that the Egyptians either knew how or could have traded for it if they needed iron. 2) The Egyptians used meteoric iron for sacred purposes. It was important to them that this iron came from the stars/heavens. The item was made of meteoric iron not because the Egyptians couldn't smelt iron but because it was important that the object be sacred.
Iron smelting in Africa dates back to somewhere in the range of 1500-1750 BC (see Google Books link and Wikipedia link on the topic). However, per the Nature article, the artifact in question dates back to about 3300 BC, over a thousand years earlier. So at that time, point 1 is invalid (at least based on present evidence). Point 2 seems pretty likely, though.
I say "Because OMFG, gross!!!"
We already eat other arthropods, like shrimp, crab, crawfish and lobster.
All we have to do to treat you...
Compare it to chemo and radiation therapy. If they can deliver the bacteria safely (which is a big if) and if it ends up delivering less radioactivity to the patient than ordinary radiation therapy, it might end up being safer. Treating cancer is often about trade-offs.
Let me rephrase that for you: "Developers don't care jack shit..." Show me a developer who is incapable of being a successful sysadmin and I will show you a terrible developer. It's all about time and interest.
Absolutely. I know plenty of people who've been both good sysadmins and good developers at various points in their career.
Like you say, developers aren't necessarily interested in the things that make for good administration, though. The ability to create virtual machines at will, for instance, means that developers can create more virtual machines than are needed, which results in greater administrative overhead and greater costs. They can also sidestep normal administrative procedures.
We had enough problems with this back in the day, when everything ran on physical machines. A developer would slip a box into the data center and not tell anybody or document anything. Later we'd have to clean up the mess when we discovered that nobody had been patching the machine and it had a security problem, or that something hadn't been configured properly and the machine didn't start up correctly on reboot. My favorite was the time a developer (who was otherwise, in my opinion, very good) forgot to configure a machine's IP addresses with anything more than ifconfig. The machine came back up after its first reboot without IP addresses, and a good chunk of the customer's site was down for an extended period of time.
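For anyone who hasn't hit this failure mode: `ifconfig` only changes the running kernel's state, so an address set that way vanishes on the next reboot unless it's also written to the distribution's persistent network configuration. A minimal sketch, assuming a Debian-style system (file paths, interface names, and the 192.0.2.x documentation addresses here are just illustrative):

```shell
# Assigns an address immediately, but only in the running kernel --
# this is exactly the step that silently disappears after a reboot:
ifconfig eth0 192.0.2.10 netmask 255.255.255.0

# To survive a reboot, the address must also go into the persistent
# configuration, e.g. /etc/network/interfaces on Debian-style systems:
cat >> /etc/network/interfaces <<'EOF'
auto eth0
iface eth0 inet static
    address 192.0.2.10
    netmask 255.255.255.0
EOF
```

The runtime command and the config file are independent; doing only the first is what left that machine addressless after its first reboot.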
With virtual machines, making this kind of mess is even easier.
If those ideas were so great, there would be no way to be sceptical about them, would there?
That's not how it works. There is always a way to talk yourself out of an idea, good or not.
If you want to use multiple links all at the same time, with the packets spread over them, you're supposed to get an Autonomous System number.
This is more akin to link aggregation than it is multihomed Internet connections. Any two hosts could use this. They could be in the same autonomous system. They could be on the same subnet. There's no need to get a separate AS number for each host.
Note that one of the other use cases suggested is for smartphones.
Being a "ThinkPad" used to mean something... it meant you were buying a laptop built to survive Armageddon (well, at least one that's neither wet nor sandy) that you'd feel compelled to hang on to forever as a future family heirloom, because it just seemed morally wrong to ever throw one away.
Even before IBM sold them to Lenovo, the quality wasn't consistent. Going way back to the late nineties, the 380 series was rock-solid. As of a couple of years ago, we still had a few of those scattered around in test labs to use for serial access to routers and switches (albeit with no battery).
The next model we bought, the 390 series, was a piece of crap. Nearly everyone who had one of those returned it in a bag. They just fell apart. At the time I was pretty easy on my laptop, and just opened it twice a day (once at work, once at home). The display literally fell off--the plastic where the screws held it in place broke. We had large numbers of users with the same or similar problems.
It was a pity, since I remember that as being a very nice laptop otherwise.