
Comment Re:RAID is not backup (Score 3, Informative) 294

The problem with cloud-based solutions is that the cost for backing up several terabytes of data is typically several orders of magnitude higher than building your own RAID array, and the performance of Internet-based backup absolutely sucks beyond measure unless you're the sort of person whose data needs are measured in tens of megabytes.

  • To back up 2 TB over a typical cable modem (say 3 megabit upload speed) will take you 61 days. Over typical DSL (300 kilobits per second), it will take almost two years.
  • If you lose your original copy, getting the data back will be almost as painful. On a fairly fast cable modem (30 Mbps), assuming the cloud-based backup server can completely saturate your downlink (which is by no means guaranteed), it will take you 6 days of continuous downloading to restore the backup. Over 3-megabit DSL, that number again goes up to 60 days. (The arithmetic is sketched below.)
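For anyone who wants to sanity-check those figures, here's the back-of-the-envelope arithmetic as a quick Python sketch (assuming decimal terabytes and zero protocol overhead, which only makes the real numbers worse):

    # Transfer time for the figures above: bits divided by bits per second.
    def transfer_days(terabytes, megabits_per_sec):
        """Days needed to move `terabytes` of data at `megabits_per_sec`."""
        bits = terabytes * 8e12               # 1 TB = 8e12 bits (decimal TB)
        seconds = bits / (megabits_per_sec * 1e6)
        return seconds / 86400

    print(transfer_days(2, 3))    # upload over 3 Mbps cable:    ~61.7 days
    print(transfer_days(2, 0.3))  # upload over 300 kbps DSL:    ~617 days (~1.7 years)
    print(transfer_days(2, 30))   # restore over 30 Mbps cable:  ~6.2 days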

The ideal solution, if you can pull it off, would be to build a small concrete bunker in your yard: run power out to it, put a UPS and power conditioner in there to protect against bad power, install a RAID array, wire it to your house with underground Ethernet, and put a watertight door on the thing. Add a power cutoff that kills the power if water does get inside (e.g. a GFCI breaker and an unused extension cord whose output end sits lower than your equipment), and hope for the best.

But more realistically, I would tend to suggest an ioSafe fireproof RAID array loaded up with five 6 TB drives (or maybe even 8 TB drives). Put it in a closet somewhere, and hope for the best. If you want to increase your protection a bit, you could also get two RAID expansion cabinets, store them at work, and periodically bring one home, clone your main RAID array to it, and bring it back.

Comment Re:Sour Grapes (Score 1) 80

Actually, try #3. That's the only term that is generic enough to encompass both the individual recording artists (regardless of the degree of artistry) and the record companies that represent them. I'm talking collectively about everyone involved in the process of bringing that content to market who might plausibly be involved in the decision-making process.

Comment Re:Numbers not adding up... (Score 1) 172

You have n(i) iOS devices and k(i) failed iOS devices. k(i) divided by n(i) gives you 58%.

No, that's what failure rate is supposed to mean. What the numbers actually said, however, is:

  • iPhone 6 had the highest failure rate of 29%
  • iOS devices as a whole had a failure rate of 58%

These two statements cannot both be true simultaneously under any proper definition of "failure rate". The iPhone 6 is a subset of all iOS devices, and its failure rate is claimed to be 29%. For the failure rate of all iOS devices to be 58%, at least one iOS model would have to have a failure rate greater than 58% to pull the average up from 29% to 58%, which contradicts the statement that the iPhone 6 had the highest failure rate at 29%.

Q.E.D.
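In symbols (with w_m as model m's share of the iOS install base and r_m as its failure rate), the overall rate is a weighted average, which can never exceed its largest term:

    \[
      r_{\mathrm{iOS}} \;=\; \sum_{m} w_m\, r_m \;\le\; \max_m r_m,
      \qquad w_m \ge 0, \quad \sum_m w_m = 1.
    \]

So an overall rate of 58% is flatly impossible if every per-model rate is at most 29%.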

The only way you could even halfway make those numbers plausible would be if you erroneously divided the iPhone numbers by either the total number of iOS devices or, worse, the total number of devices. Either of those approaches makes the numbers meaningless, because you don't know the relationship between... to use your terminology... k(i) and n(i) at that point.

In your ramblings, you fail to consider that the vast majority of people who want to avoid expensive shipping charges will often bring their unit into a store... which eliminates many of the simpler problems.

The vast majority of people who want to avoid expensive shipping charges will Google the problem and find an answer themselves. People go to a store when that fails.

Comment Re:Sour Grapes (Score 4, Insightful) 80

I don't really understand how this benefits Spotify as it doesn't improve the service in any way that I can see, and such a move likely makes it worse for users for petty business reasons that have nothing to do with the users.

In the short term, the only negative impact would be if the songs they're demoting are extremely popular and if the public perceives their absence as a loss in quality. Given the size of the musical corpus these days, that seems unlikely.

In the long term, this serves notice to content creators that there's no such thing as a free lunch. Normally, those content creators would have to balance the cost of exclusivity (fewer plays on those exclusive songs) against the benefits (presumably dramatically improved promotion and possibly higher royalty per click). With this policy in place, those content creators have to factor in the loss of the vast majority of their income from the other providers—not just on new content, but also on old content. That significantly changes the balance in a way that discourages these exclusive deals.

And that's a good thing. Vendor exclusivity is inherently anti-consumer.

Comment Re:Not until the laws are changed (Score 1) 175

Under 32 hours and the law would say no benefits are required.

That's not true. You're required to pay for health insurance for anyone working 30 hours or more per week. Similarly, you're not allowed to restrict 401(k) eligibility for any employee working more than 1,000 hours per year (a little over 19 hours per week).

They could cut the number of sick days or vacation days offered, but that's probably roughly the maximum extent to which they could reduce benefits other than salary.

Comment Re: Congrats Linus (Score 1) 108

...Steve Jobs, Einstein and many other geniuses, ...

There is no doubt that Linus Torvalds is clever, and he is certainly successful; likewise Steve Jobs. But comparing them to Einstein is absurd. Einstein being merely clever was when he invented a refrigerator with no moving parts (see https://en.wikipedia.org/wiki/..., it wasn't successful, incidentally). He was far outside the league of those two, and of most of the other brilliant scientists throughout history. One of his brilliant achievements was the explanation of the photoelectric effect, which won him a Nobel Prize (and made him one of the founding fathers of QM) - and that was one of his smaller achievements, just think of that. He was right up there with the likes of Euler, Gauss and Newton.

Yes, Linus is clever, and when Linux came out, it was a true revelation to all of nerdkind, because where we had been used to DOS and Windows, or to paying hugely for a real OS, suddenly we had a real OS for free. It was great, and it has got better ever since. But I don't think we do Linus or Linux a service by exaggerating.

Comment Re:Numbers not adding up... (Score 2) 172

The percentages are percentages of the 58% of failing devices. Of the devices that failed, 29% were iPhone 6, 23% were the 6s, and 14% were the 6s Plus. Add those together and we're missing the final 33% of failed devices but it's safe to assume that a random collection of 6 Plus, 5SE, 5s, 5c, etc. make up that final 33% of the 58%.

So let me see if I understand this epic math fail correctly. Given n devices, there were k devices that were brought in for repair. Of those k devices, 58% were iOS devices, and of those 58%, 29% were iPhone 6 devices.

Which tells us absolutely nothing about the actual failure rate without knowing how the makeup of those n devices relates to the makeup of those k devices. It tells us nothing about the actual failure rate without knowing what percentage of each model within k were junked and replaced without notifying the service center in question. It tells us nothing about whether the Android and iOS users have similar levels of self-sufficiency in terms of figuring out how to solve their own problems. And there are probably at least three or four other fairly fundamental errors that make this data essentially pure noise.

Arguments over minor methodology points, such as whether to count specific types of failures in the reliability numbers, are basically moot, because the "data" is purely anecdotal and is not mathematically related to the actual rate of failure to begin with. This isn't statistically any better than saying, "Of my friends, more people have had problems with Android phones than iOS phones" or vice versa. If you know nothing about whether the sample population has similar distribution to the general population and you know nothing about whether the data is even an accurate measurement of the sample population itself, then these numbers are quite literally no better than a random number generator with a Gaussian distribution. You might as well arrive at the results by throwing darts at a dartboard. It will be approximately as meaningful.
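A toy example (all numbers invented purely for illustration) makes the distinction concrete: "share of repairs" and "failure rate" can point in opposite directions.

    # Hypothetical numbers, invented for illustration only. Share-of-repairs
    # says nothing about failure rate without the install-base denominators.
    installed = {"Phone A": 50_000, "Phone B": 10_000}   # devices in the field
    repaired  = {"Phone A": 580,    "Phone B": 420}      # devices brought in

    total_repairs = sum(repaired.values())
    for model in installed:
        share = repaired[model] / total_repairs      # what the report measured
        rate  = repaired[model] / installed[model]   # what "failure rate" means
        print(model, f"{share:.0%} of repairs, {rate:.1%} failure rate")

    # Phone A: 58% of repairs but only a 1.2% failure rate; Phone B has the
    # smaller share of repairs (42%) yet the far higher failure rate (4.2%).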

Am I missing something?

Trust me, if even 1% of iPhone hardware failed during its warranty period, heads would roll, much less 58%.

Comment Re:is it worth the upgrade? (Score 1) 158

As a 6D user, in my experience the Wi-Fi is really nice if you're part of a group trip. You can have your cell phone out, and once in a while when there's a pause, you can snag a photo off your real camera and upload it to Facebook so that the folks back home can see what you're all doing. It's much easier than trying to take photos with two devices at once, because the extra time spent fiddling with your phone happens while you're riding a bus somewhere, rather than while you're out sightseeing on a schedule.

It is also occasionally useful if you don't have (or forgot to bring) a remote-controlled trigger release. You can use it to see what the camera sees (in live view/EVF mode) and tell it to take photos, albeit with a lot of shutter lag. With the Dual Pixel AF in the 5D Mark IV, it should be even better, because you'll have actual phase-detection autofocus with continuous focusing in live view mode instead of contrast-detection AF.

Comment Re:5 years old news ? (Score 1) 158

Canon puts the stabilization in the lens for a good reason. Sensor-based stabilization is only useful on point-and-shoot cameras or mirrorless cameras with electronic viewfinders. As soon as you have an optical path to your eye, sensor-based stabilization is worthless, because it won't help you frame the shot. By contrast, lens-based stabilization locks the image in place so that you can actually see what you're taking a picture of.

This makes a huge difference even at 300mm. By 600mm, you'd be hard pressed to ever get a shot of anything without lens-based optical stabilization.

Comment Re:Pixels density (Score 2) 158

Except that it doesn't. The 5D Mark IV sensor uses a gapless microlens array: there are no boundaries between the pixels, period. All light that hits the sensor's surface goes into the sensor, except for whatever gets reflected at the surface.

Comment Re:Pixels density (Score 2) 158

True but the image will always suffer from less thermal noise on an equivalent sensor with larger photosites.

Realistically, thermal noise is almost irrelevant except for long-exposure photography (e.g. astrophotography). For normal photographic purposes, it's the shot noise that kills you in low light. When the difference between one and zero photons makes a visually noticeable difference in the resulting value, individual pixels are going to have noticeably different values than the pixels next to them even when they're getting approximately the same amount of light, because a pixel either gets the photon or it doesn't.

But that shot noise basically goes away when you downsample. If you double the number of pixels, a "pixel crop" (one pixel on the individual photo to one pixel on your screen) will give you more noise on the one with smaller sites, but it will also be looking at a much smaller area. If you crop them to cover the same area and average the signals, you'll find that the same number of photons hit both sensors and were detected, so the result is approximately the same, with the exception of the small amount of loss caused by the wiring around the pixels. And by the time that starts to become significant, you're roughly at cell phone pixel densities, and you're either doing back-side illumination, microlens arrays, or both to get rid of that problem.
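If you want to see that numerically, here's a quick Poisson shot-noise sketch (a toy model, not a sensor simulation: it assumes halving the pixel pitch quarters each photosite's photon count, and it ignores read noise and wiring losses):

    import numpy as np

    rng = np.random.default_rng(0)
    photons_big = 40.0                       # mean photons per large pixel

    big   = rng.poisson(photons_big,     size=(512, 512))    # large-pixel sensor
    small = rng.poisson(photons_big / 4, size=(1024, 1024))  # half the pitch

    # Bin the small-pixel image 2x2 back down to the large-pixel grid.
    binned = small.reshape(512, 2, 512, 2).sum(axis=(1, 3))

    snr = lambda img: img.mean() / img.std()
    print(snr(big))      # ~6.3  (mean/std of Poisson(40))
    print(snr(small))    # ~3.2  (noticeably noisier per pixel)
    print(snr(binned))   # ~6.3  (binning recovers the large-pixel SNR)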

Comment Re:Followed by: (Score 2) 445

We may still get periods of smaller and less frequent storms even with extreme global warming just as we do today.

Absolutely true. And if I may add a bit along the same lines: as the atmosphere gets warmer, it gets more turbulent - this happens in any fluid medium (i.e. water and air). We have probably all seen the experiment in science class at school, where you have a large glass bowl of water, put in a few crystals of something strongly coloured, and heat it at the bottom, and the colours start swirling around. If you were to measure the temperature in different places, you would find that the water rising up is warm, and the water sinking down towards the heat source is cold. This is almost exactly what happens when North America has record cold winters at the moment - the hot air rises up in the atmosphere, the Coriolis effect or something sends it towards the poles, and the cold air is displaced to the south: the Arctic is warming very quickly while the mid-latitudes experience severe winters. Which is why climatologists say this is consistent with global warming; but the "skeptics" insist it is proof of the opposite. The skeptics are of course mistaken - looking only at data that are very localised in time and area is simply cherry picking.
