Comment Inexperienced drivers are inexperienced (Score 5, Insightful) 217
Young drivers have always been at the highest risk because they have the least experience; only the distractions have changed over time.
MM wave
According to wikipedia ( I know I know ) Salt Lake City uses MM wave and not backscatter. Either way, they microwaved and damaged a piece of medical equipment after assuring the user that it was perfectly safe for that equipment. Unlike an implanted medical device, an insulin pump is worn externally, so it would be directly exposed to MM-wave scans.
Especially in light of the fact that MS considers such basics as network backup to be Pro features.
Then the GPS industry should buy the spectrum if it's necessary for proper operation of their devices.
You want semantics.
LightSquared does not interfere with GPS transmitters, and GPS receivers with proper filters are not affected either. The FCC ruling effectively admits that the utility of existing GPS receivers outweighs the right to use the adjacent bands ( even with buffers ) for alternate uses, because manufacturers didn't install proper filtering in the receivers.
FYI, it's the GPS industry's fault for presuming that the adjacent spectrum would always be quiet. With this ruling the FCC admits that the GPS receivers are in violation of their license.
The basic fact is that GPS was in violation of its part B license first. LightSquared would be able to operate if it were not for the GPS industry cheaping out on their filters. The FCC ruling is a tough one, since they are forced either to take the side of a ubiquitous service that is in violation of its license, or to rule for the startup that could potentially bring competition to the broadband market nationwide.
Studios live on a strong distribution model where they control the vast majority of the content and the distribution channels. Any tool that is viable for "piracy" is also viable for independent distributors. While I don't condone copyright infringement, I think studios are more interested in their long-term viability than in protecting their content from "piracy". I expect similar behavior from the major publishing houses in the next couple of years as ebooks break their hold on the distribution channels.
Backups are not really an option past 20+ terabytes of storage, and simply not feasible if the data is volatile in nature. AFAIK everyone at scale has gone to redundancy over backups.
We run a 200TB raw / 130TB usable clustered/distributed system with 4x LTO5 drives, and we do a full snapshot to tape every week. With data that size you either pay up-front for proper engineering, or you pay for the life of the system in poor performance and eventual cleanup of the mess.
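As a back-of-the-envelope check that a weekly full snapshot is even feasible: LTO-5 streams at roughly 140 MB/s native per drive, so with all four drives running in parallel (assumed here; compression and real-world stalls will move the number):

```python
# Rough feasibility estimate for a weekly full-to-tape snapshot.
# Assumes ~140 MB/s native LTO-5 throughput per drive and four
# drives streaming in parallel; actual numbers will vary.
usable_bytes = 130e12            # 130 TB usable
drives = 4
native_rate = 140e6              # bytes/sec per drive, native (no compression)
seconds = usable_bytes / (drives * native_rate)
hours = seconds / 3600
print(f"full snapshot: {hours:.0f} hours ({hours / 24:.1f} days)")
```

Around 64 hours, i.e. under three days, which comfortably fits a weekly window — one example of the "pay up-front for proper engineering" point.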
Well since anything over 100TB is not supported by the vendor I would say not really a great idea. The reason it's not supported is there is no reasonable way to maintain ( things like an error would result in days worth of outages to fsck and/or restore from backup ).
That is only when you use the minimal guarantees from the datasheets. In practice, with healthy disks, read errors are a lot less common.
Are you willing to bet 70TB+ on it? Because that's what you are doing.
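To put a number on that bet: using the minimal datasheet guarantee mentioned above — typically one unrecoverable read error (URE) per 1e14 bits for consumer drives (enterprise drives often quote 1e15) — the expected error count for a single full pass over 70 TB is:

```python
# Expected unrecoverable read errors for one full pass over 70 TB,
# at the datasheet floor of 1 URE per 1e14 bits read.
capacity_bits = 70e12 * 8        # 70 TB in bits
ure_rate = 1e-14                 # errors per bit read (datasheet minimum guarantee)
expected_errors = capacity_bits * ure_rate
print(f"expected UREs per full read: {expected_errors:.1f}")
```

About 5.6 expected errors per full read at the guaranteed minimum — healthy disks do much better in practice, which is exactly the gap between the two positions here.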
Journal checksumming only catches errors in the journal, not once the data has been written to the main storage area. It was done primarily to ensure that the atomic nature of the journal is not violated by a partial write.
Our BTRFS evaluation resulted in rejecting it for some very serious problems ( what they claim are snapshots are actually clones, panics in low-memory situations, no fsck, horrible support tools, developers who are hostile to criticism, pre-release software ).
Unless every read verifies a checksum ( they don't, or it would kill performance ), there is still the possibility of silent read corruption. At 70TB it would be rare, but not as rare as many would think, and it would depend on the sector size and the checksum on the individual drives.
Always draw your curves, then plot your reading.